[Binary artifact — not a text document]

This file is a POSIX tar (ustar) archive produced by a Zuul CI job. The only
recoverable information is the member listing from the tar headers:

    var/home/core/zuul-output/                     (directory)
    var/home/core/zuul-output/logs/                (directory)
    var/home/core/zuul-output/logs/kubelet.log.gz  (gzip-compressed kubelet log)

The remainder of the file is the gzip-compressed payload of kubelet.log.gz and
cannot be rendered or edited as text. To read the log, extract the archive and
decompress the member (e.g. with `tar -x` followed by `gunzip`).
l%໛xo ;vpl1V宔˜۽rC XzGk[6=V lV|ڞQÄשySz0W%mӾ[K| %ל<)\SշW{ؘ+nW{o%[yP]fw+Ey{ 5>tv"$}B"ǁGs&'\3ᯥ.!.i']4TĕAHa6h͙Brc.`2"Yr:zJs$Y׽yQN~B9EanqPzqbɓT<ڑlR³w$?}|<.utQ -KJB8K#}-!Ml1`%TIn<3JqA}w5i;O!s 9@fCL]?bԫ 3L ֙(Z@m -c(nT6i''c#%KmϜ[?f艊$Ep']2:-ՔX BEJQidbnsԅh->(瀤ByE4%2YR]HQ4nԻ9%Ciiz/Ä޷n#6wdް(A@\ڔOƨ$ H%Fl*KT["㶰2@+53,zmrEg ,L*@ERNY4i)HYp:`A _jg$5LYBA#5BLi’`3Ob6Jgyv֚8 Yv?p\'`MQX(+<,)Dy="z4@(\%4ڛ@&sP7?@M-K!0 _PAhd*PDe{7Or2ZDʲ(Y)&!*cI fGBrtf`֋ {h8ΪTk:M (e8+?1 e0$LRb@&yvɶˎlX~6Ni|ȡƧZ6eNY%'&V{j%6LPೄ#%Uyg+2ӹXGkru薭C_bv wWE-f!%/o|K""˄>d{.5jˣD٧vf/禞t7t!KIyn] $IY`Y!UI`^ (CA3XQhi O>csjvojDvۃρ OٚjA?saD˶ӇmtXk J-‹f6Qͣ("'!ZÜ '{ylj`'?Ovd'{u ٜ$۱g;aӡDP`;s+uWiAs?J?JL2ޥP`m;W(1]+ypR2aCp+Bq \O Ji.',xwΕDq \OW(Js.UXwAWJɹJizD2\֥ jGqW -gj7JiWWo+-aW .U+p2svkgWs{rK`ANO\FN WI+NĮR{ЩBr;W 3p*BiOŮRFS!B@[p:a?i-;w) =\] \qI%S+v@\E;îPZnPJ){@^ebW 3pJBiJi{DRkJ;W(N U+pҞ +%ѦCp[;W(̛Aڎ+o/bv m+:îPZEPZ}WWF KW(pP\ٙ7(W(+ K՝38{j\_OƨTZP*Lfٖ”Zhza|&V"W<\1ڙ7//^ƿ]eU鵻bm~>,pXvnXA8rQK$Lb(+n-s+,-QsE)w\5lKttt (- 6r_V{9O8~8x3GݿXdŭqM *(rTpk Oǯ@M싰0?E_u#@7e6V }J]l#\jl̮{F+V#ǛZUw&.WY ?$aN`؇u0GC@+oqƫFsD eg{X{I+:w2RRӓ $c*[b1j# !]5`/ӏeIpY2GPz 6sY#k] ؘEaXAkfɃԟx,6GlvQG}ݴ,R,*آ-xvY{5y8̸vqs sjsf`ξZ)B:Lnr~Ȥv;ՋoX rTG٥t;zwuQjՎ}e=:(x>ŝ_o˼Vԡ* {mp-ZCIW܊a24!޼n@uӡ'h/JMTFFqAZpyã7.e:*G[u|{\\'U"Z6H ?4~rK[rEn ?a(mcL7Py%~P'U{P];E#WV&TFiZW$eCr{S[aڎL KX[z'C)Y䕲7i%e68HߦABKen0DSMNoo/ =F'DaN8Gvz +X|~*Bis> ="lOc?SWK*fƅ_f^13Kg="W(v,׏gIw狩$7z3uôgpz~x4(]h@ZIչ{4(%Gs 7Du*kI;W( \}䦇 +a8Cp%p. \.KHiWW(.L vP\ݙ[ _$\)c-R :ڂ+e !aW(-εd9|HV#E|J#%Q L lW^B2l{?^^}v6]8Y63])gJqEWJuu%)i цitnJiڕzফ*]+G;Ov.͢+]2nkW]!/ZEj=R1̈́/oo[t`o4R\fѕ+ZFɛNQWptn"])0it\<ǵJ)tu W}\ lg\~{?<3nGWGCIw{<+%BGݽͽi=~"8yJWMiV/WP&o7\f/X,Ѯ_D鷕U$~&N+ŵJi}X2n:E]5d"]qΡt(4@+qbkVNQWbC3 Ѹit4eJ{WוR-:E]%+ډt)M+2,RZkוRƸ+cS| -EOZD㑞2.]]ɦv= +x~2`Jqe]6W ?\WJIa)JOn:A]B3JE,RZ֮+\v.3JSFW3M1֯]WJ3+6 ,onFWJ+_j ֮NQW}}+o NsfPiA[1xJHU|HW ,4MvnٕRtGU0w` . 
=3s!BJY]Wׯtuߺb,;."mv0`ҭ=aǐOe̲Sɢe@-z׽=²|gd1D uT4,ƸF)o V48gҕKFW%fѕ셔~[9E]ybil4R\;MvG:2MW'@lHWvp'U Z@Rtu҇ O+GWex^@6]XBHWLBfv0MvڰvPrMW'+t{` qeɬZGۙSUOgZp,+Tژ֮+P2mGWtǮYk]->V q-MG:3rm;Z+tu߮'\G/ᲛEWc֮+\雮KvK틀5JqEWJ˴v])6t}t\HWNabyG!͢+Ђwc4~ {6d'ҕFWh](]WJ⦫Un"]@itBqEWJyNQW1 4L+ a])-ǵJ)V  m'R\f eRJMW'+ʮ$Z0h@!(l2RΔ];NԮ~ŠRl;MWѕcۧ&&]-9}!n:"HGgp]Uv즫v=3&ҕG?kWhɬ^WJIa)Jb4&o-ì|E%Ѯz{ `ж4;},Yuֳl̸Ԓ+*wKZ_^?}M"tĉI[ d3.8F!^0=>^͙v͑>d?V'U|/Y^_zs͋G?]zqO1~o|bG\!w! %oO_rYc?|,$ 5I*'>?P!0Z 4ŵzVc=g܊R\;4&sBq΢+=B(o:A]yq̤+N DdWqO6]pt~"]ѕ~}6J)e;z6|ߛn+ap4Šկ])e֮NBW?$ܜFy }ͣogz0ѧgT^7>z?>(w_b%/8g7u^R?\{P;篺-}Kvr?D}Ę e>rrBd3z{WKƇho^ugo~,&'"o,fc>7K>w q?>GBCO~z4{a|aV[v*rEm5 #淳=e8\g@=9r1got{t3KmGBk_oO.}<~ |՟E}j[W/ ǒz^Ԍ,8ŵB$)TcQ2JyH7,~#Oh/gWܟ^"Q|rs#S5&ш-.0=2 /2dhou׽_ZdK.;RaBRΙ 6ks)@r\?H562_n.ږk|MTR35.1Br#QP@VcKIÚ( nN'"NܝSY28uL~F[[2:dT&  kbCF`$DmMs_zDHXsmHL6HZ2xHbu:qhCo0:4pR%1 "s kڲdhXL!AN(٥ =YeOpEj>X=e <$点 3٣YdQ#>`Ʃ pq[]t}!d4 Ͼ!VC,iS84yqTG2Bckb-`1w)Amc4׍ YZ{RW/ &CF̆Rށ1ԐEۊR%p[ e@Yf4\C75VPn3& fLiWr0vBcnF uZ!e ¶wՇFtHXOɅȢCa4 Oq՚JHn-2t/,,<:L;FRGbrLsI'8TR dgo3$f?l ; L. _רJoH:joߤLEs` 2JLq9&eq{aQ ea;!J1=dC?0}#u q R.쪫X03٫5A.$%RDS-fQh`#t)軲)$#;@uo8(=$lFZl٘Ix`Jg+VFcA L7 ^=M+ĥi& CΧcOMÉ0RȰ4&񤞿z/? bcsp.`r! J10 bxPu~!!K?tl*%_)]ͱ BMʄ4j kPHd1'~(AaQBEAo%#0I"sZym(CKwti?U=tx Ό+6#e [W1V!qo} U4E:Ցcַ>b{BbӽexaQMZ=^= v"}?MHv{//*q''k79J2rVl * #XFXw.sIa1^/sP6De ũ@ GQu R z0 Eݯ"< sFEhW1;6ANHzb!*lwפMR8h2&+xPbѝ@%2#[qkbJL˞L5PZbM)?#= Z;ʛ`j,}a"cA8P'YfL!(vDMqX҃,kN jԀʬfMi6Po zꭙ* 2y-w;Rf\"0@;T_Gû9SwmI_!.eov&{1~Z)R)?CDʤ48VYbÙ꯾[e0!KhLm~]y^٭N'`t PxFMv: s f&AYQuU  4x n3r0ʿ,E azZh֐>k sQSs5`PB"`:3Dsȇ醇Z5$Vv:(!ʥP4 w Rt R`!!UЁJ$!Jk=xǪ1VCmdX#2O[!"\x>W O.(\1;9iyM\ة %1<)U,JCɈOZCzUX@ h\!3RE ѳd8)nO 5XF=`pk&˃@&}*H#jJҸh O2w|5(Ws ,}`jo:k /5s;JhU8ZC -XNX0R ZH+B}~SpC8|gY28ZMfH=L%NC@ge pi0XӦS 9$bUmy4K1AtJ*IĔ H"` R,uP)h J.p]Neg, B= aSp":'|*`;7 ȡ ɲnKme P[ 2x|Ӌׯ׾fVܸBF( oXLzjڹٿ|8R' &)]_9[@F#`~K'E3e<>:O~3OsN+\ϾlgC0q.ǶKDKJ']!)YpG+Z) u^PGc兄:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCB^ u`'#s}o+}+;ꀕ+|B L\P:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! 
uHsBw$ԁy:`o:`TG/+&Kx L:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCB}:ނG:b^Q.@B(Ա'P:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uFsww_nśw8ﱝ^/`P>Rt>-lѰ}ꑮ LFW ]ZJHWtEqd=+02D_ ;\R\@#zWhg+0ײl^WrϮ'LHͳA{a~WY\{\bb\\J\=빰Vn] \R;\kWBr+B \ZZi< +)aGp[Bs \c+R\DRIӧzWh}+c+RsCpJ+!\/p$h9Z^"\*SjGs}oA]fRK+ҬGp?Ԡ᪕V,vVJi ^ \9-\GpZkJ!iK+Gp?d#Bk׮JCWjϮW'Ϯ3g}va>u%s\f=2:CL W&T7p>;\Wήp.Yi8#q>/1`d ~ͲL%0Oa.K%oʬH=fZnf Jq%v\_5Q=}φcKWN?Bj0}?Wň3{7:$?pQ+kQƇ\G/ԜRQ=6Z}5 W|wUtT&}.7~H}@WעppUoZT"ix.ǎvjſŢwnƗ~.XCݬE|Yvzo.#45Ņ&^Soǫi[e3zKS10m|xs{~pףuܿGKY;w>gq;=~6.fN7M[;[mNr'?'ثݼzuۼf'P7:G͸lb8{p/*~)GyYOir=mT=$yfp2&"`۸^)|_1~7ǓfsesW'h{YGM"|+:x͜mݸ0<|>l2n O3;BZ`P]]٠?|sZzVVmC[22_·o/|u[Quэ=2 h޶`itkcו]-1:ezڶ&r5nD+l4=7}m!Wܽr˪^oNjMz8~Ru;ak!õt)d8S|N?r훇?OAn8[gqE{snws`x#&tݳhtiMjߚZCw-DH2&mwW;Bj%VnjkT Rp&,x W}&8]I( JrVTy,D9ϸ̼JKa4ȀDU U<$A摔f`r^|`C^Aڳ40n)*m?݃/vj*sC Yrh:}}rH'Lj4d#}ܒP=8vLJLyt"όJ$]}=[CQV{WԸ-ngEFU.0kXev3nYY3qnJw=r;&m|ifH õ> a&~l|~,ES="<1Lp廯/9:y 3/\ѳS p¥o4sA/c N:$U]yXqD9H=)^i6cw-sڑ줘֘]6q*RIdeu)-Řf>`JhB 9&u|$e&upC)\pJFt .JEho01[k,<͹Ş]ָFRnFv4nMr{>rB)Z2L Vh`k,M^}nǏ|j佣ɻݲGX"YxbX.kU!kYQFU21Y,_s@Iko4it=xͬ.]Wira@:P :ts0qXeGQwxl ]^'u5hn }߉"r,^kUH^32}&yg M,OHVp+eb*Jy;'c ^X1Qx ~>( ^ڹܝZ- ]d,?~3x=jp<Zy5QmwcEcK}Zᘺ9d> 3EЅӣ:[RWɭN(E{W0΃s+]\nJ[mWs|֜aY[ĹKVHA['[lu3gbz8 dX*Ի2L> RU.ʂ1x2qOebjە%Sonb] N3L\wHe: i'Ĕ2K4 $J y!e֕j1_57{~{gmpfw >;_ЀsZ\nhu8;ef):G-ioM@=5>#y7@XM-=[bIp PoӛsHFbVB^avcVK^ڱCdofR@`qv@/7V֘G( l+!l03sNdy@azҹT6T<8 i3\S6BbDU W, {i`Ws:cb,x8̜==;<5SAAL1$8y B 3IĝdY׎YP1 C it:>|+=p?GJ UJMV+V. FYi\GKT;f#ʓ(OX{QȺ* 0Dp5aXd#Dٻ6rlW|, z?|7[EP2&s)[oMCߣ2~$v[I:ӑtt 0CpYb#1ωc)đ 0nO5SK4 XfTa,%*jC-t1$q,%#$UvtY朶\0QV|KyO,YcPY-a!1D3u aYD"mMb"!fEsXzYюIȼ!rA/n`5ac;Lt %Ho l!؆ ڣ ] c' lNM~/X- X #AyBoߗ#.&>"cB) աQyra0 !`3+(?B" Dx_(vO ` g'$Dp6%|4%ъR>>n>66yw/3 zvcY5yDQKFQhY`"!Vޏz a `tp7֕C}ʡCW|ʮ[HAWv5`jj&u"]͆ȑa6g tsKw+-t2y6c^x߶/#2J-n?>87VN[iOhsʽzyQE݅o ;7&حk;RzRNK彣~!l\e~UN&M'Somu[sEYol/sP֠ԣiy4(m_n7oWmRǚ^Zo/x3͹t.[;TG9`S-?U/Ա{sPn;w{7}Y,`Tߡ|+`tQس~6%Iy"@}Ba. 
YzMvom '7nO\dᆱ7Ս͡p^[| ǗQNSĥ]twtXn.c@=^{5R9Կ`sfi˫J;=t@[~+V7kw-սlv+J]6n3-]u io֛qkwŴ.p_0-通_y܃(J4vV`J)?OJXwLK Xkκf: DtϦ'$G~unaccݓҚG/7tOu O,t t" [oB,bHL1\>6uB΢qL܎ )?W&UŰWm0zf/>~,c6AeS1/v?˶& U(w ̼1ۮɤ>G\YSH#8& 6 dIPQcf,ofk(J`MKὖi5ݣb:?iWKRxRie<ѐtzT?Ow #ҒsYpƒ`,5*dEV%:֖)OPHGZ}4B0  IXjʈH0тX 0Az"ÓPJkChT6Ql(Ҙ1IX$a…S7aV`z~шGHYRiLrʞNP$ax(6㉘+!X .)1Ȯ  *)@Ô`X0)#0JАg!ߜVwNa@ܭ9Fs^f}xW?~V~9<[8wć?QSMDK瘟TKR)ː"FDA`3h|aQ& J őHM؁.jӽaekE"($"`бa$08Έ8&TTw(?=TwKhEgpcTߕj`;J5xw*'{[mS @˛ ֬dڠ)5` H XT!1I,4e Xxeķ{ )Q`3E`+H`qq5E` sQbpڱK@~9AiyFdh\+j3˥ VB-t 5 r( 3"4j$4&wa`<l+g\Oe]".<wAѝtS%] nA >׋$y|'V[ W"fܐdzɤ*K*0xY0ULq+)&/2â?qBgYO]; ~0Յu8$g\oݢXڸ:n5鑰0kZxy!H7i ϝ~*2"鈇93mL{a${M{ӡ:ޛ%:-%Aѕ,7t* ]Z2lPbCH]9yCWW# ]9eJrZCa=+L^ ݲL-řJOבd]`F7tr ]9Jd%G9]#]I9]IL׾Еt,t('HW „Gt)&ЕK/t̯]9tt B'GXIo bΠCK3OW%wSvBh5֫fKѪ3>̆h}6Y˧%f+ռ]I<`I7t2 ]9;tPꜮ֒cs^ ݰGQk6]uٌ qBj'DzboO{VRJzAg\Ś*KUƝh5"陾/؁ca\ggq6CJ+?2Tzw\P]_PoODIR,"a.GĆ0&Y51zv2O,sQY0tH .xU H{S㨓u,96QjI(-9 R$VE:YKNS,N[ E\8Q4@Wvڎ_nlmhW-q`b`rfE5/-A )PPBs̠LuԩiTRMuί _]߿{S[;Yw~mvK3hu%jEP H  F hʄfg-76:-hvsZv@M:ώuگ  m d>8쁌ūW Xxw 6{AS8t a°6y+(B75Dž F. i;t4vha 6G LkU7Ş}=Tzm -z-kBp^2?0ҡu⥻*atJ#sZ+Hg#Ҵ7aqtB/Vsx;bU$[Dz/)x ӯ BKU8 je.0} fv 5?FcX7wosr ! "Ng W˥Z\+}PUݶ5[nh)硪U5N"D ,[NW}?)f= /8K>YdKeN;b܃Z?|RI[`R&]Y&xw2̓SiUS##M8GBB̆4( j$XBH4ΝrhhGsϬdEc(>OɁ7AIv/{z.0+Ǟ`=ʰ.|>@>049)BhY{alt>#`-OnV޸Q\/n-CYwv(m7w~7jJ#srh:]9RttŨ$yDW9,rp7IYʡ;T#\PoUd~2P2ҕ`Tq]`Xb ]9,֕C)rjJ2]Iw҉{CWN+Vg޺n\tt8R^-;_ʡxh~֠+]%*INQ]Ij1W;KhJR|*Q:W)+EmRW@01,o/j ?zPI;tuJmhrn#WW@-%Go &*SLy|"XC G.GUWQ+IJwS)ՊH]azoJ uUP-LSW/+ uQWCڴwUZNIU"X֨+ WZ]]%* r4mZtȁOf<*%[ r7 A$Ϧ!|,s<@ѷEI9HifEmxy_PUR3P,i?S. 
F8cRL#b3M;dȶu!6TAS˻zi9l}j  Z|̮wzŭhƌ`a-Y45лB'$*X&š kƧo: .ہw׋f.^@&|^<|K3*cRb2Sм(Y|Wanȟ-UYעTX<~V*4)*]o0̇U;E7v>[VsQk,a0]V3~Cp/0\] @\v =k(0nnb%o: n]mn>&Ex'S&gc7nYe0kC8ǒE*L<KI S"!&s$#ļ~IED|[n2,,ARx<P;ZōC_)ocWaԛA?x}{߆A.|lN}97ޏFO Z?J5625]0X szB XJ[X+Rv|TW8]fЅQa̬h ݐʤs\fӹ֔zc(ﳰMl^I=7O/k;֖- $@NaZQbr=ߎ \\Jss J^lTpp1\9y0ʏa+0 Vqi> NrcKEwCR-Vz*dΪ!\6K_Ò餆tŵP]|8]z@FzoF)FfX{cL?Uf3a#/+hTjYTq df G>ג3Oɝ:Ͼ(Oy0z37-΋7[~y(ӤE.@05.\۲풨G풨Լv9mAzO{rejﵻYb[ޱF7 IUp;oB!gh՞2#z4‚Sw&MܭrFniUro[4d s^g+y!3ym^~,Bfz!X|*K|^n)wf$Rq$p)$_Җx* 1 ɢ5*k:H9&Qk+i:a.tc rxrH6Ǧ(bO.U\9F~2~^ xlad-&V|WCvn:ƠĩRɴrzD9iL8#G,0D=6.B'X$, X HlB Oc*hG,<a8z<ۊcLzcߺ R@#e0 VCQ5wLd2XZ8zoFkיL?77ӫWppl˴na]Vع|HBrՌ%GEkdbWÂI?%>b3ED=_mC {y,%_0  7zDN?Nfr_Ztݷf,KIa\=3cE8A$Ŗ F0󲞔!z/@ZvmlK \qgחOXg.5́=C_$`A'GeB CSb޽bƽO'ƅ{|v=Vn}:e1_mG%l)à a@{NjE_|п?xXʋMA=i"6ziŗD "ҁqX9}KQY!꟎0|))LX>c,0y;UIH߯Xx`(J'o#nH1 =&a(zhʀz|~0OȾ[׭푦xbA;Z# ;7un&EHbYߣ"M^\[B x@% vŠ5$6F!p$(&Xd\b.,6 U23mT K@y.|CY}$Z+h4ec(M ,6ӧ-GPLyc$ٜcזti2ҘmbRDAf#N()ya`"ZA۠FH"I`ci9I149hH:qT# "Pŝ] L{BL[kX*NŽPS/"rypzԈ C<D+Nrܬ$3O0N~)YNIP*k$Jg}&`)alrfP44]`.4+&-Ȉ&*H{*@#W A*!lhڰ5FΎrv|ir0*>Pof)" &7aEg)Byd5 DFN/+E>0w.*%`M1&GPD< =TxL1ZA-7vvRg'Ek&cTBJ a5h` Msf^y+wqT"/NZ E@0fTㅢxh gH H'ITtm)0]D{HkE+|a[貃xQCpḧ́uh0YD,!,DOEĂc 27|<hC+a鐧iaufxfkiTEHVr0 " ]ٖX!nuA`Zs]+juAZʎDeVӹ.s$o8H cVYO33t_7T%7ih 5d(§0Lw Z_f]5ѓ6YQ1G EmʏՑaY #䎀iV 4V_"H'MOгo_g;^Ž^-F.~]Ф<A2.q\.D a* ?yw{[y_6lg)6{/~8<~vg:Ї4g*&k:z%i (꧗㏨pﻯwW3GO]q։\fN5)6Gj)n=a0@ AZ4mВGK9k{]׵$8]"$I\^&!Jj l* Θ \HfBUkD m 8UZF2.m)@mA~>ȭ$7BSf͡$OE$)GCu^j@Ldm)zx`X U]E>mӈ#9jyx(ÜHSm!tĽ։jݲ3n|Һh nkoG Ν 7OO$PuAomJȠsTj/A^ƪ5,WJIF;RUq*8U` _b>}SCtgГH\=:Bs7r,Վ6SKf}U_B3F*^Y>PEN|n՗Vi+!+} OUȳsxt1ſ2e˥tFy FDPRd℞qR^R&ƪAJ*K8"UweѸLBS]4)UMt;LRLlYHVB줗Tj6vO5rTJZiQ89c`;Lppu]߉*&WPMvO4U@RJFn`7ހA֌v]ov z4n/k<*Pn~]_jgA۹zMԿp%_ܓxNҬkt#5"gh[0uL%ȧLS7uݳ>x4B ^ɥ|M3uJ ^?v֋<ul$MK$ЖZUݹͼMcs>Ш<0$.[d*[]×ކ`} xq+[;q;d]=BC xoEȅ״9x`AÝ/;jC/J?\ qO΀J/O2j/.vR|ĽNIԼoe0`,,>s;|~؅R}ޏȿ'`xſJ"O25tdsOmJ#PxWt{29עɒ,_WAϩVy~,21kLzPij hAf0?!H6 ٺ r1`eGcXqjYBz13u|ΛH?7 
|+;nX荖c[aMSq/itYC=-WYS0§Bl\Qqxpi}p%5d9ގyِ#WS{SyX)!ɧC\VOI,ټdTOԔ+zSݵ(.?+.kH"XdYXZvռ}2IIf@[UيcfR8|tJU}E8S:X7cǓ/웠\J+{kxDfݖqx , K5n Ltt`K`Fx?RvJ [CS!9&ř+w+%늱_'0ld8ܳ]OF{: ] (Fc# O߅~Su ǓO|@I>2hYITzBRJ$,]?y̮]$adnWYlCQ;c[͒Av@ki&Dî4Fc&4M8 q48&J9'|Lޅ+ abٚ<(wXhR| G4:G::NTM`1+qZj7le2:,W:!bR30r#3[z !t3BxV({w).:)5HgY+2,ϊFEzyC+)2w :$kvmfAg%>RpùtCT ׸?b }  )@S*Xu*54ڡ lP#zY38n/Rk3T[h=l}*ƹ}u|:`z>Lq # 䍯 U + hhy&X`nj*5!\Hdm9kh5-n]z,[& T0N"rۘnE9 򢛭g`\66 +dY]uTl)wŚgz.'\]b\ -#o }<ҥ׶VnB T\p.Eqy~k*<ܡ.qϰ`õ y5ebz`v)hQъDT{<P"04IJVN4_TmLKcZiSi#.ff6Ey[kmMYxjJPMWjZ˺\kdYHNvw)EPu 箧{B7Xp y]nqY#paR'Ԛ#rm2fEG\`Ӭxڋ6#.E[iE0@}/+g¥%̅],0l0x:Xg&Γɷ ozd!Zpqzlũmdr9ƦLJkv0+gdJ3a =tS>B !Yqu,zQi 9&k`qa\(jf-/٬+!EQslYb SyeK|'?AL ҎE0bnc-@2+eԚS\PV\jWy, w"ޟ&g3&?-B½ra.ʅ`\M.Tze7;jU@TqeNDi&zn7/|!`H1ijDNLԮ^d,ocj\7ljXPK-j|OD }Ed<0e3α#\Mkz2eڄ4 gPlPf>y`sz`k(}t>WԔ Q3YSeg~:BP!l }lX>g=cacj{ɑ_A8Fan]nntȯ]5Pu@uOj3*pR@C0dtOx"^ˈnb4 i8~v-Хa.o5AYVɖ()Pi[ 6:ֺʹp{*봿ǧyݦ]"]\ɝ]{Yegөj>$}``oM ݍ`wK$mfwgƗHvPWloyuw e^63O4Y`mw0 >c#aZϙY#!I^c"A¸#ҽkqsc(rR!4:i<e(i21)"Q0TZ鄒1 &*P j4!13ˌ3s&79ia $j]1;/z8O 콁H/JN *m$ԋȴFF^uD,5bPm5`,9b[Hl[- ܞ=*ZOBȘ^ Frtpj&H [gAPI(a+Q%"MTTGR TB@4eNO=<z} ϡK}T>E!0ɍuRp\:(,80W9 @e.*U_v?~R m DJcy˘"1C@K$.qRV;*aӑ+Nd} yfr8Q3tsn_kF>d8j.N5\I[k1thqt Ϲ>gf1 VNsʏ 1*/|PR 0_6z\1ϴ))uZ@vAfy({{EU/ZJQ9vw1@삶z|BY3Jae/#4Er-,Ѓn]۾oS.ea KNC'U".-sYȼUPΚ5弬+?iy̵ &|07Ĭs{E?*Wۍ_ :[Ŭ.,r/'1F#ƻIn>xu0_>6ٜ~#=RN:q9(f?S, $հ&Rc폸ʡf8w bI8Ät\G)0)3L☉B9r-%]|V_D˂x+x'2Jx @0x3Pz sZTDbK*_+LwS,4>LZ9&Ʈ'\Ovþ}<`yr ^c^U()Pʍ"VFudXDH#arX+H!AUv,wq{8vF+IM#Fi ҿއh%'F>fߏFN61nDfl>:wdnv̧(bd땭WJ^+z%[d]5P'+$ ^Q+UR @6>Vimw1YmdNկT0^L{ڕ-ho<^YҺ};@}@JGI9V \a "WJqXt,>eE<U,V!]Ynu '1beP(z9ykB&@õ r B($0hpV{tHba)DqF_P(r JL'mf:g>9ut3]6_ Zmy#1"qE.$֖PaKO~wX%HџPv=)C0!B*Caj3jX$"﵌&6MVHKDT51_{d>Vk3nPi=\eUNwimj4)ǧyݦ]"]\ɝ]{ɾg5{݀? M1pw{w FW?|&WoOigazvX[^|>s~Y#Ẃ{,p:|)V75uwknqH9š9Yrqs.yі-@fB =@uI|՜<54&Z0Fy,+;4.p$8".;[S7Hv0Ⱦtb3vܜX4 V+7}P9o I)8_/CHh! xI1np'FHZ鄒1&LjV@i6(҄HXZ,3BRϙh򕻳?M6%x]C{cս[&H.kr)_b<:*)Zk?MpR5,/85]2e{C)ES4 <"2Q! 
G'K ^Pm5`,9b[Hl[4gw`@VE Ik!UHΕ@MBalV=3(* %B# 9`:D$2 Ԟ PC*aJ(2fֳlg^L* "Ҏ:p)H8.Vyr+GYWQrM2 Ȫ_ne/ u;LsQ)Hk"1huB3<b8Y??h6:Sڒa2'62nQ[mQ~j;r`wwޏ{Ej1H̨4 EI2Cb^@8F fAE*ږoE< /TlSXO,z@PCpẌ́uX0YD,!G'" ?bAx\#z*# 9V;dfoqy3<-oY%hN6/ح4.DY&$\\A$Å@(B@(B@4%J9n&QRz C{I%^%JY[Rz(D)D{9^R QJ!J)D)(tDr F;iHOR ,0pxvw>MkP0%kUPFkSydAks{Tg #]H`-|?[V[ٚOJ*6u7]UdRZR5yЭtRPqSM*FXP1g&fѵ[qEzjFsʬ8E\YTmmY*3?~?4˧F=LxdN0+D{-"c:ô.Dab! \HLg@YGe9AQN%/{iL {Rt V!!: ĤH[XA`IC aKI9tcѲhZѴiCjUkJU,y^Y ]eucͿn$,ʪﶾ__bdxIZъ0+BW!"YY+;%1[5Vjj|{{ ۩&X`T9F$"T0Ԕ1тFB*G!eLD:NwQ@|<{>|Y<+/Z-Ǜ`ڡSK%BāIgpC#pFt&q1o=a  =LIVi:bZW@5,w\%Ւo]ǰlhT6?{O6_QQW{-˙O}HdS6K.Z& K߯{$ٲ`) xg{J*5A\%X$@JRhr\h=P ~%kJrP+sh승):lrcף" |ϥ@:BH})EXz4]]c@|tMc:{L(>FK@xA Wi@)AjFouWye\8qrHſQ3}D sB$ډY2!=%p, rƼElPXSr*=*~.#T|nQ۔tsT醕4E] v'QcwY1jdEL)zn-]hX[FS3! ]E]{LH^VЍODN\ '#1)r+ayNSwc<f J{Q aE-'!0OU5x.wobxQb.; 聕}1tvoU\@}J@[.Z7~;W*̉8\ Q' NF 5{q)W e^7)MvLU|0]h2(q==\8]X.׭D9>?Yp_#/f6:~Z UbSi}So9b_ )002eh}zib;[&EW?eMUɭ)$ʞ A ؼF|Wܞ~R{i86/#y\*g?ԑ zȏG"'#Sȡ\Iƌ1x/!TGOL&楖.T LY%xxyܘ:uFb/"2`Zƚx4vGՔz#7ZvfV- '&ysL $k/^x I DGrHpUň1``Դu1b u,/e^:O +[`)W |:Ri RxR-+곪LVe#YUYUy0$=N#")Ks"\d8k[B| tOB~L9讆5cB7 =ںp?|~rywB3( [fTg}xfTpp}3J!X`F¹5`{ md0$*7 [0`@y3O"v<@ҾcfR;m!\~(A䪸lft4 GCbf2U1!!Bi #`ȑp f}ќjܶ,ZpT1iюZVq@Y2w LkAY&=0}qiB7Z%*q$refzjן5Ke=‸nC se׾q:={3p qm#5 -wT W$nc>#Lpn1G, Hi!BG84Pl--(U81qŴ" xos?dJJw*_>Oɗww|=<)hSfWyCe'=gL SLP֊B~ܝ]"&2͖)NB$ަ)}c!B}W?!!-K-FcOַ.aQJ: ʲVxi{gkalsc3Q 1r$X+*R"a2t7>$V8`n3߂M,>=z?>;:[^ִQ/t꙱V7IvǶ?-Ǵ`it}hl?{Gq,iB6a3J8T|P%kZuif;a&l|NRQ3wIQXI [{a=c8<,xmy]*HGizϮz%$2n>$(``gnK{>pDNt|v{ $ # Q?(Mx-,;:QhpJFc;d&zutȓ&kzl.gFVzKjkzuīgF}fl>3?ᖵ$e#);9ҖʲTtgyk>Zx/UkaXT`BJz=xqc_Y2B17ouܘ1pq9*Q}ڨ-hf|wFҬo< bann!!U' /Q3BSl'?v_b3B2blE#70=Ci4Х.Đ 3.ڼ1(fQ@Z4iBBM T);̮[X$QvP)>3W:,̷7͑i7ZU8ya.jA~᪈U"WG|wlnIc(Mq,81S]s,fA1Llmk kaH6M$:cvt.(fbas}1V(i,mmǭJ=ݎ]Wfzo6K-'czoIV.ߧ**\#15 2n=[ݯZ yfXHY֭!F@Y!imτ=R<5_ۓһwG5;9Og}~X8$gN`rr]otAkFE.J7ScۿA}צMW$z|cq1 eb ")"ٰdsvc#su:rspqdwW wE DXES5XՊ2LQ  79H'ZnRRUh^)|gJzoW i{kM~}jT*0{[Hwʉgw{gon?jw2W֦4j1ꖓ Ph(yjh2x}+nwߘs(.T=jlJHGfNlUUXbYjԇOc[#ApB1E 
2?<0L&9PNKB6A>녋)e^kY#kq1[0)Rs _>PgcŀVftBLn}9P-N1yjzN5T5yh r2=WƶymBJU? j!ݚepemŠ)PLu!+Vqh]nF%y^,)b!ϟ rY0g”QA4g*ؽ//A`<8 SO1f"pۂs jGD0`1*xDhTEnec/rړÀQ, U"" fю h ["=%A 3l8*Ewhg 7L]JG8B%AƋeeyOԙ`A~rEv([/2 9\ekɀښv(C]i[d=2a| 2s]w&hfv 6+1]iA\aR$갘ľ28|?E:`*-B {8{N(uj^VnW6g8>>KK?>%^#Ԧj@]}WH]>4ifY^HrWp)xL|4p 3I(HGNL &]İ@|}K7#% ;ĵL}ՄPR/\T/> 5RK_ ?P- 2T#Cya;/i㋘)~kb#ycE0VfK@P"Vy2LQB9aiv\sU3s}TS],S+EfIEJVo4F0Duk#Qt;>?Y &?O/0a&mzGu/ʔUbScM}h I:,T .\k()ϫS jQZ/A\{*0Ȧ ,u0|qG8}|HQvǖ Ud Ph۳ѯb28+q_E?vg]Xc1o0FvodGc'wW<$J-Jn v[ww(à l =Rr'5ss | f1~X =(]O 3 'SH f ߂~D}A}V]ᇋx!J׋f݀!z9%ro9qlY΁?ÃwSTxp(:62坏2aЦ~C=8TX./zsu! y[Ћp 1k"0jc2=8LG 3׊Kd?[m* Mש?6%,\R%j_TVj L,7YIJ%`xfZ]0oʂ[eϷyz^АagbƘӠ\(5F[847+4+o# 3}/Xt̏ͨCF| \VC#8o˭z')Moǚg G kM& =`HD.5E;Yg׻e؃ ӗ%z.3S' ? [,aSʁ B.̍J-KV4'ox)Ўg0dֳdUfhSmAsE\p_`7[u{`heUN!+R 46E{ PRdD)1lf% ,cX`kd]r)Pʍ"66БaYqGR'*R.k%!RHDwnRܾqBvфގg)Nz.[ *ha0pt .ztm;՟_)EuMk{tA|kE{ވ [;,vCtmܿӎq)3%|;jv}c;@HPio@(9.i@P^՜)nqw"S,`3,:ʠFPE5x<+  N *8ʝ7\ (>NLj9+j$~B@J6i;=b3x֧tJ0t=۹\WI:jaG$vIjYeoU<1I7_D0)עY WWWWWou.5T[0_|>k'f,׋ptk=o; ,7KY@Πˢ&s07#;`Wp^W/J>HVłMH A4[wZ-!ˋõm:r1ھMQWߔW~Wb֋ٲXoo޼.$_gj.צn_4N>׋TSuYX6U4_8Zj}._e 2ECH3ƒ"Sҥ&KGp, %7 ʿ][!D!e!x0k5f,`ZF 16MVHKDgE 2lmhZ0T阆 ϗ+j\o1Ƭ#cOLg=1ti=[c>Ik'ytjENC]vgav"$ZiYqDn92ǩy/$m#q7qzڝ!kz|emdw>[]|ejm4HhV_ڶGvCKFd}"-cn g~aY+wO':tщE..{R]ޝ)vٻ.ﲼx;'mMèq[M6[O)Q]Z,E:~2Mq2/Vdf,ӏZVn D7֗d{nV@T*mSsr׭I`n@/ϮZsTZ6ͨ7Ro.EPPQAsݼStzp. T:[yj1Hͨ4 EIJYF opHAEz)3y򉎕OB=) Z!sV̈́u6aXC_Y 0|ݩm*t>LBc_&z 3pK߲KD/ DZ{\0>ds\R$ cA("?AR7xZ[ceݴZKY$d"iY\4ai,+`/3ƎNvJۓK^_}%(%e7YuzVzfE}拗L#ϪQηBb'{)W?4:¶^7۔+vs%+ᱯMkUzDsȯo3@#zb@=۶[0 ,:ρ_ d1~, ڶoܼϽ Q_{ț `R;ɎU ϸK'jEՒZ nSxФMOO]>w7{v!3@ ٱ܏! OpR$`Dm>:As<18ѹ8Pz)fZ(3A"sBYξxjA /w F4s㬏=4ꐉp_ -Վ 4C8IN=C&ld?AvxTHnz#hs>Eg ˸yQ$W1;уmM|y[#Kոepu->eK 2CIv,Y0?  
):r"Lsn~w;[f~To aFfRd|=)-Đ" dL|NX\Ysf-b^Eک5y)IM'H\I~{ӅTkϗ՞o>풸i}vt&O'C0!B*[X1c2b=6MV8]u>NA>ONu) -F|0ar8{+%R 9n4c>[->&XɬIkVsAkXU>>R}Y \?Kc^ T_tkB})n@㕏Э7NqS7[^`A ƜZ؉})aԮڵQz:m)U\73O_~^tLL_ո#,m|4 j-$ TH'K1DbZD"rRu9:0Z/H"88fSbP A|@N@`kA c}9@AӅbulԱ/HǴ (QHLd$&E:łKRXfHuc 6jZ6h A)H z/w)ժ|uŊXޱX;Et0&m=R)C)KMC-N#$a7_$#<˙p|uϾ*nyu>,\xn᣿&}o2C:6X/yӭ蟾?*lf˰5@f2r3MLȫޫEy4 AB/06t=fsyx-!ˋm{N )>ʿ*>bAP1:6j- #u|[$DmU7qy^l +G_*/WYO7ÙYjV@௓,7.X60<~P߿K ;gLr9?ŝ{J`Գo~t9[3$ cim4FP5. Kk QpY'Locq3.w,_H~n&Ë́!VJ6pA(}?3oz}h~x{n_:U#ez5ii=IKN"??/,Ոos Ku]%u}1,w-^mK_:9MZF[Ki8= g%@XdPp[jWoh`"_X ʀ:(oYjXAUȮb3Ԁ nt f0 HTgi!? w`@$7{A#RsG +=͋gq?4@@(0LhËUU@a$ }5< 6M]ת8/:\ :ܯ\MՔV/RX%Q0MUI6ԓWz^7Wd{,` Mrw?oV7luz yYx!Y ='*c jpnW20?tV߉/懳q~rpw_ouo FczuI3t1]a Lbw;4Bt?+@׻gF׳ t0!iR?<%1zf7d>0'<ː, .}~]Ќ੘%N:zK:M(_ vMՑ`h^@w^~$ l<@Ѐȇ+#}O#+&+@:_Kn5S'Ukû?1_S H!~-k>>~o{Q-vd~FXPX1O$յmUGbQKDbHi:xwG) Q#Z*-3atjЯ9z-TyO9G~Y4ط -RpSʶăYlPW60 ˗@f)oQtTY6mXv(qw!+5fc֘-]ބN7Mt:݄N7Mt:݄N7Mt:]vtL"GK0TzH$GHO5iÌ=kY IY نJF˺Cm:,X;2jr@}u.Sྲྀeˎce*0E1OUesbt:xc][QSZ+/z. lR5<<͓~!'yG$)ëޟޝk-1Og(/G~i+ _UVrL*5[,x]"2}s]:=?mrNکHhG6wG{w;TL陯E Y?aA#Oy(9"o:Ǽi,;riiWvrXJ-z/oyA)2{r=QkǼvN|nX.l!c&q,n iYDe͢VS{O>R\fz@oBvutE5z]/Q=݀ i%D X ^f+mec*Mg6mgKR,eH3h\K]QXwYVLHc.jn=sWtCcM܊Cf4\0DU( #2#Y;l¬ì_$uNfcqxvr۝Qߨ%tw(.Ꞁ_- o@r<=/xSؓ %Tv~~))WA>\l.z;1rX汵,/`303?>~gޥ^6ޭd=|`!|K{X7^O9rrj:e ȗmqlT?fqwMh_V:>ӥ1+KL=6,FeoE@5@]zv ^"(4MNbg3_WuEu7w-t\'A5ْ(nTrdՕM{k6iQKqzȲ㫦n2Ol*Vej,P5Шk*-4'ۖ"f nKŖBs5{۞ꀍXo94Sʨ1eӓmJ)~%)QIxg\-s 7DB'nJoUиxI$/]BI!@Nӷ;V,>gɚݷ;U>?>:ߜ(X|j ~4ov 7MZO5:mk*-g]FIfl@LSy2]HCf!4jA=̗͡pr@WL`/Ƕ'k*pY l@=My~ ?чdM# ;? 
޵Kl>_vpޝ>?9uR5Rn7G487Tni{uk:y%?@j/uk9WPg׽(,:>F@ '"+)$I5|k=Nʩ̻H|uޫAZMZU8(Qǟ.F*3Cc`;$^RE 8"BڬVn5ŰaxF*Prv[]O0Dx_jИ&{&@ ><=Oi6t^> ZgNt蹿DcR%cEHЛ z41jv(`8o(:;DId)GKnjz~^Ov13: KT ؍wxy{Ýw.b>ګ?P|wJ~:ݖq7¾yy6ߞWT]؃5~MedT#Q (;W"uUBS+|jg)+XUǫ PX*% {>gRUq'F.@ ЗT+w{@%mx03SBAE{!t ,1[Mn PX0Tf P3E@7'_Xчnx3LγXyA~6p R2x~2  HJzpo@"yFj8KRC+)) )TUVe1 )Te5 )+[Y3eBสYٴ|hY֠~lTױzl"^v=urae՗^;+O%eH8uvzQc4A2lWpс'E?=-cǏk֧P\x[V^q+pkKD)Ӳp~,Jx0{A'-猿+3rsbz!kg.)t}^i7ɇ&s-`쎯Zڬ+oRj65'5{@+sV\J5VO]vdRe'uߔm_= lw W|Rv?` M]ʁ]pZuFAri#'zٖ[2dXD<G"uR„!?AQjĭٽ&kn?~uM.j`99ݧX  xP dڑDUofh})?12T(,Y+"`-]hch- (OXꋽۊ ?e3^ gmb2h`,>u0ק}-(l[Twz-jk>koAKG57~ZeXꐋߚ*.5ʒ-@R$WSd*j'>ah1.E}$uoHꢷnH$5k`S%ݛD ~А1 R3b|Xd㙯-ocYܲ!+>$s!JMcH[{_9=}!CA eUձj7. ga$_3|Xa+Y z&7VܤtbO"#XN"~ѳS0A$pV09+[aP~R#p`)3NZ7i==~hO5z ]=i=(%?S0ϣ8jC1톢 Eɋ|sgR3+̩i HL9إř[_%1kBP[5GUTs$W7ɐmOG ly:CZPY>To4ɣyWgl~oM!L0o%\šר;G5_ ]ơ5|գJ`@.ˆ&5]Uh@}_^0 DzaCGk5v<# FM33p,] PJ;OE:3ҳO%l(f@ָ Sn&!F)6%ܛ,h~9dt5Zg0"3RPc]ko;+~„o.0v, ,np"y$%7~-YՒ^6(6SI= J L\ YȔF+/̡I"NnX3Ks!򐅎1J IPL:-Hp,9BB^^^ǖ}:">|:*Xz[xuj":P>RQ2- DR1"Dn3IU?sUO!q!G)cP;4SE4G "IδꑁԗrgXÕږX5HD `B gSH7hqfFo%}KN_[*>>'m<|_ w_lxGM#, xB/I}X;D8IeYSʰ|x3ݚ%kykߊo&<W[l2͚75haNk[υyIC 8O9M7]X0Fe=ewc2"HOT~|"K"X)ũrFxkLn6K_c[Oc-~,&i&ƦU\p#}{nu?ü3Fh# !dM8D:wׁz 5[jnN!]gճz?yVoS꡽ly9dÞ{9r5C;1>{ pTBkB ॖ.6Xy߶c78͟4)R-LLg#d.{dwkàk/ϧMOO)%Si2[<*t_[i,Nćw=y7}{|sJkrf%5MxO>0+b35֤m{lO6{>ݳt- $}SwDr<Ww0O|̣sXVu=wduVzGR4hzhF`_W)#?Q%$ÉUy jqEu2ppC(p-u2h:exu2,S 1>^Kfſ RL"\&R@OKt#U;Fto_4~RȷI ?,_X:b`e,VDpStaTW]@ًM]9\x3ʷ!W\^{^u).[9xB|F~7~Kh;!]T/[^Kzo0+kYME%}DnST<3a$t]OSؿ5?y>4s*.O܂ sóէ3Ġɧ1bCj}zJlH:|i!ָ.BVWKZ6Bu{]f㥰+ 6o8#wcu{}ObzX| Yh kx!uО :(eg&Z1鲘zs~]Xv5VuI^k}Vn>V<h+fux(p2@lܰoeMX˖;pd]ϸ,zϸlze{ ȹfn8aiIpcMs#@Dt IYP]Kij>"Z廮kļsZ>Z(s, [QSE 2d#(4b[_75u9M/'kEѮremQnSB -āj9$22C GSɇĨKu_Cbuj˻w\`\*9b@g VFAeY2"9iʅm~:?ƛ1RvJKI>Y' N$~7t?LUKRe)>{vlWʵDUU+JEU騶}%lJ4X# ,9]l+rC`WO3:Ik>9)O QB >&љX Jgf' \e(ȉYY( Yr6󬬤6`> @ s .#\ Z5/tuNRf2HҢ2U oEuKZ4xjwY* e.%QD%E"3M53|T'mS*Q唅F Q[mv\":Pu*(df@}U]8Y8.`2F(AZN&d@q΄q.j+x-0tl!9ŨYQdL95wDmHIL\;'ׂO >݁xSUWڇ ՆK,垱WrSoIGV1<ز%c0 y-:01mVkp ؄4Mgw] v;v#A猉dƐ(! 
=b;lJ)$hjjb{yD  %I@ک\:UcF~l+KiIMFjl̀ݛ!g5u#u+KyQnx-2_}OSx0=cCh0"v" (.1PtgjuطPQDPjĞx g B HRd%Yb8 3WA/n"r"d_{yhv"H)t< Qh[+Qm:b8`-;,ÕZjbeb!0Ej,wq]lÍ&v~fǗൢS 6pFtH!,qE#cŦsνה=LEe)h>67+|qWdi+JdM/_yaştHS~svKI-2H)zɄyLx1L\u9 1X̏{2|;4*(BR ][L(҆JjrܫƮZ|f>6ۋ=K]Ddݛ„X7=5d>~-p|*1 [夠U&`t)ڜ[HQ3-cJ(koE9Tt’~dV"j/64*g0'm@h4}t?5"Ka1]Tx/&h鵌C4Ej!C0t)`kCXׁe²oh2eM A0D۠Q"cVlqn= TU!ej#@MJ Y&< 胕I'͌H'@:.p}C`'hYVVvL]5Ž=UJguŇH{0& ~ $1uar. 1p7F`L:-Hp,9BB^^^FPpo8[,O'QXN Rr.RQ2- DR1"Dn3IU?ǡJT!@2!"D3.H0{ *zfp*A@$)tU O רR2P_q" $K%3)L4IW=3FCj$qb%lvWwUk쿗hG8\3wSXY6 F oY|6X ҕI?~Y(N2o2(|~CY CM0I1̌|9 ILgq3`uSSd-6H~\$pcVITE(IPЗ-"GAjB]y6V$;!VaAk(lb\"{?JQ*,p.!֍Lʑ 蟥; V w0El.Swffpmg9ؔ0y7b01uiRAkB: [Q$UXO.'-0ׁTSR0ccAfM!lI|&Eh?-^ḃ$X ZUT=i0lЕ`z jIŝ%+؋ aUVϵ| [WU:%n0ew^Mf'MVnz]ʤQ8ƌYVT(t_؀O>?1!2Kl)à O@{Nj'UfvxGOݗj"M.bv L_bE^FBDJ ̙F;=T뛒qR58TXx(}.8E{UTqC)1Q HGsVWKv񨓍Ud╔j;$2+N]> a}UvrR37rlav9QcJ*6*% ۠LnL )/lӗY֑ic&}!B2_|t&ݦ}驂]n,YydcOL.DpCLDYYh6^#fgKQQ-݁5M=h|ڌ|ɿC^C*craؔmdckyorofG`2O̝f*9U F/aN(Esfͭ$G~Ze5vf׸*'51hVw~m37vк $sOoѾCӭ$Q|:V`f'e61e߽&Na:Rj'OR0;˜]\Ɠ巺1޼'l?^C4•;JpbwKՖ| Cc;B!~2G(m\ðpv\qXt-8E<83tny$NLj9+j$wqJ+6nI4Puc}/ǖL>?ZzOL+IΫzKS2iwAҜ rƩȕt.ז wB/QlK9r6>eD!e!x0k5f,`ZF &6MxBZ"Z+jQUjP}E)W]Wy޼#r+iLhg5;22Ρu1:/[os4͎;jvjBZ.j$7v>Ϋ z.n+͹4 ^WVrLnXiӛ6dV?߈݈c$V* k;S`7+PW &j>JM'X`cj=n8լMZzAC9cNG)qiE<(}v#W]@w?re^\,v=#XOo5*Kp|ylJ)BTݻ88_ldf`_2/4O_j׆/]p`-o,m!L>‹6?lO]kv|Uo_԰כ^oRo&uW`{YUt 0wլX t!*hh3E)^ 1IU"=t5m߽ ٽUȬ*9ǡmboiUDDI}}/-ηD>jY锷I ˗Z0^\-Wy(?7E6I3~sCT^bCiYoOߞ.7MzKB(^Pʅˆ5״oa~n/v-b-6MF/6G%;_zk1Cnu)Bwz/s)]QhkZܡw ۦ+ZW,6]{e1x;s涛`ƫls>ѻO;Zx-ıfK̓LS+K+8X4](]6vuQKu|| C֭ChEKc0d`s 2kgf+cG$ca9WXת#Sc2D'1DE CTV^ͪ#>9Mh(:4Oe}7>zs/,09MyE&g%ƌLt(AfpD.']IH*~ @%ǨO8$]L\aQlfFގ/ gF ePfntdG¼(E41_a>}`2vo P)QìJ~v;r ly07I~Z8%`VWW)W \ dte}W|F8T68{ Yr㡃-t Ў&IjՕ P+=M~Gs;ʟV*sBrPgNh$j>vJT sYoN|,\%;W@.CT;\%*k{:*:W\0tg e3U(JrpT*)•I!i:WBbXW*QKѱUpup%%Kb݁D+ $%jǮ=\$\)wDcD. 
\%j{WJ{:A sեP;,Dw3pJrwiE=\ YOϐ/W{KpͽދJ͏I\e=FDa!& \%r_f*Q)UWWVXPt# _'h)Mm`8,V.4fÇP.2nAΗU6UeQP;|Ut6 դ`tǷ̍I0V]ara> ,ˆRP-߲u1eJb\cqI%:eݮX NW^ǘ*b.{;W)ܲq%hl Y9܀(.8U|+XcUߨkQ@fbyDkPA/u K*/}}läxڬǟ^i)w0sAL\kh#Su嫹Mfrnn߰3Ϯ `SQ9C)*,cO,V3A RQ(Æ2H)F. ~ W$Q^I ۳iW&9/~e7i>}A.3DBG5 *Z t36/ ˱A25A t 5v4VjbeXb(E#mؙ4z<%1yaܡ_ڴ8]}h`"{&CDi9lE2Xo 1Y,-,xs6 li6LI JRVa.Q0Cc&0 e4zl5ͭ*5e r\M#[Ba$$$cN-xR7L{ M%a뱄 %@>P&A:1[U04ɔw R`4hǨdt:Jä9  bR Bbv:iv h_0 Ta8F'JjkP2D[*x"c!tsHNZDF&7΄(9ǀs()q{6T+[CvJGkA9ZNcau!&PfTe@'` 5!CCfH$ H  Oh a | \KLg}+Bvub]}DdR)Ǟ=դGԒbc1*evY}9ܖNIbѥ`D)Ȃ14Jt)*k_cbfC h#+ -D; d{j]j};aJ$~Pǂ ָhSk|Fo xJ\^XTl`t[w\ u99GDiDM2U \.JC/ Ɛg-K7 5!Q[]J 8PIр֠5Zp7Wg\. * a -u1Miڜu膵-`Ѫ J(ѽkj $*J'aRO`q\b ~`; X-V-tC*d4@Ԭ4DfT @6$+ s{ `< ABA\Z}2ɮ3O%e:|=@cYf22in vEm%j|eY z3tWh%Wk 2[r`5d!Hc7uӝ1(QdFt[sV :,j8&-0 jG@A.M5.!bDjpPRH M'D1š3إ.J  P S1 F@)Ǎ UTfґ5# 4}AI6dRԕhދ0 ]Vkd5{RRRLr 9 VlZ52J-J)MZ nmXE@BCk>84i@gm| 9ݿEŌUDcMDs| ؼ(DPNRB"//0rq_r/tjAգս H}6=D* xs:p8O3G?m;@V+&{P\!iZ*#h`&S!yCs ~/A\EYьmV D w*2}"CO ,[ 4"(Qw`ҧ9HM+Ns  Ѩ/AwAK|̡P=n;t ((L(۽'XRr.a3m>%(ZuIBwH  *-fݱ` &4+{Nw$= ̓ "tS?=kW!*joZ 05^%k +@(|jNa¦3_L ipf PH('՞ +((qKkW\I d~;Bi8=֩) KCQ`PrAwf5$-]I4]Xzꦀ%Z4yH'3tE#C L rt )0P5wSh"7˾Wv*07 Ф܃[ρm}sP[Z ?Bk07keܴ1:9oQӍHgߏ~in>1f vU) '/*ft^]N/j[n~5ћsyٛz#l'RWIi>cM+\]l^K#Bܔ}5l?|/֭>2:AzQ:0FDó7(:Fd1`:lauبF6Q:lauبF6Q:lauبF6Q:lauبF6Q:lauبF6Q:lauبFgkAowᐌ:hZQUu ,>{BY6Dsz6Q:lauبF6Q:lauبF6Q:lauبF6Q:lauبF6Q:lauبF6Q:lauQS;$(1xvu0Fz"Q+u^QM6Q:lauبF6Q:lauبF6Q:lauبF6Q:lauبF6Q:lauبF6Q:lauQ'x+C\whn}}9 rp8lϯ^ W&9 P0 9̊ XuĐ{pc/>!Z+ %ڋhEl/b{ۋ^"El/b{ۋ^"El/b{ۋ^"El/b{ۋ^"El/b{ۋ^"El/b{ۋ^Eg+/K J8~ozì}(vZݣ'|E8u0" C!Zs!J+_ѭKO~? 
V^p(pEѪኢ4W4G+ wS=w(5_"\YAA?p?[a+6W2 p 믽V6sڅf_gg zYtՐ|nsXzN\?K)t6%́p0׈UV?{(JEjt!^Ot-b0zה8|D}}t>U]_{vs##m]E-"*BжoV~e=[-~}BGz|*7ۏl/o]ER%\Gc'|;=tHE_Ɩ?MAS˲\}Ajn: \vO^}`~zkc CEč]\O"hnSQKXkHWO}/jqҝϛrnK(TG-LdjQ z}n~%YtV^ꎍWpn[@8}/#ml/k#6wcCD g,6ogˡd("alAd;w޾]Ѱ"_7y<ҵ).7֯wmw`_h oz[`&q9ԕob>mz%|-F:KfZլ%탋ޜ_ͬ-߃G\q@@U}y:x~Gm`oWaӜh0׭(_ʧݐR.Gt #:'xq<^Hu, \߼[Zfˮdys bT rvErv 𡇝~둇|Ngz{HnP\ [Ӧ >L0ˮ35mznSҝt:Boz!L@7&c(JΕI~n$R*%qR[eCf,wgЄܴ $ɋe:8/k<k=v_{΁ah|=p[?:JH/Z㽘/GX%t WnLg|x.0_;#Gt<f qԖiڪu5+ ;ux?ލRT>5c 5+:F0$$4wNR*E;ڞ~|G EX'{> NY4 )ǠM {bm('g1uBSLC<@x' 0֕/xC*gPʆQw A4[DݷuOtuig\[ $rT< mDp{,|aưC__kjv!n{fQ!n%w%) d"ʭ(j6D[muWRbU:vZ׷re9ǃ\CőHLkB<.hMĒ\To$ց}Ҁ#*Djvw!PBKqIpi,Eːc"I0U[4"w5o.;\I> 55 F& `c&ˉw9'}3![{¨qfn(pjS>).PM!g!: ;NYGDAE'ʁ\qCV J7TCoc=g˰/kMHȩW=xrf 1!-C"q̍6ճYG^yȍTK<:8??_ӛ XSǂi)cmY 0Ne"Hj99耥7yY%d:U9_m/w adf͘%pyJxvhXb#VG#$KS%JQ*d-yKʬH~5Lf+}r'_4_}nv9Q S쩚"Ә3C)JP1f t}G\+m.딁'xU)J2+c- "4:$ xc, JGBylot _FB)+X8Μ6vhʘ2D*%$)xߩ}ùK"?mumǶ^w?=A;ADdӨ]AU[ 8JލY00 sQf[j_>?&Mu_хVԬyWM(49 /;\}{\I]@rB xM%Z<5,> ;uPĠ3VLI= G9ޢ~:S8L~ԝUb{ϩ=t?AYU2uQ&X( ;2|o N`J0a<;%#تMvG*F OH(V:n&WPgx֡ݷ߳P"~kOl5ձT}meee+FUf=F޴6Gc7kh8xYk(op>5-#x7'>0헐(s:YW(o߲i Mޓ2ϏO.H(3'yCX]AQ>QJ&rA>soSH,F]I@n/|@؅בUlT0R|XXEX@$bpw*fA+[~ԋ#O|:o9Uwk__%}p/ţ_z W |--\3L^z Lॅ'lh~"VYv+dGᣱ!)(a7Q;䬣'I7%jj0I!|yBNB\E$Nd⹝{ZDS?URaJ{$"uEʏ;RYΖRXp`PܙUm݌rdcQ?fT2*H6!{df,a}|Uvط>ݡu?4b#i嗨A#4e]X҉`꧳{L.A+/|%Ϻ3Ϻޯӥ'ݿ-M g4t4w>nMðNU.Y^ (vNKDYeEn\ܜަJ`ۋ |R\.,w{8-m-}V\E.j4e\B.k]$sd,y>\󬋄vʼnbk'W SKk0;ٔ" 7lٓCen@?Yr֎$m閘9H*zf[hn&`E#tX^BQkNj۵ðqc1{9l- j$?"78:;&1a?mg+U&qd|_XOq0 u&tR[rϋ&2v٧dq9@2U oO'=~<G +r0z/hgb^uF:fo!Jsy@x1WMu8C?+w+0b9ĭؐPV6O9W%#I:DB~#q2 񘻠 6KsQq ygLXRh@2DE &V>1cs J_v|OC*Fj&&8a.x!GBDm5,JQ[*q*rϫl\7訐ijinC;Dܬ<gk"𵝍9^kwLZ<9pܹ;#׈JZmCvdz݁&229#̣Q3(;WUε]vepFn{N[߃Axm_!.=f ֜?|MG9 'oneG|#.*_OL.R>SKͥWT2]* |ϭVPc!3a378ZyQe+~E]u뉐+RW@)2jUVKWWJIJK*^l8j,>8υ8m8B5oMGx2}F˽c'z=!x0Ohܻ|<>ˠz>>¿9 ev2g&L/{E^e9ggٶKιh6}wq//y5}޵.\l JHg8 G^kjE4[vx0^r K|%,%\’Ka.BDFJ.aE%\’KXr K.a%<cYr K.a%,%\’KXr K.a%,%\8Ѿ^J~dT,"mB{ℳ[Yp BXq?[J.?p8olz\4_\]# >^ASg=zN~>"!{0v 
rߛWxPR8()9ӞSE1s!Ă2S5ZN,X[cmJuL0ۖiQUf;lz4@DYeEnܔƁr3.]hZz^\H,Y|Y 퐫3 V PW{'G]]DdRCDdmqYA7ڀ.I˭MNj iBBs[XY@OՑiʏea*#К2k} ;q:ފf<2Im<ުc p#`r1 ˵+U&mX`6#g|y_d3}ފڕ4nx^"ʇAQUq8tp'_r :a%ZӯͿxA;c"83h| 1%#\3ʯ`z&xLr-ӁZNLT.2.?M,]z†9>Ni֧B?OC#hO *bP`OԗR-/.[,4<Y%N8X%~a~#j~ Zϲ'WrCg|2gYRc__2?˙jgH)(f -Geip);X9:W͙j^^zU)]?IR9 KCbG*$h1/A,!ߋX;T뀈E)82oܳN\٨^fhX͊u,q0#U(fQF^`RyɝAQT`IȍL>&CE+$, wZ)ယ= )D%T{ ^.$}HثӜ"cE>M0@3bTiʂG%Q,jRUU8-V$H1%%Bcޫ8]ۮKD+~9_񃰄,1|NgJsU@2IYGݷǻ_;nO_VQn_HKwE_{SXRhuKlD1!V8R"(L"%;VNy!6ߏbp |ښȒFxWdI֣e%u Qlbɪ⯊*`isy䤎EvcCBq+#m蟙gsC [` bEiWEpDO@r&)ͲU,>~#$_5C Z_n(oˎ>RL`uL$$h"`.-3ΈHO Mh$wObƎKp@ N~'Ν FJ@4jʹJmw!2 VqyתY,SQi0FD0Mg< ͱNEa-Yk4J$13xuUQsƔRÞ"#6ҜyRByUc§8oM 2hU|/?W TIkEqXGgcseE AJ "240E"©ڃZ~6(vR{}(h)&&!JN6iGACuf hEq캅yƵIxb\=V~}Yvߛ aRKQ)kԌV\*gH!Hs/ T'(X'[>9FY4 )ǠM {bm('g1u?2"shC^+eP}hh&ˤPyW}l;Pdy\3P9LdPbAǂ$n=d1g0!@Oy73ˆȺs9MƃB܈ JJ4R(pmE4[Q?liz}NxY:2|KqW|?ri9@p?g-Ӛ `1$1wu`s4`p Z񮻐(!Lm%8}='A+)<(f.oL_bXnzog|wsݵ<<}@n g}akٗ`J{.j lF+Ph+-g;⡜C0+8A! IQ(CW"v2 ;" )٬.!@К]_zO PƄp©Is<4#FkP9 Jf]4w=uP(:y*[ ;+Έ^MEu\,kwESQ*b[ެNfY9*aGjjŚvJ@)vQb߃x6Κhr~a4DBNjTd4C i dc5ubAs[lJYt}g0-))9ӞSډ>_M,(8P#4sk hܴVyFs:LWUf͓dݶ k=nӨ:tXoHN;kd4Hc,qjk%N(^2 Җ{^:τ(#LgNA;%͘2D*#eH i[%咒WstwcY֣sa=*#L;ADdӨ]A̙ }SmMrGPOITDQZ'jX5FT{]]AzMr#Eb,2fk"8 PPL~ *_zQ&'7B4Lo>>ߥЦW!=["ЛAB0΁$32fTci<.;_MƯ?lX&NbrahzvzS嘚=\}8 9EPB-߯|mhZbFf_ǣ4!VS!V^EǍ5Њ)= -'qPBYo'w_Y97 h:>|;"/Ǣ8&|ۓ6WO_ `0;%#4>t{M4l?b` وt`bvr@U'k:__PZ"?~|Y[1[gl+\&o^ s`݀%ߔCaӫJ*W `sItt Cmur r,kH˷&oI#pǕEt&"o` {WEШ L( k?soSHL_I@_`IE ϭ:`6كXXEX@$T$D͂ mgZEn'گv>ƠsDˣd_k#̻e3X^C| mx$ݬLY9OX@.jk#\FL:Mn:6FR2"o2q6 F|W9x7 *o q.HzJQ!ZAXQ`vg<gyuuPܷ{#՚!nC:pW'=v'zm,X d9o{gMۏ2D'F-;GrgwCt[1Bѡ;@3w 3+E.NaVKS4);ڕΨ\uUrlQWWLaQW@.PWUKWWRE]]RJPI;r8 QW\tr,/]]e*iAWר҆LΠ+:%L%-Q]M쒺)={UV]=SQWҋ#8>:\ϫruu\G+Qkh-)u*2^T2Q+j]BW4$QWܓI]ejtu+TẄӺߪXΨL]QWZ/]]㢮P] y|W`%:2tE]\F͞i|uU5{S; nmaNFpYg!j^6k4/դ̊-8ilTW%S)3~T( 3dh'nW>gs=P/oUƤ/f;8՛]JO3ojo*ϿdhvR! 
V3'kpWPkeG%S\!!Ԭ”%{I^%{I^L 0 n3e9ƺ7(8C߂:NreF1@!!)xʹ*Ю%hjk5FZ~%W)X0atq*(7o"B`0i lT^J]z vHͶ@H("lqš[v"-sa],ZE"E%( 'A ³կ7ks";|Abۑ"eBbMKjHG!*D\0%7SX[ LF..R ^Z}m_wh8Vµ/8fC*Fj&&8a.x!GBDm5rT>ROq _$}XadFlu.7}ĆA٩7t=w=Ϫ> ߜgK42uhl'qx8YOo S{5skT=Lpze˃?UAK4=?7>J3SbQ{ExXZ30jHnj4疬c<"\e7(\J' VIf\v#렌BPQ29X[ViRݤ)ͩ5nO<-,C7Kŝv53q(3ZXz4y p 'yPNٮҝSS7V̑f)K(2P#rBMX2ٞ+d$ &`F)8[ֆ%|sb306JB9fxiNNQ@l`,* u( @!$DwxdՄ,chBBN?1!.qN!Z\u \iU4HJ; Ag'uvҜy~6aHe"De 31a-"H,&-4+TTDjQvѴѸ ̣JOf%\餪v x+K- (Ee8+\YKgH&)1HIZIq[{C_6`#>෣O?!gp};S&6JӺ"!sXTL^<8e@ilreF<Ƀ+BU_|m߫IO˫Y6J+A>(s6J_G늖`ˎfm^‹B\9n %u4qZ0Qy "$y㻖;^QuH?Kjti36#pZG6x?qZiw("j޻+{Vrſy oZzM fwWJ$tuDy7şn&l&8)P 6PV%[NH~('6^=OI_,lFf˧o/sa m%U$PЭ:ۡ*@S@# '76F9"0i5_)ZEەΖs:NeՋm0pҰ,+wt|U ^R F2bnJ$uAiM@F /<DxI\Uxaq ֒ aZh%cZFSC@q9@ g1cIPð~.16oѼTVcޣ2 $T2#^am*x(F ;\CG폥j&DQGH BBjc`C5QyX t.#=NfTs(e\xn!t3C(n6?[1DcJ6שeRnYRl:uXr*MY^8e\M?dM`I0k5emfb: ".~{ԬYգE iYW=7Op]{bspqf6Frsw|ƦW3 e_m+<{$bYpvYyǵ50kewA[LhAgGjB}io ƀһM݂ޅO,&.wCw98nbӓ9Wd8m+;Ir7kN9甇c7\OSΏa`7"84LC @G$4E6yUu6k|~8,3% P[G."ͺdp\.('qzz+LLX[\USS__nXCjp+wp):,bJ{D-|޼:G6/Qsx*€A.њЙJ7L;f]|˪VhK5G|"=^8>,*oH"&qP1G8>141dEp)&KցL?ϼMNTE$뤣UfI&H4qO(5cۍ[^?YIj&6I|KΛ7;pV7}ORPHaFt!Ra[D+R ӮS[_}d(zoUE甆Y}g]ta7~Q%xԷeѡfä5o/s^. 
[Unrecoverable binary data: gzip-compressed contents of `var/home/core/zuul-output/logs/kubelet.log.gz` from a tar archive. The compressed byte stream is not human-readable and cannot be reconstructed as text; the original log would need to be extracted and decompressed (e.g. `tar -xf archive.tar && gunzip kubelet.log.gz`) to recover its contents.]
C(h$m ͖\~a66Nb&4`"$7A(FY,BAn 5 w AYN,jsŷea^Pi(+SQm$B!A R( v1 xБ{t.<&r0 YW=j$ vuڑffY͖{FLv~f To>N̘Y n*]@a1C`jdpGyuf2}rE((yvK(`q((j9%|Iaw);ж|&ӷE\$#>s촤RPa҉8z#"WӵZ|i錚֕hCC0Dם^eXr`,y6 ~Ԏq<Glsvb)'/W%?vAd`Vp 嵰˳ OJH;ɧcOǏ*Or 'c$}Gܶ\uLJ|)~kW4j?Ѹꚾ.OW\}v31~%-y 2ضdZЬQ?O/ ֦nl+ېz QoׄPز.(NZr go& !-?nr'(Rυ>7^ǐ?ݎjU<(&5!"3Es0| Op ?K393Sk.7=Ua%H%% !P<0S;`^e|^,cw?ہ߄S9$Q̨#$BR|E#9HL1JJa`tHWkY9A,ad(Jh׈Y]r{fRogǩK }c'/Wǟ.1^dVbo<.lt5>&QJM|Zvô7׷:-.O?^ǿ 6җ.Oz:bͿIuPƂ?oqB_=-"LSy.S}?cݭ'V69I~=R&yG?]ze4Ϋa x]X͈w1:ʫOwuޗ|{1)o2K. whvS#^Ҭd-̧;rUm+;+R uy-.Ztg[)_/ŵO~9aJFQ,!‡~ViZ.}2e; ݧ鿧fcOl}P%~Ducq+܎gEto||k s>-;uk՜WggAG2۾Xkvc}f]_axo=+!M] 2yMRsԃ9<!IdENLE|QJy X= ;{P 9_⟦j _#& \@ %Qe[ F-M {,eS֝zR/R>Q@{6Xxo+uqXČ'scm6S!DFT(]h`^>tr-_K"& 1(b@HWLmlIB* p/tr~6w9 Ilamf=wUhKSWҠI:YS{b:u)>Hݘ/ʙ,K,F5sQMZNii:wA$quo~"ggi2NJ =vhtMsٶ@JrVo1c-m/\PZ1A@EKY(CT@I*:h0:dNR ԂK3'"SMiQ=}Rb,,FL)Ewf]|JXboq"- շp'QByY ؐ@ICiAF.hR-kchEJI _X&;[T H)"4&KU"u!ɸ&T~Σjr zrTmB&6w{{+c:[چrcO1!/z.74n׮.vgdZ?gō 5cm?>|q`wEn]l>7y)_gf JB>zn>z8'HO#;-ui=3? =y|̏FnZb|˫C2yN~+ZLci q zK4oҍfc^)]5F^ȃ)rL6CsymV5PGxܤxr:w2MTCZhBR~4pjNiϘxzuUճ㑞^Xߴ)^L7bmɠr|C 7f'-,(Q |X^<|5_559n}=I W)2#Y67'7ީdݡ>A3fGLo3a~x#v{C7::=A| k8s`́1ߎ1YnoXS2Sl4y姊iI﫽ZJw-Tjk< \~Fڶ<r )Ig.A|6A]-;~X9lx|Ľ#ơ[G%lUx%-˖Q^!h{4&1@"hd6'E^UxgrP鑲hNu(zPxa٧_@935NǼ' |/OħdqVWT̠!l:yYƍ,2z_ oyң|3W~Pj^ў*v Tƚ茇":5v& R|2fXժi۲TW&zBHwouQYb:)0Hd!,7K@NҔL*XVUĀCҟ,]^陔I7BW+zP#EKe;@Ȃh@ם}/u㻫Ѥ )X苻jl&(,MLСE芲 kSV6PKux- !'mE 11lȬk)ӥHk`@Hg5@@ ͹:u+$TOڄ?e=۰z{rT=F^{hGbUa{'1ʩ%xv,tFjݙEKN&k"RNfa^K$ 6Hyr U>'Qz")d$ .E4`Z,d6^JiuQ[-eӢU?"XFSeuީm6&:hS$_21G/PxRLqup-7N1!IP u-:/P/)F%~+wHdve0-כ`MKVT9 LFU.MgH]6cRMTtlPNT ײٺog0L8KQ|$msV۔,5Jta#/ZD]rd]6Wm֞air{cdNut!k BR^eDY&:%2m[-g7<2dJd=r5~^̶[Sگ&m3zu{eP/OTj} ;B"MM@1,, $Ua\Ԁ`"#*)HC8wKݼ3>:1dzbKHSC6@NHY,EOIl$&lgQʁLYSxQB1a"ng͖'|)5el ւKuRÍUp$dr Iz^@"|UT,MYۯB A~cgџR K(BXDFJ#Y0xI6kGH~ZVv`'3AeȖ\cBPDeقGPT"N&뤡@5 h"8N*MI2˶$^Sx7R+٧1 ZQ )ɬ)׬U޹Gvwz\=;kF"2k|{e<Φ$Xq2P(e.Ʌ@!M&;{c[w=hF!ro=:AfU8ܽtUͼO3:.3) HbAƂTTXLw{s=njL]QYuF [%fPfއ}H58|9- gF2Cngt6Եx;S2+׵ؠJijHkTW0©ϫ|QLc,Avƻ]hQS@<{]N.uҝP;ɢuH!B&QZ(QT8նV*mLb&lxb+ڤD(!= ]Zg]5[n_K>|+wsdni1 
Ύ* 쨟tDFd=DŽ/ǀ#@nލ.qc^d}R1dL颈IYeTs8 w9F8LyE9FϾu~g#B{/sb3y;9lQA r&4R#Ҙ,UDԅd,AGj p0YiEMt-A6JAdsId)Slcb%ca S*2&l8+R%h娒L WCKl9{u\T»aXGxsR&qC;&7QȆ y'۷9%mpc_ٛ2[2*/K飦RyG(RQJΒbP-8!VU(FHRl:ɤ@6pT2 QA0 EOM SuPJa.Nzr ֒rvK(Mda38nN,ܿ }]unO_\՛xtq<9nqͶ,k\akI]eM`@BQ%"ctW]F ] ?jERB#*'g 6DRs8 rɇ&[btBSͨP 6 R{@[)1mLDGřPSgz8_#b ]$l 8`_b}xL IV߷z8HCDc$E6u.hJVH]"h QDI0I RVtQD A1`&(dKQ hTv]lt2U'`<})_)^(eZ{a(!Iza 2D0D-cD?6NPm)k(ZZm Rpjx[Y@Q! ͝">TKΖsX"nf:'_ggTr\E^.:^x\ "τwPav?m%u[S9ճ[\Wv-wu<A(3TsTJ4|Ir lwuܖ*ڔFcOoλQ;ipt:v 5\6oLuS>RlJ3.#(QN8e%#O]&̊ {ȉ5I΋([Ԭ<&DD>m0 ɭUw? Po'K`жʙHh:kx4No+rQ7aCb t7(ªA3W mQn2ygJ.PBemr[Iq=TmrƜkaMp<.OR^FM_HbBÐI. y. ~2F" P* \ըv|ԖhIڡ̃XIs*vw Dz1Cr@HJV00:%s*i Jk"phPRTk/snm4u*0%OAhô:iCіJ 4J!#2spFY,XT0+Bl8uYېaGJYcN'VJԃJf# QO4ȴsdOBQTSFi`Btud($$6&1y0Yq1׉LH2Qj xt3C(n6b,I*lSfgDZ+ɩ4XEٝC,{YYmbTףAA.&I2O<"CStq|.CUoUZsÄַ#ʊdNcHo#ԫ\٧|c֌"HY$+T$臢-6aՌ[g`4ʬfwa\ufQv f5̋\q һ[aޅyP@ o#w7OkwhA͜'Rgٟ%A NF ]MյVSl~x5N-SiH"較(v/W2NVˍA>?2Sh^U2viUĖ]&r5zh-[|8˄bTMM\B(͇/ܠ菪Zggg@5jⶋE[7Aȥ<&# t4$. څ[W{D 8<,*oH"$qP1G8i>141dEp)&K6I8ϰ2'kVDUDN:Zelvd"L@ A#-R LANĝzV}:/1NՋᱵ|}-5U4ڳ.m鍎5h V M|[yh['ﯶ/j#Yk&sQ,ԣ s*A Z'DԄZark+ɛHQj&=2bvs=j{%M1ЙѓI "G蘦97i'Drw8nC\-Ho V;g@B!uB&Qt&FD&ȠOFu - _'W(tc`IBq :,DH=Yr>[{ts)1QA.!:bKh=$dlQGMi#!@:9D6Or_f޲}ooG.mPJ*J)]4!"Kƕ^ )i#FGyby/CگW&1e TRbK/- =N83"4x j)` s,N>5Q?:X,JhH9pVTfgK10!|,Bw~q imd=ml^[@Hx<4*EK!','2O9k՜〉3VʺV`A#RI;uJTqɌ'H17VTv9u$P|FtQ(E \Q[.JYŜ` j"QtSBjŲ0}|,|8{7>ՋxjE` ? jc/Z_ ӗ),osFJ$M8gp9tAQoH,ɣnXJ& g)$pTkdP!S@rsNt+M^҉03h G|g%I#ńm冸uKqRRƹt[$)}2IPDRBHZcLH{34Oynx6q*@dst8Ƴ{2%fH7~yJrvÇ5`:ch ʟ\x4 ݳ5ƞ*u'+[^ŋl!{.:ulQVدyۿh.h5s &eJ,Lʈ[&!Jgb'wJ '3{[t]qh"W`v#}-5ړyF=KyܭEq4 է..$LunuOv/烾;FX+Z_뚢݀Bmv!f<1EmEzPUDŽux-]rF!:2D!hP^锨G#,Jrk)P(\UY'm|4Nw}Y  :j{$q-<եrʆ)XdT)$W! 
%uz 3PSg"CiOJxSiD)w"b:#q((%qe Lho^Pcp[*PHV']NƑ^iϺrZ^SW9grl|z_uoh2Awtfr8I\mbUf9ܖy{TsfG[fL]kS'jNŷi^E]l>Bթ)ĉY?.)CQ]jytz|فIWcG- ~Ӭ&đ%,kjm?h'R 3O9ɬy헇=y4`VzO聉2xvӊ4_mW``]aK|a͆}K>Dw[ntVTUt1۪fٟN [J!Ef#ِ&BP(/aҘƄ)D@Iol?HO ބ_6pV= | ⣍=n 瞻9KA,$LY|$bAz,1)rjC\%MvK[wĨ|*: 55ym"N@a̢7^kB+[ޫP:{ݖs@aśeLA÷ɞByE799Uwf28ܯ,&tb,Uk ![qʹjUE%x*%QQW肯D5gl'bzZRZSU",&$m-ax?J.dV#_L $=VtƚlG%qG'K:=G0a8mZȃgN~K7%~jEs`yBW+m7g+%֕& s[]S9MȚLZ8QHZߣ6\_W]3,B|* cM- m> X ,$0ЪK\Z19S{T^+'-W0g5{ugn<[Y;y1;_E<#Fe۔O7Duf o(@y]i߮}|?% {}C3FY.ΟQow#7aЗF-t(& ߚ-BZ1R^2yt{A1V&b1eMahiUk'bO:7y뉠Q"@€|t)*C&f*+Oh zCc֙SΆBPP{ldՐGNqMB d%)U;[-ƘeNe/ʐrM,P(~Qj|¾RFBQL@@R̖}յ0"١bhXIxN gJfm큵L,q5`zZN|zU˲-V-%ƿV<͢`DYonVK.ۙ׳rr뿬3h7?g62K ϲٗgДM6jzwo~7HF$pNK'zKlTg 7ޅٜ6i[ _? Kd޳W鰜ʻvP3@?}./ޞ/ _6 _~"3YDҠ(z;ciyB|T%~)>}z˭IHmWvX?ٖ4q >2?9<\U4>@$2|Z{"➱J5xo֞?O o7,OWp$b]=EbKH٤VVt@ q[ P0 G:K:FTwdD1Ig?NEM 7gC{u|N[^g QOps?Nmv1ck9lmS>r-?vc?\*/(hbD I zm(bvKYӱp:UX"r#Q )9**r\WAlJ)FTeSRTɇ J9CƦ@$9[<\g鶜UvʒdMFcA …GT%cԅ3x(c(˒K<1Ҝv*. WohC RAM*TK3W"EŽ90e_ 5IBIح.& Ae_buK=陑7;oSov7&z*7=ON19);,{c\] CHש<\:%ώGW$-&T0LLZy ֚6^j]+R{>&sJjPf QEVgU12Th3x){v]@t P7~*I\{'^t궍W󁞲wI ݋Z"vk2/wqh={y? 
WFL49P+v^YJ(һlɧc;]9Q>vw vNɫ9$3筸fbv!51ֵ淛mnL{;1o{mwnh8OKٓe˥sk_8Y|9S~wmHNR$0boH+ю#{$'b%ۑ=ؔ-=@&fî4Ɛ \,V_B !B/lIʶ4jZ?4=t7FC e䏛w0{7>Tߝ%tI`Xy@yp bBV #܃:+ !rDC60ko|N] EFέul;׽ֺg釋A7{;M3t}+JWs"G?^oݒ 5֒cˎ,ih O֚Z֘4j7t95u;}Ij!a$Lsՙ["פC0o8[p8k{zz۲=;m+ٿb$~aYOӳf!y+뇓{ہQp JAa|TrfP,{2đ›vr {1.icKTIs%lDbTYGD,lrq[7 m Dn[gwX9;[Xۣg8?Q^f0b͘ΛtUeȎE< b00U6Ϊ2*9?6-16(otɍ]ţTz>yy:@]$Q.ř۳W 08?0t2@xomThK9(s%xʱ$.#ɑ[wwQ3w"/lMvdEFI]v6pM|߼2wQ_CM'CMog/VW?_|/у DZ-dؼ?.go4g# ia v5G 6o C AI^ܚwԘ)Z>5;SK.SH6U ~aWm)dcuh J2FZ`JyTiRGEfPcU\@j5dl}Veڽ{ ̨37p2 !t"4pr5U$aQkCIXEBZ.!u5B1oe1hRI &ώۻBcy=VkeRY!jY~dH>X 6N e2-}Zx4/-)mQ2A,,yV%Pb9 !*a6!]S6I3zn7nN}5<+2ZPgFbOɓBt*,[lY6 i6%wcJ!N!a0`7g%0sJvnO%>LCÇȚ⫊Ř#YR/.w,ڡRTjK /q_B)Ya͉UwtzG9n_Wy:|!9eX2\5.DLH6訝[Ͱ̛|.6Y _k  %M5XEC ": ͏U'|quʖNɏ¸ÍC=ޔ oIЯ'}?+ZCEuG2z-֥tD< qP4&kO#(H)" z.U MD(Dnqߨ;G8Fj^<up`M6zJ$X C_X$2aCv>$!N0dAE[xS /n)1J>HQ"Ի pKR)D^qψR#"^\ocDj3mBȀuε(t3DS a5 +"^L]c L\^[[WHMAC1@m8wG_VyYbpCf%s\c\⭻Zl>/6KG0iljۛڤ`*!zD!qq/q`C9=v% /X|Zw5r2|?07&'nNNUehfȲT)i*{Ԩ]'mH1CL6"s BtD^Gf+9J%ÿ+MUz>aUgҮ&=l# o @Z>AU["qӴ:nwgmmnt{[;|{:-?\7/ŀdߞQ)Aԫ^Af.J7Rt+~ k}Ko~~z"t! 
Rkeƛbt !hk5dC46%FV>p@0gZsyt뀠 DOQY1WmWu plbֹ4~x8=~:?YZD֯+w>nM$ \ XY)TVA4Xsah|bHZ:vHQ1}k85%#>uAb 'Щnm8w;:yh5+pkR櫭~vLqaEZwQ3np2e w-#L-# XN-dqm毶$ULF`bksBEqh-QHЭ{YHնR㪵X IPiKM+ (D1`\G AԷ:TSm"a2q r„ZW=Ub) LQQ|nّ^ECc_q]ؕqbVȾZDd-Z)"|oMVQ5KB .#"-ah=L)+1)ZT5[eŕ@wnRY>aiVFg9$gI6ɡKVc1( R ig=1cbR)B̳F;Lut}8F?]qyD9t-9p:pkщP’`+I8Y|@H9RG|A(,@ Ʀ u2C;~iav̳7Mܽ 't~:uaZ47=1lHnӐ=8@Wc3^=۹yv_eli<-򀺖lLF.S/R1r!{y#{# RN݃li+䮛^F?dk-'-와{y}ޞ%wu\ 1|1bz<5#!Ŕ-xE:&ݽv\ۄ@P'~[_a1)wy ]rD+sCn<<8ٻ6%W.~`91ΉA̘&r,!98JN,3;VUWuW}e1Ul9Z q׊Ei1bPDi.l 'ho&7iUce9Z|"ŧ׺]vx ݽju07N1'$77{Pqg Agu^QR%SO¬R\A(|Hc'0 Nc&Ab,6ʌ ;6`y KIw`Pyz~ުg@dۼW-}@km]-',5Z&\/*hw?$NbTRt߅eB)$*ğC8gdea[ J3#!=qGB 0D=6.(W4=NNmu4lY7<(6 ,(Z ð.:YrS̕T}8_g8z0Ẋtȳ>6IfubjWB'`E2jjoXJ㚔|| *b3.\=pmhJ &"ɏIy)pV7@I*Zho#_I> nXs?#yZB {ϱ>o}NO ӝdVdm-S>lH:FK>vV 0Eb.SwaŎn/ S\.w0aӠ 8rSIw 5쮿z>& EX|Zy@$)Ԕr,QNڛ&}\]$hCj_] ɇXWNZw]1PZtJ%A-#:b@;o~=WP|DJLo ]"jr޴ 'xag8^cPI l]^``?#Q5|$Ht`̜>jmVO[_nE=/b+蒌Xx (JTGܐc@ {L-AqVWK߳{:d& g1/i=tu~5ߘ?͍9ƈ %H# QL B1+YAEW3ud}T]1U`)Y}t'sDiۖKtQn0%zv~1^+k)Lk"QKiJĨ6xΨނU|/長JgNgNA?ԉq.;Ѥl< C4EX"0#0[h?`x sZD)Z0la%^mf̳A9ڤrHM9DŽRn&x9 SGEdQ0A[R.k%%RHDcF(9d2TI7uvn1w3xH9Pggܟ>v!+O4R8 L >p t;10 .f`rH8M^U4+^tPӀB1@BqXt P.mZqe֏Z+>`co٦oA]w쬅 `BK 1dN^Ek ֥ii%J:M? &mrOҵz WM?h*.ۅګrY#&1d8}0{} |a\9 xM/Lǹ>[B?`CZwi ""R8:}%ZinOK/X/0jᕓX5 [;DV^ǑpYG ڗK7]Ns{n#']r{0o:І@okKnׁ78C05_,LeQ/.%Vze|^ /By |8j U^41O&c'b9ɚf ,7 ӡ¢Mm(y?bo]gcވw'rTQĶ;Q0s7r:TǤnc%iwi$헺3p}R&9r$]IoeJHJqw*&)ܕILrhv]yAytdZ){ooɡ x:eH`Up;oB!g q)n1gz4‚Sgg#Z)Njߞ}ui=RPKyUV(=Hwsaҫ:ByEҍxTxSt)J(Jԩ_Ō><] iAxTJ:Wh -( b.p9ә$qgIE'"- X5W. 
SbHax%#I3W AHRVa S띱Vc&yeĠQbIj3Lo \˝QCRNjm  6Yʳ4O_arZvyߡBF:K:3잪]{Փ颬6)uή's(tIS]L:^{ .t9TJsVbiW\};4ki@Q]B[84.}K슨s]Bx%vZƍ: В=L_u['?[ocɠF%m䌟ۂ[ד{Ƞtнْ|FCrUpv%O?S~&nksZ=GiȲ7oΒod =!"}2}"$)Jnpb΂wDðqk3l2|v29aSP]gmb-(EGn`Ǡ_[3 sv5e;_w-331e9.2S<2S2OܣS IwݽvVSRVv.;nۇK<]_;')q%aܛ%N\E:h_:{Pﻥ{g8H0 ,}Ϭ Ku2Le:etYUe|(c^I͒ǑӯجS'NuUr:%-joNǼ=p6û Lp84*^I EP𐣠yZD `${ ?ytsLp*AI= mBЪz8s2-!ZvwPbզUOol ݰR`}43P:w)VIȜw ig㸞[v1F: lp6HF_jRz\TkʛXe3Fg?-d:Uߪh[sUfMI4-ȚVִ5ieM+kZYʚ)4-9Lֺr晜y&gəgr晜y&gyDy&c39L<339L<339L<33fo@P4]浔MF9`j[>ڠ7nd|БA3zA =9g0 A3dH A3TPA 13cf "g Aг =g AгAгA3zA =g Aпw4z3Rc̏Sg򷗄}&3V |MYEO_xU|(m"o^'?o"LLfGuv}f.8?ފv;|\/\E3(,2٤F8ͺ;(7Wz'_=A *c)h)%!ơuѻ^eմ^:;?Ljqp`U}U KAJKC` Sɽ@ii Țw?+Z[^¬JCCެZ'̌b2k R(A ihkdutrfj--+X{ްn4otە`;FAנ6>j-Jp W JmӤkeW= ,h@y $^‡I8 +[O),f/n/ML1VCJ W1ILPdT`'x`ᰡ ^)UGvMw}6z; 9l`p}2UJ?,dV >|[vA!~36/ ˱A25A {\XcV"08J6 ZO$EY m ƜP$hd?Q20">{*5e%d7%96ܶR;o 2r'2Ԃ&ŃdL@(j < E͝Cfe :fNhj[f9RgGtaEc fyW: { M%a뱄 td\;d2)4:%km-HIm Z~KE@H4A]nv'rX`#gBcs(V (RP\Sх} ֪Oy4fh4@;ٻ6,+YF~hլ/2ofAAYL̿Ϲ Q7ME}S.ݺzTRAcc 6o . U֢͠gB%t5 ѨjEK *NZCM-[ S"Y"j$MXCgPX)Q-VR,:f5d ]SWV+= FtT5(cvFNU"eū@4] 6yCE)qG3@Ql~cPZa]}q$o\V6NM$ꊫy<54^J&bj6jk(յ`Q;:0:TC<uΌRr=te!`h`Jss6`6jU|jEBPcޭJ (h-X|.bNqf넾XU#] 2U;tҽv"s&'t6$p Є·v%?KC V\Do{n@B]zU7 ޕ:Նq2M!l'@C=3X aвPFfEHp: 7s"]1Sa*xk&E S ˄Q)wK4(vCHyI!1μKXz Nd@:k@S+Dv2Pƌ[ ԕu*ygd)(`_A!O]! 8FV[Y=u"0 bYT^f5! V!GBY33X)kMȁ 0M6LjXz))@<3#i@W|rMUKT;fe ' njE=*bhmD=!/M@<-b ]^vc or*64Bգ򣷃v `ĤmFZ$\|,x/=/@s]ɒlv5Q mbШ1̀98a0z>|BWEZ0$*db5kUKոޑ0hԖ?.aNRWTIQ [0"1w@:w6^L9 T*R]5 \W#̨v*R"i e@0FOR:j(>`4@ ]zxJ) cЖ}bu@88)S`ȷ<6KF\“<v.4mϗ~{{L=C L›EPK<Ɩ,0zQCB? 
>(cWw,k!Z oT@΋%a[7~B(/ VzjŒNTN?#_wj>[< 3*|k-Ne`WVN1u8r߉c^|*M;.VN:ZYKBmwx䓹~uJF|,0S1f<1-1Ňnrĝr> zN fӎ.w~KIkgz޾%⁓H?Ml7/ uzrUHl J]QV-w= DBӦ{&'"SZ[T}g^Rg@5| jm9SS=o૭ҵi%6RoEQZg3v^MiNOQ"RTYF)M(oiO9FJQ\U ks -+TIwZBR k5j$I=mlg9s6!LbQFY4Hx)3 &E84rGg"8~S=ŚXJ ɚOo(9͐[  ";RNIm/ls =⛖"_\|P e=GIvT˻v/5h?ɳ{t ӅG^Mk_GpΣJ0_m;anLPNM~~N~w*/[_ߛ`|wcXX}d{Vѝ~ITI\TQ"qE\-jWZ"qE\-jWZ"qE\-jWZ"qE\-jWZ"qE\-jWZ"qE\-+vZmx lN"&B*.]g`@%+S6 ?Al}e퇭̟يez*)>kҊpDDI&W, :p4ݨ@6yrZPhKmx]pg޶b#8HX';-ǏE^>_x Ὗ>H0_ΰ=dQgؾD_9Jv{>Lv7ԹH2Kן=d=U}'z|z{AL{Rphu گ64Q,=/Sy"6El.bs\"6El.bs\"6El.bs\"6El.bs\"6El.bs\"6El.bs\yQBpz6J DO$oD Ovʦ8$_]W-_lٯv{`QAK^ўJ[ZӚt+v/.)nN.ӱ8*vur4z4.^[ f43iQw͔*?9qu]ƌ=2]؃-3nRv|/7?[#99}P"FF=zvmWij3Px3ZR7ꏅ.ú(̯+vydhnHGu?4?^`M~xYO<Ҭgs}V­˷n厜ޑG޼'1ަv_~x]G3VQvJ29j;ghb1̏8co8-;<2Nˌc{a ¯ w~!_lciW-]_lMN_;6U5aZb);ԠəP {-l)YOhLv8,ҌkmE[v*a]SL=gxǮvbJמvO:tՌwltұo7`(u3o@tk}4?Ojz] M~Nr6vnr }w}a l8i-( *6/e8a8%pGUFz{ٻnM5>M)ۘ_?y~wѾW'7K|*K$GƟ&ry[?f58+gg?jl=4ޕ,B)# 1c*"e 俧j8$E7mX왮Z@x|?8e/ pٰ0͊]B52͊'Jw2lViF fpZ3s`szc C,8pÔ^^-R~F G@ !A).R'T[,[('9Ju ͣ`h(,XYrl2.)_?Zid׏ |z߂2I*ue=.('rs67G?v3NyI̯-~SL Ӑ4vLXLQMھ\8Y.7~{?'2eNsٿ@%6[$'vDe*}|d 4c﫷o@7Jl!53?k`7Ve=/螀>&3[ `1cMB*B0]avˈZKEM֩]D_hQdxK8P]QQ@1G.|fib$$,1[B}' ݴD gu(<;ҝ3U:h@5`-{bꕟ/r:O7nOߞ 4'aBsͼF& &>NH_?9#Kc]PP Ћs*cUpZjzVp!75Fs=W^4`RFOr&&&?6F4M^ȹN;Rt + ř'p߲ R`A)2H=ͻNfqY1Q8 hC aFapLKW(<5hwƥ,`q |\igYg=BU e TGLp9-O9llQG Mi,CHXCA C\'qǽ mRLu]1gSY8y|Ʒn64RMK`K Xg0']G$56<:8s%'.4@wf#CCp؁Q)ya!V(u() ) om+JN洄]Ϊ >s9x[:%{tq;w¢7dڧ+: : }2t)[_m^߿(e)8(9x\(a~KV >8GlHFe,8cxwCS1em61oiZ pUUl,ʱί]+~<7|՗HfBUjAU:ȼd ʼEw~f~7Qkiu*-`b׬-7o!|X #x=86R.)xó&ƀ!$:)T D)5OW_W&FkDNq3UW)GV<aS`+No-Gbݧqe~ɾ Cl0{w6B'$,[?UYs,(9K!d >O,g)J1_},ą3`R1 0'GfLŝ&b[qP}w4殸_p%B+;lֲaWTB%~YhҸ<4n`( XQK'U 3`R G|g VkReneI'+t~НKIn]fY !#zj1!35|14ᙽ;_WZl0z[tONhzz)C$: 7'3aT][?ۛ߇aG$2}[MB{_B(|n;n3#iŢaG\(_(j;|BQ*I?>2jd87zR0:e5Su>kQO !;6lȮy[Y ?sb?ّIË\Gup1hLӉt %DL)e@p+Dp}R𠢷X/VI)Cű=~̮TmՄ ߻߻߻߻+h #Tw{Ww{Ww{Ww{Ww{Ww{ JOWG21lҞ5&ʺK/U%zp4sgUc] kp9&DfJ˚,G5BZzG@'+eYw…/Aatپ(5VVuP̜ ih(:{y<^]|PwPIhȍ6^zd&yLE]?zKXp\+ L I! 
p&McΙJy4$hcZe[ `VwynӸ'=k t,7kׯ*;@Ww!ǬTq͘ ^Jt_!Ya2FHChJXbZ~J+*Ӟ$BV71p罥B$2LQRJc+YŖsIJvs8{bao<ӷa-MQny2q76-g;' jvs-|ԣi;_t~E\dR4yrvO> ,BF ?nfwOs83rA~0͜Zv}3KWF=nyks{&ld?Yr;Z.Isq(]?o:朡LA-x0[/dKmcpM&=c_EZYO .7B LR'+l2Ӄ2pp(d-ڽ"f?T{P;J-]*4l \BڶCz;\|T)wMRn^%qq1#N2"b Oԅl\}q Y]86w }qA'^M~0SmK? mm{#=>;iAt K߆^P [BhP{`\XHzW}(N@H_Ɏkz XE$ lyFA0hO8 Yg-_}&6DZKP1 t '@Dl` ӂ|BT:PI)@!$Dwpj1@QxY.!dc4U˜!t+SA2he* LjZa:vR{݇x%ɢ)sQLvLX`8tف DEUpWuxW&:%K-!dC όk2LZwVS&&)1@IJ5r[9y~I+F"2*|ej0< 4B͖a%S2*RxBM"H+{c%[l(F!r.Wt>OP7 Cv2LWw\v@N 52&paΙٻ6r$W|U| 0n ̇ne*:Ȳ#y+Jl+xh[N:HlERTXm ifHP AšzOUgZhs}|9:Z:2iʤ%g#MV%CU?MW:B'TFӟƊfڄe*i:tLT'0л4DJl3He5RY)U5R6Ds` j]JUPj2$m 3}T &&;+jW"bTY3I0Z{?6۳b+[ :-F7X U|co٪1<2< t~ >FDLPꁙRPO.یrçwS:TPGaH*@u3ɬ:?Z0:uk,Qg6"PVCVb3yg* %, k2iUq$#j'&{`Zt ч=I} \=B 6)ju\%*&CvK[kŨ|4*:55"N˜EI !H-eU(pPLr0o7X*yxJJ}[έJ8#1z:gE*^fYX15`ZqʹeUE%E]WF,J9j]欂MUFLO۰jiIkm vt$㹷mmݻ oĞ5dcWtW' i:=8=\~[+ָ*l-EM3D (AtTvڨ;/u({ f60;$o"* #]-\URGn.bI}vlhG{;M%d%QuZxJ$Ѻ.R)A2*UW{XIAQ"dA%.^l *ΰ ʼnZ7R&ATHP>(m8ePjƽrQljEYYϴ>ct梬ҞJ  l0$mhУltl[d% ;v zH^c1`gm8[ďoY:]\slS.vQvq7f| _n(MBAqD.,! @|ŽY%=4`n!Za-}z23.u\Q,P7V?>QW?.]Å`^XM˦!ŜdCLG%2~w|zG߃;^Wof^nb~۽kjph)d:%V XF fV%L.ZvGxvBFY qx3l2I$O'dz?략sdjī >lwٺ""m?4_l^2{ӣs\xmuܻt.8x77>6=Nf'r7c0b9Q^kbwԻEOE.O1R맏eƙ]ztޣj^  b1eUZe6 ޻M~6'ҭIN0j] bf Z@PML ;wp5h2X(G[FP|4PMt&?TRNFRUs>pgSNǟ>^'::;|%՚ghq'3eQ( !A%lS٫hi.sŅ!>AI#sq ꚞ-5y͒~)%}u3gIxSuw1v}+؜k`V1jNa@y=dtq*%0߃Fqz8G4Y ;"HH gK$]J#\6T }\.`\kEʃCAñX2eGllDxRTԻPpXn>y.ՎiLWR%[m"1̢FFm8匍Z5xg3+_uyclm9iNxox,1a mk-C՜D؁mVVixF$NGew`PpEj]x(ZKJJ[/1r9뤵HƧَ$<+?k=BI&`' )UC& I61Te_M-%W5hb62u򓊥G k;#( B&]U|})Zd<҂!NVQK'>~3e8E˹O8<$EڤlkON7;Lv38\f(VA?NҌ&lrQB4e3[o,_uq^o'˼WI7e~b9m[sN$2&U3]ׯo ZKӺZ_5x)zp:牝6׍ `@m[6,AK2۞҉GcJ3۶a[`P>j^#06uwP5J.(7r$Q $Ii`g&)PE1ƫ%UQ<“{XjWDڑu]-j ea[YM$n H`OɞZ1%Wd5ORsnt[Eru\1s=̊bX7@sorf(i@AcBǵv"v~ϱ|k8j'%%W}A:*]!"b́wolElzi{7t\؟ UXB~ N#Q2^skt򍳄c߾ bsDR`F%)*Ss9T+b\CoҒnZWSYRuMl >ě/\H[pTkHUGFS0`x2j?O4&꫇ mTEP %<̕H'Lu}K}#鸋a̘{Ĝ+~c!/b=8у|u)-CC=R^kr{L ;esn4 $5XcZ T0h.EALFyF\)"Tv5,WI . 
3"+ghU1,)ڬ@CM_k8wheUT9vĥ ~v뱮y5Ecnwǣ~cw¢/nِ68AGAh̓+wE:q!Ixy8GhR&ֆ$?ڷZ!ր_ \El^ul/z*]960'=6Yw뾤i{SV ./:UUKe yTD`è]w޾ݕf}7ˠaG˧<^\AWkEb_v`Mʕ㫍[ ˚[ yl;G'A|y˛=|d ]e9wjLW?^Xr#[fsp p1iljӢ9v1&0Ī:VX9iAE65Z-t=>>6Ǟ~}Mrex}L"7WEYȔ@ Z?'' +y]s`ޟ}}: (*ʤD+g_.PmKUL94rN 2̊Vj[Tj$: _>=`em_ۘJ]]e&e=|˓͓WgSZ̓q4gS9Z+*9r2 ga߮_/w㏳UmP]w}% ]\y6}È~:F\|]ye 璣QӠLJOk,S>GY[k!،X 8;wulk^Tqƚy{#PחW[zX+DI6.` % B4CkPH,:]XFJ#ꃵof*'=Qb[ɵ>~llLJ+ޭehw?B'zVi߿TZfNL*Ŷ`,-!Ҳ0zn,ٓRG8n/Jqϊ+\ UTkm#WEtž`_6XcD<-.H(Iz0h4U.*&@Dam,5*1Q + pRI(YS>yD49@kYbpQB,)֨Kuf1ߟ[gMq V{vܾϮQ=W}bO笮6{b.<|ۻU?g#/E]6Pw+:>u_^2\*;̣;ݭGbybݘnO#;mu3? >=dx{#C9>BτC+!NO ˯OT7D;Ϩв-1z"@PBI( *`H"PL2eC䤹4wt5-igP֠KbybShkJ^1FR5,iHm~%*'A 21k&Azrʲ&R"XϚ-gz1ٙIT#Z3\^AD{%Յ2o)f2x%PSL=@u|UԭUäMUϟE7@=Xs.;! O).#R{E%АY( &Վj|&;iO DS( z/}RV d{yd -l,h(ygMcTad⬰T,2,u7wQ xV r1J?5#3gIr581GpO]=5s6Vm#Fdž59 FmB Fj`>+d5"RNEԣzخOv_ kmxH٥>ybw\z0('5e0 61 6QJp/oM^ub@cQ>S[kuH 䔺"IvZ 3ޣ c,"fhO[uΑV"C7wFY-o<ɘs[ȥ)F=@:"E%rZB@+<*:$$pH2&RCzz@]˷_h{Ra='՝h-f+Ww H=%͞6?w_^8FBξNg?G֘};_؎yU`ҠR dP!$[CfX(dnDw>Ll 6V`!UzvQ)F9(w[TrjkEZ>PB&S'(3d҉Iel%9scEm$U2N7x8frrnN׸!%c>r c] 2+ '3B[>P?JEh~ ؋(JIڣud5:턯}>`K":Z-4bcAfq.jƨ&ԞQ'%bW[C!hcS,m8os(ɈZh/"4[ A9$$edĄ̨Df\l(l974+0[ǙqBa sI{:ea8! ɐ'Y !@ӶD֠l B]BǺUq&a^% NQ-\NE:˔ucDlӈNm1:Iɹq&\pwXCQYOUҽ€L`2*kRd)`=%taŋh~ZIǹx(ᙽ4/a'>Y{䠚;?Ca} )=ˀho%]\,smrBEMu3.;MU,Rc<)Bvg+d\6B%V[$CϬ@  sfm#U図y^3O\~) O퉉h1kլ>v7޺u UjF?_Vz&"5ңO^Y|vn ڋ{{W=!>[yk(M9 zR >HhzTՎU6j*Hٕ}s > ,~jn}%0VQQI9| 2Z@R2(sbGZ?lyJq91#mxX U!kFJHZB`l9ssvh:bm9s{ky6c|MJ90]2 "Vl&M҆stmO~JYԩ:`Ŷ̓pStsFNgW6֊8x@2Uk2bqߪ@UK"mGϔu{Im6h51cFEBFDL&|\I_kk,|k"u. 
BrEf01>o_ދ׾Y n%3϶Uq WwY@嫸\ҧ4ZxOtEmmH \r`6P0?{WFcecIl1g.o,SjR'͔4(n2_./b+*ۗktϦu8߻,d6΅M|.GڿoS>c<͚`1j)Mˌ?&yk=9s="z w,Msdžm|$c+z'pwK_n&[fWb8(ͭ}/49ئv7T0g q`IwھGHɺ0*-$J>G_"Yh҅`ONJץ5pjp꨸;%z<6n%_fP9c~ql~uSܩ5d]+"Ĺkȝk˱O\o;o=h^bXe 8.CBBH>䋤)H[^/2;_[SNP')I Oi(%*'kkѵ%*=* Oo_1{W\V(!텰6y>JB߻kp6o*mMx5Ǭ<&嚋4D *(F8w>VVs(˴֭;|\@i *Itt,#FsTV⊣:1+ÿ.M& c pg80w AN2o0 cn{Mk !vE!Ud7i46ئO %M";ENgjJGRڣeYpJ*͢X!sŸMWPs6 / ҘYdcηWӵY}udߧy6,li_4k޹92::rѓaਢMu8QI8X)` G~5pdJ*U* &]+4UL^{ԑҵF`[s^"W)*"!WAhiƁʚ" !I֊bMm9XO]VmD_'sذ6΋rP_ ژf>m^@>U)y.J_fY ,5RfW]QjXEy-.94˻; ;X|6H1x- }l\ȿ1V3F?~2;ͮKy&h ^ӳ)f_qH%is,n#jޤFԼIQQQZm&Xk .˜Z!KLut*b-hA:CHƓӈ' ʢTFE,|$:n9>gb p,„#i0\۞*Ⱦ =e2ӻ1Uq_yz`NѸjN՟ZtړhƠ yV5!0]JoRM`)6)mQƪ!L>UeV5J2HJY@붜g=?⮺կWv{|}An= y6c_WtTh6;z.u:|Mg"`ɥ(ҐbR@3J؛T)jOo"wr 61khLj%AZ&+e{QG9F  KU?w'ErW6:ww͆w>ɻG׶od[;!ݻ2[3{׫_|Dnk蹠σmyv>OwxlO]׌$Ց/Nߘ.Ty>0hhR?1[W nBH:ƨZAi r`!Y%*3ؙR Y,RWȵһgqѫ$r9~Iřv9\<)qOO^ ىr\OIaU((AY R&&'@GLٖr VKrS15S15S151'ej8Ν35S15SqsAJp͉nPL -7kg{25S15S15S15S15S15{'{15˘A>r8pL pL pL pL pL w. alqro&uEf8E O\Ak!ZjĒ{3}]QAZz#Pَr~{(@]ga-_e v J>Z/7[FZBZaΆZPd&`t@]+xP%Y!TFFbgPKz疮kJ^ fmrY%\1ɪ9Kӈ!i;VR&e; ~_?߷lׇ'm3Ի >W]aBWr dLmrB !#!1D;c%QK&iEXsCvaJ)ܹA5j */|2ZHJ) PBQJp( Qʈ)W&YT6ݖsd&ѥSO ;rOnD~29ڎO1!-E<ޗ;-`{-TfL܅q>gыܻ}TӇ؀۬=-~26tF4z~mzzفIcvݦOs>!h䰔x_cɛ O7Fvp||̓t\-'?0 Ycw`-_ =`G eSc^/A0AeTwdCͶFKKu9ih%R;z Hݑ@JYUDYw(ShCg=CCSgw/pUs";F)r)²퍛q(H9YQOJ(5K=6GWDE52eƟ%IG1 8_ޢL,W[/۳wAYSt:Ÿgӕ%i Z<«訢#דјr| pe)vw˫)~Zv 81<_nR. ?x! 
_ 2 VH`R@1תPX/9z jQRR3 A~8۞ؼװz{yN ?N,4Q8FJ FrxD*{Ј `֪Rp5G$00_~l5˫Ɠn2b[/5/ʨ??ofwąU=̵y뙣uuCNi|ȉ7CNjA;ײ]wԎ~'\RTEc.FpEhRFZMjE+.lsb`1Y_!UA)*I@h@?7s͆^[FX[Z&l̼Uz4_yW/_{: K^$ڦs"_U JP6ꐂ >gK!5XP7 f=PT;'Z@JЩk3ƇS!JU+Y 25 Q`Nie$Uzh͸Z1|uڒ |;R켵jgݖs1ىuhAmEBKGD=VReʴkQfB""HWEۚHed_ߊ/ vO?gI5EZ\]bk/A0U8YLնx &9ZV~c?;T5-!DRB8U0IZ0X E;Űb8F&'f|#<5.#!PzfXjo}@be ZU)&!jw ~5ǻi0}:]e$Zɪ M.Slbg(69LT>ӌ7dw=BqFTU E30dBԠ`Aq!KuRpp#̒̍b?3iwl .qvTeBC N&h & p6b |:ȒG3bGlKQڶHl&Uů&∊蚈lQIl{T+g2=ZaE>}(pɠ0*C]§.%)9ji|hV$uܫ-ãP 餰]Ēk=h9$gUI8Iy+5 W/ivcL t;)^)#>%&&"(.!_T'K l1 kTdyg ()*eYV"{j8֜INfUf K?b?dB&nB&L~}^ۏo |0;c|-B5UޞfAŢd<*} xey:Q`ވ+%R`E]I x0MOݰƥH>',]46RC[2UIK/lŖ7er߳+fa_pt{r<|[lvd[c VVBA&fTD%RD-u0cԐxh%.ZNNQr};T*eM\Dt8RPntb}6 lY`x10.6)sk266"؇CHI"\+:ώ=dFc;5)hUV-%(>CbD%";7 ֜wׅxa<8|i'amϤr.hdceQ!]4`*$uIFdoi 6 cݪv&Qgme+WځwRG"֜u`8_SRs0).ʁlve G(VH` ue#ͤš"G3tkpO>G0a[x\`S͝_4>q$LُVTKupXʃthӹ;?Ad-!FMґ$ ,A,tIR@Q\Ui Z9m΅g%@(V PYʵ~$,"hx`q,؝Ϧ.hN_zEZduw^ʢP&O >D4D!ZP&! 5A+T (C})ٗ}0I>~)6o>JBKm6k8/Ӫ؆Ii:=~o?叵_LgOU{y;_%gd;. >N|MGq3}}m_ٛ׉%l`#Y+cpK&A^2|E-Hx ?b 0@nY>z_2z1C-|NAѳ06 d5_#C{|9#g1zOǫ}[s|~ r=5rE~U PH;P{qoL썉17&ޘ`cboۘ؛5&ޘ{cboL썉17?17&ޘ{Ӷޘ{cboL썉1]M (ҍjL썉17&ޘHA67&ޘ{cboL썉A Cr17٘{cboL썉17&ޘ{c'jL썉17&ޘ{cboLj7&ޘ{17&bcboL썉17&ޘ_L#Eҍ;6&ޘ{cboL썉1I&6?b5Ygִ'b:Ztr630Pv8낤M$tRrSeقDT{4yR[Ć} }MYo(.*ݳ!*d0i1[pE)Ұ%@f+gGǿS4)xPUj76NW o$ܟ? 
H4ZQU;Oݨ.꾛+zY\1\2z7fh{eg/eo >~'U4?1fωXVⴐ;n?F3Rc[PRH\8 !<|sqSlG1{vLch ZO :b bt}=ۯU)d?iߍ~Y4eV*>羷].o<(2 gi?w AGoD~8:Su%C9Y*P}5{暴ۣr?gL)aa\s!0m2vP뜉?w&[)V'Qq|ĩd%]XI2$RAW0 & Dtڅ\r.HK7Pb j ]J ح}hL@-|~S{EgiW^> Θ5yntm }6Mv ]]U׫JܙT熏\h@(Z4JUGeQy%@ .JEQ$+9#o5kh IP3iůb cudRah¡C[XE>:>|[uzγpcE]_j3#$գG?~ \2tPR(ǯ2ыG R*KS4 13.`Dl&[s TPF Yml*K>dquUqrϞՇ?U_EսOYДēvuUb%jA#2 :- bSAw޾ef՛iSwm ;87P /|o$͙yXiUt;nL xl4!;CI#ONy8H!9uÔJr ^-FcJ8o"-dl(jECMtcdYB-]!]n%x[ pd-{>;j;=5B7c1uu >ĿgO˔cjdt@T2y>DW,tN)-xV݃5g߉ܗ7w\$w.û}%~E,ɳ4VhxEUy<.yļLxv ٥^.:Qw`\45oE^/3O^8H6 =8iLd7AQ@# SmTNV @>'Q 5k6Q($DEr (Ckvu R}Zvwsv*<_,Q3= 6LٗsX%<$Ӎ~2 ^L^W5Ҧy VЗBteԟLXd,kNHůꌓ:SyQHI?/`w%͍0|n,m"^0Û> DKJ#JDqɦ@]aw@嗨\* 9l]rsuz$|b¨_ fץVՠWUk'YoiqY?]lZXnfk;'Q緿%lSZB6cTJH*WiPݕn}h&HWif׈wbVb'N,!XrR5;WΞwgr!;#bW]p佊h^i[l `˦xeGJI tp_a|Bmj>꣺ctT潗7_Vkۥoq9-z\/yYfOHpVf \{j!џiǻVi6P55"j5GC[+_،<)dd-KSZ(!{8.5(]HB3\BedE{LD'@ǂ d3}n!/#K#iӫD dQ(Kv:vcw0lg67(r2k}$HۻNa?VL{ۡ=f&{E{s9fcK)CVvFWv a>ZύĴމi9gw)Y1XsO\+{I&xJڤ ˼ ck[LӥHEgAHkt'Նk=f I91p=^wz`zXR#,Dއi% XQ?[͉c!ہѶ.. X! @Q;ʡ i`|">' 2fam,#Z8$EJƥQ$(/~~N!OыܻUJ}ۚ:V":W?T7nƣ;:[?=?1;旸iȎGvki=~}O^Q=#f_j 6W;HgFK[LxR|nίi+zlrc2/|*%8`v=7վx՟mޖ9I1KkK'OgԙyUof0lD-.?d{g(R2%`*(e!)h hvIY  a@yI7=T z4ÊpaF?m_H|Uge3n?zٗMY}e_Qmr&@9$mm'(CB"`0.,康CЦH0 gA*O3x&` hC6}IWAl q>RXJ%MjZMçzuO\2OG _} DTևWTO._6-^>,`a QJx/D GrBzjk]d$؇\PLd- \^} BτSj;vI{Ԯ9cw= 0f[ :jQEЁԺrCs؏0Sכ7hE+FTō0 z!q9K!c<邍A0 HRBBJk)3JitQ!&ὟRΑ0t}YNGN똳& )lsp Z)"+".&ݲ^@S~仾 L!W^M6; Z]r碃t](.v%DR><6cx}#+С6#N\A=dj$'2eJbֆbL/j1 ))ʬhu!2T:[L)M`ޛw@R6'ZASĮ.>:-ٻ)6͖sV  ܢIY&ۗ+ d*Wog[ds5_z_txM%s!jf;4xҳY՘)Yؕ)9XTct@ 3[Im}%jAHdj!*(O^[Vb<_a9o-1zl9GYWC?#5 tlS,GؤKkOYDWE5M2^N*TQ G;?CJ{'m@ cZy/KmE'AaH8 i= EDQt;O*S")TehWD"3drl:C &+!2f&1f2qRX*ciU0>S+RaLZkG 6!r³$B〘DvwC]}5S6VM#ԃ2 kR&,$C Fd>+JI?Jf9G;ؒm{1hG3 9U;\cp?!ݲݫxHf>ybt0('5e0ScV+l6%_/ ~ėiyn|7$no-%BvF&Į(Rh t{4]NOBom$rS(+YES.Or0?M]ZPI^$"Q@!d+|9A*IA2VX%޳ zIyiV-5[W-=}9_ pU+hc9Xx:{YcvOh~̯|a3sqK7C| b[ڡ]I/ў5eJ'J>HA7TynHu\\ɧb&d8cjIa n[)Ԏ(Ԏ"`r((7 dk(j-'^|\Xն6F*ML:D]q!@ D6I$Q{?Y'til9O G=4{Z}<+l{:T8%Aj|N>&Nӂ0;W^&yz⼯z=ro:TG]Z.p |'jlSΓr&H_kDU%TH;'V.(E 7[NG) ^*2ԘSD 
*"%X0A[Cf9^SS#9ORx)ޡ } B=EdٲT&lkE j`SLJ6{dtNXQ-`/7Un2E6frrjN׸!%ۛoЎU@e/?g .7=PRѩ)*Q E(&Wl} x8v:Q@0waoD&e"MzWr.›F碥b [޸\X6; CWa}ApS;E?h؛6_.̉V6)v(leS CgHQ͡3G7;)ю;7ь(ь"1!`!"eTɸ|Ci4f뵳>߱`WO*>z$1KB ZɌ4"eKkj9 xbU߯7m~eR8ᤸ/s6)nݎoҳa}šKKy(6%_5Ono燩ohd[hI)MRI[mW9_mdm"j4{}ߛ#nUYkVZuna/ї9ƌc$S^ۂK&W>~hpW۾Y2~sEmBH8mU|\Lp%gW)g~2Oaps6ҙ YM Ә#4P'%LQfW VM7anʦtܼ:O$~&leϋ\^Mڻ۷;Ef::BLh t0jsrYKNYB{1@e75@5icܿH> O3,}p)"0m!5):<,Ѿ%m'4L~ĊdK*ӎw +.3dɑg!ePiAdz>Ґt S]r+8$l32$@O\≎-`sH SNܩ'Uխ3x%UT}h6\߿A7tcnu87e퍎έ A&J$@)$ +K;$rw|w煽jC{Q٨s2AvƝR)T8V@ ;_izAq:u ,g d A-䠔"0$O zT31dy'1cdγR"QGB)[;nzwg#ў' KҏQ&FkiR3DL*%8hf ށǺYY M xMsھ>'6[V gT *0#fp%,oƂƈuJ鿆LH xz*wk3w.Lu=cr>Z 0psbEhi=70VZ:r% ;= ūgBC$dfpIk-K,D&ds&mםo~uN)was' 7pv?1z<ё~Yd?eV8}o+[z8VtLHkh΢o}<8¬$3&HÙ/t:Up"|jp$ r\g%F"k1'#%#zaЙlC2{%SFUWJbf'S wдUZ  \+L4LǮ w6txtZZhzY|6F3ذRp"{?=5#$eY'T>9qFګX,'6EDB*% ,]װwɠYg|lpjǞ=~ɾmcjJ:|R+4JVwB0^J1Kt>GM<NۈЭ&:sK}΍s{:=xG*DfZ!t4Ӧʑu][q+hV sPaM1/nWTy,K=~\QjZФ7j0ԗXj32)g >eW5"Sߗ'/K8 <8cX\K s2jb !{QVRjX: 7F)x3!ɼ/#:iB}9VK}uyU2Tnv]v"(hQ&نLVEkQ#9ѹh*-3^jh\IKzٻNvxoYߣJrUàI;;uߦ=&߮dmK=l Bv}[})%}]qr3یP cdN#I ;˔9`JF#2u!5 {l_w&0+0hCbZ=3&ɺi;NˬRFygSnJ) 5ӕ[ >ڑC,s~ԯ--qKbem./) t|`T΀rޔڕ# L8~z2ׂ͋۶/Ïїa Ew@buN~ :p+Z|/A{k?l+z&q4Ev2eͲ/5WQi\ԥ3*О7UnTd :& -B)g*zU`x#呬$1ymfW)8Rb$!霹VY&Pqs`eNkuK+Ydgҝ \|C?J`0͝O-mpWLPYf濚|K-Ete#4Fii16.(fp5D %әH60&bv")`׎Č9b!+fXYb4s#0b ːqRr6sBKސ5ug9&quB 8eN7:-~ؼnph\;! 
x<&7)~j`l,1)|xףϰ7t9(~xs u#Lh7CwuCg_Z7ֻNtj;cgs!leû5zm%=絖õ|wṫv2jFe'Ğ&#Fh7b3{eԿEKp>s3?^dp#AE)bLȎRJlE]Wƕ,QW6aIK#gr)YhSƔvuhJoܑ>rS:%wPfcs\G ;H'$n#pKY hCZ/[[&H"WQtx\"o2Eus`򲋯n)U|P|^r}}9Qu2׆Głe|DYH ?Dƒ!˔0*'Ή$ Nؽx}9:3.}jϬd -iC$38.K`Le@/jNgL6st|Lpz}H/Id F 7P^KL ˮD!.6i-.0 /ޜ&3?(-}z#bܧ]`;fE]f  ,&,1H^Iތ~Ŗ8e+:V&YfbV6p,q[^*ro\xx@=#͆gj={Tmu17ƇVFRg%֋V;4^ 6&wj\?\NGubqջ2xk bvWX/yuS&~hժoklz"0=8Jn̹D=/?a-Ӎ} {~ϐA3޼ްt/ğ/C+B =|/<6!*|nH@ dC,n;9χWS+ 3Č11cL3Č11cL3Č11cL3Č11cL3Č1m!["o!䦣z8hKD ?fZ)ofZ 9WGe^ =ɏSԦfc汉oA ؠ6%SK]:Į T#dv:t59*Ɣ TYL77 t͆L佳&G;iy/Uf'&z)3_G^}=k%==6'9ˇjNq7AU2vqRTCjb]Atфl %&(g9)zn{rQig[9QM"OaELsNu!2Bɉ%.9zJEUh  lM[u꠩xQ{Dͫ%TԚXxr.b0tԳQuJjvd6ԯKUF Y붷?eq)e3Q%SKJeVS&QI?519W".$E%jB]FEx`[c`V?׳)N⤝_jT!6:-Qc`#4(S"`uL3`;.㤖QK{blϥx ?j gf|jbp<&CjavPE"I-@4i[zYVK"/zu*MDU>Y\ e%x 61O$ч׷?H$!O%|q@̖3LL$qQxeHLr%Է֪jF ![g 3Ikc Djr^Qݦs@1خsX ,.qysR%%F&s#cKJ͸>c1{ zlvC>كqilڊYVh!ZLR!䪵 G}x켪H4Y;U$fQqWW暒" WZLO۰EsKyp,LJT؁3bbz[ny2*la7֛:Mf˻6놌>] YloNl0\s-5Jh4ds@AB3B4s%Lp^Wv:TЂZizlC:)jRCj.UeO:zMisb&ZjjOQwLIpIt\R O%%.Ū$#b%HF%F]/gG -DȂ %Φx5\TvhNRl| RYc"ݦ=m#fO"#-ݢX"8 IT <0"`V(Dj)"ؾ D%6. 61^tٙ)W\#]n+0j&zStyIjK1rJnRr]v&8G 2ϦRiN&c1meW\5G؇.>]<^q=ԝ|$`žp>;' :n(j~$ѕ\ &qUlZbHq Pk'?;>S;zǣI;攪"p}O֛=Q,%gSjjla)H $CV!icY, cD+U]Zvн4˧TxdW-fs?9x^w/4q~my\Muz:㞝-UPHuǫxٮ*q^['|̃N{: jHyW^' <@iY.Oh?;;@D>aRjQ5U(sU: e(h+*QT> up:.'+'3v΁w)j ]>[$evoe9'Xn"6Gh49UR^4PR5L1K1hD#qF]ڶ[T(-S}̟MT:D8xe]_(v v]شUbODPlH Z5EP> ]!JAq=k'Ł`hM2b)GX4.ɲP)Kڢ`@Y]/kKX x+LPp(V!R2E%rPܻĨt ,w(Kt8Z>1^""TNl1u!QT*m 1;UFldȺ| vm9i׋EA8kޚѓ}R@y*(PM<“ܗ(PmیİolRO']4' \@íkcV$BfKIk/vMǟ =Rͥ iAWYٻr#3;XZ94eӪ1 [o8r/CFNdnm6"b{7l[l}yw#{w1]_x^9f˾\CXZEm- *g6xT](|s4Jت帔FWWՅwƀɯjUާͫZ|WecS,Q\s4UPZ0 Et6j7]{ܾ̿_Rz|ޜo,2Gak/Ux}]xQYVM8VnYR~Wz$g&vڻn4&j/eSb_ߑf9}h'k۶Z`R~&5)}@`mM f[3@'Q|Ee~px5h|$Q\;0E("UkV{+נ MkF)aXWQh TF]l-A2,''_pk8SSOL|'$Uy{7V?_AV_h8KyVWu}:{!4"$k`1ShӴv-SY;?r.?y<. 
zՂ>q dW1AĿ{m(b[׋vdT+KpPZs?col#]BtF'U #FibrT Wb0s*@56@p6{NzӍtΗ*ȴwӂC#`h>fMMq\+*cA1YQ,<,Kn57S (iNP}&]jTcѧRK8g/y˜|e]_$ۇnrj$1f7<s/䞡mlW1XJ[ȃ=ܰ!Rۢ !9HxyCCUwJ<tsgn4- C#E-=a3@p;{, 0`GHv23߯ؒ۱lLr3խnvU> PrVV^~]V씊޳1ٮ]R+ YU, @|*h)F`l } H`P1C:YЃsl/v5ȉc޺ZkqLw^jnE_ }ㄎ NѵNy}o?SpT5(u 8* BJ{T pO e`&?FNڕZ`|!9pٳǃزHz*U96su`>1R>]%Uu}*&[9h|[PTFeTKb̾x;oTEa2.ƻFo_ Y?HZzUZrny&~<(UOg4g:e9gTϋ97^\},OO%Y= "CdL6{D\Fq^nH*kJ.pac#?Je˕N|wOSf<ˣߜ_,˕z#7Y}Ӗ {c~#yKbijt=ͨ]?|`(``YCն)vi%u}/(+$RΩbMPF ZxCMX]ؙ Ѧ1VZjpr_gZe !\5w^a&ҭ9^+ЎHw>yvOȥHηȿUz' <ih՘w{0ep=Odrܴ}:>?]xmofs)n_nA{%!~P..5@+$.k}=NU^k !t^c19ZRQp!רI ZE ye0U*l fUrq|'yG""ӫYPDaVj$#99}ie]GbTёcԫ6~-iz2OYc?Tj SQ De XY|}C͹*VON(Eyuvzz6ÛYzeQ|W W?|ͫ]_C}Wr{asrWmjl`*am\&Dʥtŀ|g\,(Wh ë,W{tէy.D5U;iҹgyUC. Qi"*"p?$P; X9J (pd*;UQ<8j!'ڕOXӎM5}H1劖S3I%Q22WK1qgRigHB=dur}%t@*c$?0$x`h8:U&Q VuRѰUj!&,?te]&n`VDb:})BTT &HQc 誘!UJ-ܒ{֜ut_A_)OByj#1T}-gc4jC-T&$W(Sb)Wl (-O\ΐ!1br wE^͹P˻%T3(qln_O|xvj[4:ĵ辻eLy򫭗\J./bV_)]eH*\ƈ8-1Dm^v^U,V" gQqW肯̕H0.Z\O!-yp*jԄ&kRM_|g֜=c?v>/ܴ=EC[!eH-]u/uNNN?/Ͽr-ʃWZ9lFlP؜;@*Fu`'Fy % 46› {Wv*Īj.UezoujN|b|ѱ׎v`N,<)Vʄ3Sf!hTprU(b+ZA 5yFXtpQCvbD,Je!ŧ5~81G5>W3r{x&8yĭ#8+ d@T i9&kPPJ+3L0|t)*dTU- h<2X:&n&ʤr ={<\Usduo^vÇ' ]ČľmPfc"J)B$aA ڑ&Y$E}N.-ga++wkFS@MwSe񖉻]qWt ݹ npAX"wऺ9<`OCMP.vNxMzblZ2=۲}WԖՊvCƚ}b%̑k+zPLk)"ΤڊnuşJpov{cdviWr !RQkP]|֐I#;[svH*z0`sMbW2e~9ekձcͧ|̍+_i+eՔ?׋_!*msrLT~Iqs]r!3.8RTc+2.ahf(t 9QNѓvf Ƃ';{uR$IL5sNu %omhҶ\0Мel4U-JdAEK~ZV>?xē6{%.VBСV# 9BsM2u C6[kd`;9.cY/%Eۘ+EK|홢ރ x'VU3) oB HR - #jbiŢ m?X:`u"l|;gsVMq3xm@P|4V@~dc̣ylAU6B:zlkN'av֠HPˀ)䙂Ml`lr)|ϒlsƘlIdB&.z0xޚ! zO-$;%Kz@竱T9%ZI'bQ"o|#;9\|KD~6hu p&Pfo6oJ>]ЩL-7}:*ٹfO%~N~{s| jmmuߧ_RΩf@]]nFm9N hJY_ 6Mh3XxwKdLb a0FAvk-B'Ÿ^Xjx+z@}nٕJc2P՚-CU(9+I+Zl U.+*zw 37g2!7! K+|1s(GӔ-|^&x-2k>+tw/H i8gBGACu[>.~:RAC7p ipO%ZыG Y&. j0G]mQS$18q;^V@L9[Dhv) mmm8~smzJl.[t NR99".)AX&Qd~2t^+>~x܊KI׶[ *xUdZKІ* PaE2 )šhq!De^ZGw'1Jr8#B>tue"E{  eփ$T2AikuOښ ƣ:|2O('@ .aa":R 1 QyD-(q60: "F8{+(x9dDHL<_R=l%@QٝC4qiw4R9ɳpdmz E(Pl0~)ߕb-?俨0!Uw}Wa(>;Y(—d?^swǦG3dmj8U~F? 
?bY;0ke= nهvݚҋ7KN'wMjmOQV@ӱa>3MwgтPΜo$Ѧ qN VE%v>&=x!=iv )qesm7bh@u6PMQM<\87}̋2o7@ՇO#Pfm%lUSrբ-NL`Ъ&aߗnX_c'QܫP~ٛ+{$-|v^1GN7oqO/p0`/皥YJ7̶'yZM)o''-DpEHbq (1($YxBa'}ܳc'Չbte ;PM2HˢL(3ܶx@㝎`xСm¯֝zR䠭!}S=t>:ڝЉ9sb>m&.tv3Vzm)8 AUfZb.X!]wv9婜?r,_y0*D/֪8 J" ə@qL:\\v`6(N DLRM~4N!4!ʁsC`U#g9F-c,I99 xEsꅀM:iҘ6ZbMh]L1&7N{; %eů6-xUo;n{s̽c3.)r|>--|D'CyՂS.h Ka+u$ :# 7o٨߂­ppcHhs yKED#IL"Yp\H". `c IJ8! J$'?L+Mxj=G2Zhyv>8nl)3@Lp7oy!#8ACvmz,H)+*8J%bϼ`q#T]:_*U="TPӱҦ_\qX&9N١ ׂ 2]Tr8:E R<$`IOnNNmr1H Ъ mfM%G/{b+ mg¼  >˥szVJvǒzJ!<*#΀ m%3fTEëU^TB#p਄6Irk 18p햶|ڔNƟA.P;gvG?+jemkkԮ3DD +D72TN)%Qa$PsQpQpF8FȲCVQPlЮq#NPIZ !U\IDe,%EjER+j;~rBT_)"yu#vJw`4% իU<.*,'O4Xmӄ΄9_.Q|Wk`DHS<(*tc4[<& u' bmDl4X3BA +\V<:ɩQ[iV4OS:Iq\x2E+"Hp$E>褥qу {B;Q[̞ʓ*Қl<1q5錉".h$%Mz4oΊ ͿԖ()Wg5EPpHy1/|2*Dh"θl]\oDϫXg u;w/WqX`A~! EiW__/-+1aBIdm._K`~o{x*6m^iEr4 v:ʪfm?,tB$/$IZ@]2ŜDHE*'P:LL2shvMW;h%I0s郕Z)C*<#tJTă@hBZ% $g`#Ȟ!y>{2|\~*𮙶ehwAz9ñ |Nm`PɨJ*O(`p_#pjD~LJG}b;T@3s@m J"uiQ$QJ4޶Klc9(1C3b NPΡ ,"e:QD&g[Ni.pjvO(/7+rڪnlp}b;-7+pOhZm-Y]6f>[toh wN}̣+6H|ow,6.-Oѻu6nwM~H!r1tkJJ9k&] =V9ӆp!r(ggr sojۇϰ}ht~N>V9b8V"lcL.7˯oW:Tfn')D^+ 8.|W*u@pX^c:ijX;rU(y䥄֑ėjϷ˹Q~w@!=G_hTs&yR8*um3* "Ոڷ-WVyFȜl7 zTF{a^|C5#lJFDdC6 crn'Pm<ٷMFy0/0|Um}z;Haoo9.u9擽rYK2hnH̢ޠ7nICJV5%ACQ8m8#<2 ͌劅>qq~0A/?|dM8/Xه7& kJbl<@HZD^ؙ(&ou0ja"aGbD-Bj{I[W׈gouÈ]N<Ԟ6;E<+w'pq5^Rh <䩈s1բ4N1W%x]5&*&#S !nɰFkw-YF1*LTw*e2Nxם 0 "Κ{"nE!Egj`: jo.: i컑!^!ЙÈI L R0mc@c`(d5pJmz ;fJBڨ_9 aDsG] 79m싋0ahPB cgskȉ( fgd<ܳA@ Q_67U) 5.]Jlx}1NtkʒY6 :kYB}$2e^@O;q=mc\_ .wfdRBJ)O~F}D|^ %T/-R'!DaYKJHa猹(%p@9P0;1J>#n<7_Q [#EQ(cF }N cl$˓<2q编;?"{ }߯l 9?newO~]e;g ]"%"X2UrՑYimjce!SftDuu2^}h>F*P4Z6ô>,ѿbs,YoNV"xFE0Rl5(W .G- kE=v퉒FEOg%& zTF)L;b7A5;'Mዡ-فӆrЃ!7hr˴XΪcc&qMnWG)Q8uE?g.qOO~FӮzEDQ.qdZpe'f( cT5_$!~i{,xWXge6LhJNةHp-#ުt,\PHiASv`Qhh46 >{df\#j@RJv6m8],_itgaK1P)ܼ.lIqEjd[LؐWB-K jSMoe_#pAkLrukQD5gݙ6mR(J+՚GɪVtc]/Dbbe*#mŖNSh9  +pLMxPR͉R Sa( I~l!Et$䀘#bc{ޗl`M"_l|ױď0e= [s&9d0>|j/CW:/Vg7d*N!ZM7camdwlY>ftgPʖHP뀄5䅂M2lË6W{v}0)|ef` he0w 52m|O6-%eϱ|(hHX=À^ W rlC|fDԈF {'~`-b"F#>2ْV2d'+L#nfso:Uǚ:"Qh#%mWء+3l{L q.9 
Zl$Uyvi}Ӷ!K194|lm5j&!bU.*.P#w\j,Oa,罍߮*()a"ޠMmLjޕd!![q(Z+B>$2fMOR8_tl9fYs/1鎹bWDvmŌ%Hň[%XL\jb a;%cUKJ|t]8O\y,Ch0zks$#>}Ԝ$2DenPQ U8S՜,(-4cc2JQ*=;g >C:34:qc[l?s%m菶puhY)\>W.y'-lmk=8Qݓ;]~9r54|]%#GRї%dzs#|6rJ!j@5vlT{bhl`S QgrG{>2L<NC"Eoyu ܀H'}Qp6ay5){޾5Asu7[Ѱ#!W{G|߮M 8]߼-g[#,gKL獦gh%pׅMH/F%]ȅmqjy6,PoCիzupP\ ,9( Sծ2L Zn,zZ9*W;?jf<_;ٿ e7Y/yT 'h򷿵SmJQI ٦f>TNsλW%;:}2D\hk r.P:=VdӪPuDj/@}l 67;pp/opم쫵︾#x~YS\vܭ+VeBʒe4 fVM4?d0{h-#RCKtܜ+4p'}UК淣IXBVPƀ&Y hsJ>-"PqntM8gg_x,w7lۢ󻒡sFs_sv!1}ʥJF"k+>[aK]\7nZ|l_s-g1I'-{D1s-3P30 ޺;^{*<1ec}g߽T{)crt=bG$+gq/2פCKu hO: 4l 3KFx5Oy n(cȦEŨxv2v!'u;|LV۾ܞ sY#B|t[?n?m.קm2_OzO{ۻqৣZt_}0 ,Tޝ_^߀ه:*rf#;*~n/5tkͬ[;Sޝm^&t[~s<&7gOҭtt5u8:`c;:ϻVK{3yWZG?=+wB?{Ƒ?v~ 0\[` ..OkTHʎ)9[Yӏ_UWWQ@/+hSDJm" ]ܑk%!^4޺t{E~->yh sp6#D鸻Yv7si}T˸G huO''-D <$H016 36d:=</36ok I#Tt ²(R$4-{SR4NzuUx❤?Ʃ>rr'tb`87k~)>|S4"z;*\_GP1Im7GiNON(o"g;{^&wqiBB8bLi3_Fs3”ዼ~40ԓˁa>wYBsn,/oC.X'./ThdcUszU?jRԞb>%d4 k;~~0w dlApyQϲ!IQ+3?ΕEVrQ^kfE4[qRy{?D|" BdWg)l ^|(kai)O%ĔN8l+X.xK?k~mpၲ%GβRr'u3vѯuxSJi7ʍZ)8ɕIWrkdilHe~lS簩7eChHtr´Jg+Iq|YQKgE/%xQ6R#8o)se"d.X w/փ1$˩LT{ ψO9@:Bg9BaHwX"gȵ{rMqo^8]YɽRݛ`{w#}66ȳaO㥇=IP*?dokaaK4- f7ڍkq*o([wi>Z_XGީny9*uzN-sV7lʦ,Z2T供=ouǙ`C;OZR0 B21)#4ډ)&ep$oϜNt0ހ"$W`v#.@ۛ0dP: 3QNk=FA(K.@Zy%7'gw;n򌲻ɳƔ#JGtणPZ>WEQDD)Q램`)J&ijb Sp[ Tpi̕y\>|}ZmJ0ǹ(@Eˠh~H}lVN_}葑)W$Z,Cϩ* rArkh!b 8jM(+"WAsh8dAyn kcKIR8)\6~"7#*݁-ӛ{ݕ ^hU<.-'BhMH2T[e%Sj`vᐨu #W., YxQ<&QiT*'sbmDlG/0BRAh5l%D'*ʒ^ty2\~zPWA+kop/R(A"C Eo1' 92Rߙԟa3JX7E̶] uIknwpӾ.xc;{j\%Ɠ**LhD)T[ |=u^iU?DTaWOWǤg&%gPqIp}!,ANJB!%aq[eki7ҳҖY`; :?ѿoQCo_ŏPį9HnUȭLaZsZ:@Ӣ]ц@-gJ5m6'1 (͵U.գE_#n\!ӿi2-{wǷ+ A˻/ח~.cEV_eN.{96Ny0Y]azlC )8j-BY5q[`I];l"X y-74|YݢF'$~!|/2m^{,m&I 1Kk &\oUh[KB54̈́1598]J=&Q +\'[+wvSM|E^کɊ]]׵=몉i"/:v+'o%s ƧL[Ol"V\TC6b^†_,FTR滻S߻ե4=~48yB3FXM76[w|q#]0d{uwZ0A.`л$mBMko:-/y.HRg-hM'mZE;c$:@'-BdiMk.L{*!VJǻX yŷ8*f.opGKEmJ w7O/Q}cOvlxYaYWm_k`mn mo=z%edHʹF]mJ]njW7Cz XFу %&qGFp&QS'BT] RV7X1ǽ՜ yOz8S8w6"6t3?/7 mڵ{:PHBaTH :r;>JizE؃MZa_"7\RG6.̇bpF??8ٟ3\WaJa/*x&֪zg9nb7/L [;o9N@(j_Vh⡿ay3XOiA\ARiМ-9$jLи5Pu̮<\G\ٗfs5Tu}T|E2P*ike\yN%4kKRNh@D@S9UiD:44JQb 'cj2D&eh 
IĖM@fn7Zcϡ_Xb10<ا%K\/)<*ԹͿ~:K<;EIܖtޙNf!lW{Yދ(u:[$$*rbta)[yo5nx˶&) gj8fP.g”\` ܚ~)egg?;Nm@[y5Ä3Mi+kl930}틞&^ڨ2Z!HMd'm0Q6BDݽ򇘨Ѳ:-Ye&CdDJ0nvĺi?icM;l IA0bҶ"pJJ$W&Y" 33bbaֺV\U9ГW(Eb܈hK`P{kΧ:ȰnGbAYC248>F@M9&iM $LNI41Cʃ< g1Ѽe V h몢]YߛJD; UIg欰.̭^:5`^HF爁.'iCFjFNIoP *T],v/).)z:w>rn,h*@]eQ(O HcT @T6dNz2wZ_JFCB/|d]4cR)H?ycHACϠjJ{ZfA8>xe[qv{fOJlE̮[=:r񣡣K/ƣᅤQ.F+- (jjS& E#H=82),!KSbbdN(%]6{[ 9W}- . R.dQ]>9nq(`9q[ <389#.]=J~ vU;hcaup>/TF赓U4j¬).J̢D-[Gggv%:-by{q%z| i{6Sn2SK(ـo~LѢ||d^g҈%O[=XMrtDj E~`5/՘f5XMlP"V'bҭBݍɠ:Ih$0)rTo<$1JMhiVkA[c:JF]BH6n|vƖy2 T{ Ϥn<ᅵwW}&4e|nG8H&* -([mk1V 2 qw/S]o*뚞y͍ |r=7͚Gt܍Kwxٕ+uZ]/ږ>=V/;]r *(?3%-m˫_ }qYdt`JC%y"'r2ZEgV͑䅛[/u >ڱtf/@a0IŒ&N,3v.6ߝ'wn]YUޯ-t]*2RiDD6RHH֓#{zި=CV:bve,bWIi~/omF[jBGLݫ0:;N+Qɔ94LSbsTuELuEWԑPް0nM5LLxi!yeR- zaz/5ɤ;QY?k0%`˥rZhvzz))Tr&Qfv|\O}/L$iߚ{WI{Zbϫ^5 rFUi RNT_ VB5J*5H+ *￑ȜQ&(M a4/>蘆@i]QjlZ\}jim]f~}%}_qիRgLZ1]݋\Pط ]͔ͱ ! e&N2$)pb),yЕ*sxbxECH8uED4:snuI*ӳ cv<5ΟNߤ/DŽrc0:-fTNԲȸ,%"žMbF+ |t0nE5>ql"XR'L7ti]1sq9̊0&5gUd 2Ak,P먍v ؃ʆr} ai!b#|ӭjۯ |R_I1+s=tcѯ#Wj8%͝=Sב، J)IvZqĞEtá?CV8㼢J֩,0_U}{IgW*7~AS+qh$\-7fTw1]؞zGt ]cd_SHh۟UY>Jf=t/Ŭ]AzuO=qn7ك~@Qڮ5ΙFt ( %(E&\TRG =)ckcGdct~n־-{.ϭA[u53-iN=̂L'l|#IVrD"vK#Q$w o^fWY{T[:Ls&H9! 
I!e6zJy4IJZj9jc6x-T-u.Vr E!,2_Rj<ֲ MRҠ@)K1Gc8a)X ؈>[ݰ4֜HH XTV1pI4!Ȕ1g 9R$Q(=C7 Zc971|MpeBIem)0Q9X#O9l]tDӛ6V7!5 A~%X" tsr虖{&b2% z DQn"n2$dx-*xE={fO&LɲUG pL)y:Ш)u,:PWП*}Dd$O;׷Q "g!x55gQYr#{fM[`=?Hbv 3)?[tLK9[{| 8K5xhFOɩ؂S#dHE㉩,gD赃?B0S0p.Ɇ)<4+b&eY]q\44e\+)T֌o5֜A:AQv==T|l68υJV_-|Ԙf?|QOhB3/ mD4mDs4u484fL"F,x1 zl݌Ev,S98K(vYo[yQRóQ=C]|.i@d.jQ ӄ[+ @tW"^z2&#^[Nk SnnLxԄ8z?Ru;4  )Ҭ/T˒iR3g {omMƚ !=r[zd=zV*Ly X%gbyO-y&#y9.q̞ Hǣ q!'Zz͙žwOR]*]⚞y>ߜEk]gtYxboM+|9kZ_Y_z;K.G-9ިv C&YZr5~o;Xⅉv kN-z&e8`H v)ʪcLJu#Vy)sʑA_4TN*`$1YdG#^yG2Wsl)Եڱ^UC;п|^2Z NN:EkXuzpp6_<\uTǃsbj@#"2󈓐V$P0JiD+=Wۥv~U+Ǧz`鼚j[WC5;Gtq.aT -ʀͪ&xl_@PCi+Fa8 :A(`QҿֵӪVuJSpOη5NC(w3Zh쯍&UTMq9뇄xxy¬o` >_;8n|a1?ꮽp:oذt~ gf}}Q*Ć䆸ِg ƿNȭMt봋[ROc ᅯ&۫tVf{V?ͺB,I$xWj]y%iG %At8 (t>IxϽ-#yzKYלIR7lCyO˰rʎ6%'mia$7L4\iζ]6LəE",@&RL FѣG_]Yo#G+>.Zyz1la`f_h,22"%mI(1,JjUwUqHuqݭ~#Ϻ֟d7 گ3 cS De.FlF$z}4*Hgmy<9c7JBW%~peY[h;hK~JW&(=X:PfAY;6˲CLˀQta)E+E,F`ULjˎh .&eov݈.DacR':cgW Ԉ!hR^}+:܎|gT+W:u YF$mC.YX%ʐ|r "ڜ̙&VxGzFސ'㼧)s{ ID/X< )h'ej ^ObD&Sz*'GG߾c-dY 44Q`Q=mHսٚ~zZ}G Pcz)jٝ\?F|W^tw>;aBBA,1n'΃Te aPNš$a0~HXHSTdKQZ:RPi K:5h%%6zslo4۟O,~}Bkti!IXSL>:^RYQZTvr-Ç~O[1\Dq:n w7C/_:eV)ՓPZʭ6D4ɷeYn>dɲNð*?93壳?t8 /FEM9ںCi;q̘8Oƍ'4'u=*rvAjw ?Tծ!՞o!5vx}[/nΖ/;oĎ)-{ ?ϰ-+=A6zmbw܍yf<UBͽI\8͟oH[_y¿ie8cuYx}p-;i9'myK*ϔe C.S ]߳Otyu~ѽOޯ~oE^Z_,m2^=q}/Yk)ݎ]s;kܫY}q- AT ԥ5ś{U#!N~q?VWHB ^]T0yd:1feّ,?ğ}ُDV7}j&eἨӴW7wW\NiMK}z8'|ucvDooF}@sHov&n bܗt9u(8:+{R󥊛wgfc>[kZn\};CZ+/LTJf+ jp;g7NG|;] tu Oi0J#Mw3'iLb)&dԺ:M_L\di26['b1z\l?t}ꒁ=lxULjM'"`,0`_HPl%hAKg58B5}\ ,S}A"DkuAlfZ.![1(hYiE[ttiI1 U 㞉~V_[ƍWv}'G Vj U-gIYY▣TZ-v0xFxhA] .EH!E]L:vc)ɗ`Ru-ܰZ4kSOʀ|"z`^Ca.$5 $QM$I@is2' J[dAjy%\J:]Sf'!\~8_S}ak@%M,O..NlMH{$0˞7[G:|(*ƶΧiF7s+Z4F{oTԚ(.;-=ٗ!]HdU~8D'`(L( &JkʀFk).*&&n!^<ѐ O tX}ۥ_oY넨 }!e4[cYK!G`i&3{K %^-9v+V( RuhuWyo}W]7Roٷ#'vFvsOnD!+L&Œ#-1QPSw' :u<8ue CYdjǂA~fq*D@A8mPE FyQ*tR|ةWmL ?ݠAgװhAy"eC M*릛HN fC-$xY*eQ"*Ln4}~0fC:?]y4ϸ{lWrJ6wb5ȣR邝Otg,W={Q8꠹UFHJjK)e`\]^h [qFH,.D 9-]Nx%+iM(@"*1IS "fyU ZxPGl_ |/U'z auEyhWyö>K<z|?Fq3^q]F11` D"G,\XU> 㒎-)@ŐJ5jn3\f<X {1;U!RfO_j8E5Ξz?fൂ>IA%.&)DJ,lRg@C9dikd QQML 4Uߋz~egϹो A*Z 
c|F~Q҈", /O ,EZ#jGtF?i.B[h хBIDV d6BaIl:C6[' Tl^ p jL|e 8S+saLoBlMaRIRVkg!+"T+RWpE/|DPso겇zSFѳl΂$9X`QF2  Tr(Gh2hG3 9T;|c?"բ֜GwSM '5d0N O[)mɦ8P]Q:v[efVo1Ɏ lnwGj[ݕ8OXZ,6I~_n~1-<׸_aI-$IHI e0DI$F 15fK`oAz$ N*~z7[h ~?Im5we_ )&|^| RON&79m /f;8V}mnًhXcWhwÕ%&.~j\蚈mjv-_t]ǔ(1 geU~٬Emʐc仚RsDH H^6r>qp%}܀JA)*Zņh xl6RHY JR!'i5.NMJMYD׶銩6+%c ;/ti] t_-dn <#ϲ>Cj=_jG`쵺 $,$ob4ʈv|h.iF|zur}N]H֡K"a(e㤖;un% "PEt=BF*aP &z &I{/Q%'. ZNO-*A,Ԁ9Di IUbE2HQhƬt, dX,G>b61F]<~Q)V{Y {RGmѱg (1-Ag+'d+mu쩁E^XQMg`*xLK?,]|THI399SoܩАŗ~dGhDz%{4ewg|p?REl%9V6E=肈1t$1ײ#@΋bsFdV?)!,B8l.XD@b [zKC)2:tҚTZ#c\pdQExXA+ { a/:k5H";E* 1H̨Df|j(l:aԯ;T` "q;("j1"∈Kq38 c`"&CTki1!iA7 dNLgݪ89rT WKdӺQkq 1ec3|p^wԾ<֘G-MdfϛӬfV |;Id>]7nN>g]Ϻ/ &Il>[̧Cj%]`L߻ݘZ7;83F'g.ʔ~s_޵q$e`q>CbI8߯zHz%ڦHpܕ5nЛ8OI_|68Luki&oOY+3C`dBM \PUa4;`SYl3[x8νGk5N.ơMMhUŪ|xˋGyZ73ow~^5 Z@Zem] ֐ ($~83R#(h1qC.m-9JNFVZ>Fbu!U7$Ua,|um]?ܣFY˽nY7.1Jq\tfqs]?Y5'.MK)wF1z~V)'9f%`qyY\'%ns'K{Zlf~\KŐy:Uz֔(W(U>BEcv//efF絢׃T 8LN:gHo#:vTJĤ߶S5rI|ˋk|돼Գ<Ls6pyMTB&G9gxX.MMH^N'z1SKk ֻE7}oxkX_7[ڪxY1^فwoTߤW͓[ I (gX@Ʉǃ9hٹo^DBH +hEἮEG Ԗ|j&׋#%"9j*|tr*v5[8<[@MI+ɇWh$";63r(:K׷*nz F[]{)#SDf@HU6kmd劒-)5~cOcc6>mYgk_0]e6p~3M96s*v;,20>t@6!V! ٷt$q}[ ;U6Ұ)lR(dT{JbV:"Z+(]6Z(rBP|ʅd5C)e"[fb ::'!8R2b BH%[G'1) F$l[s ҕ%htY;Wl,5\ % M RV9( \lCY>JcQ[,FJ; % j.U6R!*gmT6%ÿ^_ײ V h뮢]gM@Z[\{=Lt6C;1Ǩak1vQZ+ZQm`Js-FrFi4YZ3}~+εOdO޹Easl.dZWo<~ƅ9]9z^B=:z&td A9\{}?8:aQ6"H?y> pO 1(dh”j J[)!oPb}UGy$V_9+KEd+߇rϖFE0R+/k֜ωn6~hM9_D;#ژ'/&i2k{jL3B*T$*Mґ(I5u4yVfK6CtU +o/;CoA3{izz-!dT]r۽}LNxiĥxFOqlTXC[娴 閡P>8^K/(w2]jAmSBV/w`Ж" jp^XZF{*wϹD)e\D(tIi(U`jfC. 
|q8Բ‘.8WóY#v{Nu=N)kç}G?bЖSHSGr,ii~a#.B6Y$0̐,)!x&Z) y淘MIOq#Z|e9`{νHԄ^9[ x/""iҕU TZeTQ!K5:m3\.۫' !VcI:݆('UPyw򡐤@uh}qU8cҴ>cjEƞ s:fuV!y-eM"2T?`C,OݵZA.e"F8'd+ IEZۣg"9 eBDl+R2 -",g$9Cq}}F-L:nYU.]8(\_7?uԿI7̈́s6Y\o u<;;ZϳEw,ti!殴dKkс~ӑ{W>ZFvq|zJͽ0a^}5gK֣WYƭVfy3Bc_-ؿ^Cs.V =z:"kA ::pkn!]K>;7ͽH Tg4Hxj@t7%mWz/qZ\v1A>,/x.o{ޚ݇gbDe]=xѡ!Z0HY+L&G*\C$8mߥq.]6w`v)tݛ0d(4O&xzv^Ew҆ j0'=1ꠒ7-8 "1DhE៑ތ$HrS'#ɼJodѸ1XyUkqO*ڤyU@[5?Q>PEX|QsR:H2JKb?Xcy6zk(eu%zG7ҍ/NX2nިJ '4p">} (r0 hlLgatT$}{ߚА.%Jmѷ4ĺ_ܒۣPۇ: Cu=zW\;JSɘ ^}0XKZNy!bY,wl D/ň%uQzu\R3U 7Nh؜uukB]'f] n4|;}nH׶suLڼyMZv0]uYM݃YϷK{q]F"}%?[pݬۮ3A='_y۱MUMlJ(CMc=)..M`[wa^CO)3R͠z\+cOwi>yZܽBZ7SZtJŪM\]-V˴]WyGd&.V-]=y?4V+"sP6*RÞk){"Jnqi_xuX7aaM:WQ;oZ({h)r;v eߴc(P?gӼG-IXci=8߶ BjҺٴ>=D6'LvH-jtnG=7>yAY^^T)o{vo>g w뛏踋yP6y~ms{:6yYE&}|≯2;m9gr֝G>_Uּ#sSFɚYwcѓȹ~}X> s֭W#{/<MVMrNx'Tx!C-C 1t ʉ&OP!R 0P"ZK\iN,P\~7 Ϩ)y茷AgeJɯQO0&Ji#ҥ>hgy86 \oGiEL>]W *+A-NlWv_<ʥx"Ң.abZ;z FXƝ%`+<ȅ1*Y/ҁl:Xa$3"<6+B&noʼ(IM!9jɭ6y O!ʼ Gq4fMT('cKD8DȡVWgA{s V_(u2 oZq(dԠ!>_ŀF ]w:("q($YUNZ[N9@aajzaOiҭxJ̧76(7)B)zGw/H޾{{?.2h/@b 'a鑴ZLfhb`NsZOڢo'LN^=T؍lVB{I(۫o'o]״i0}D*yb6Mw{xS"~eɟ?sm4QeN/>pl,w|Ѫ]\w}昒ly%ai@}JtD9*O6k^͐Yˋ<]ezr׬Dt97lfׇuJ#sbf=t]/HgNҽ1kF'Vh_v1)S$VSgC۵y|s_?Ͻ1M˰O4EVYq[<]&˄斖k;\{"u\' p~uelSDznY99gx:6p~OVie2Wzvsʞc_Qc'cܩtsەܛ }x\vܓ7uHҤ+5RبM`SRnUbp">LW]t1物3eV z` I%t&jYXlNiTKүYe UNjȴ^iinδՁ Aܮ@h$BcUn64t)s%'ROWEfXh|"a|X9og.S=1:nK g^c0bĜulAEv&U&h F㲔g aEVs*+)T6}CsƱC6 h}"'j tZSΫG,n|őR鮷zʢ謴9%w\,y>sA )^g<(KID[rWɤvvG1 1NEvNX*;S,KN- xLu AZ!Dܩ`@%ph>S̕*hdvVm8gYkW.̝۞%lPAњb2IR*&ZRI&$K\;b,Md2&P>}/3p?AJGr (%S&gEE+LgFVb9, 8u#ժGz6I3aSy!hPG!-y Zh=GTU9qQ8P ]DO3<56!1Pg҆IѾJ ! 
Bsh&s%27%jj(Wm\j|m0rʲx#4ĔGmgg*#Ku>G=ؑ :le?&hlʨI6C>W DDl` 61=AtoTRzy3!@jblPd(ɠV&{:{6ڷc̥]Iһ$8ݒ`%k p^8 +v uJ9yXΈ+Bb/95ATZ"$FL~N\ is91Q@T\FH!Fϕ\ Dc2xM : Lޣ=|SR&.˃F98J6c Iy%Zp"D--jlse49X7*+V1,}ךWHse-vŒ{b6WwoAfoj#69V$pY0H'BJXHo@"vj+u8nCq&cH346+FimeQdָsJ2#1=z[Q|pgwDАG!5Ҁ J}eׂe0kZ9M;\RPUomf.fƹ"Z[ګB(JT45E%X/ )A\*rHB@+j93qgCjnw"j12iK$uX}38'_g ]@XBIZhYd4&c$Õ,& 2gZՈU(M(bt 85m> I FU$C]pNunW%cu߁18g>ltQr'6~jMvQ9?,z2/j)ECÌ.,zzZe.ה>J7Cl2ZzzGQ,G`|&$I'HCߊ^ w^z#Kd_bٖƲݶedFdTǣ Aj!#D> 7ʔu{rL2T?!nN&^.G5$tpc3&Rr`F*Rv'~&E#(],19"t, #?93.lceU^p(zOiu{Eƌkw炭C띮.+4V-$o ,n- BCkw]f7K|n6i7?9@3y?Lw>oNNTq7S:%"`ڔt Jvxp/$(fg$'\dG<=9w'fя~_hIpÒ}?wi*C]3G;kF#ZEAȉ(4kStr> E+1}97>NZ ֊^Gbn#~{ Kh!Kbe؇KcSiBJguk]mY q9GL tv`th4lfr4}dsZ+8Ѵ%1 p H&h)f'DZT^A%UNC0n U9ɒ M\Vf[%@&Q V[uFK}Z:(;.qX܇~:0 kN8ʒ0,`2Bga> YzkΆzۤfg *~ ]DdStb6pdR.WeXdNzGwY1M2ȫ _^& 'CȈk&LJ"e.\Tb%PdP?NO~By{BRUNY0$}$I6bvd:C*(ed^DU'^&Z.M&rߌ7r2!w&sQ Vgu!9Ũy$0)WS[6@oMDʧ{VV ԃd 9Zh6R͖le9O>ʔ#08H&yUcƶ7 vü MQ+ΓN((OfGlr0T0%5[l6m{*Lr\'|34%|(@WG0mɨAAz}=p+v.mD\J}e+_}L.lТPsvx9v:早,arN[yEs|K]^Stdۙ^w!I2w< Q3"2<71"ux20`bSZA쌉OM:S6'T3cn\5n^@~:6??Gg5_. 
g_#Yptc^r x5'zNtcZe0ӼHҊHfEk*'ӤKjcH!kI-'Zik-9ޭdX s]k2_ަHz2"ȈhB+7 TArƵ9jXVZF>F`p\q"kUN)B(u)ҳ7X%Y}* sHsr݃цw2ޚssf MiTqL,eOg䜴%IO }IO!^j*1k!cW,1)ᅑ %,CFgE$c4E%Aಋ=Ī} Y, iIIa2!%o̝'hY}b9:~ySFo S2:ʫ ^q Z)b uAÒ,9\2^D G$ta~vlϱK1 Q"i &ɌV`pNMɑ"$-T#Kw[Ѹc'P9 P kPn" 'gO[)GT.XŸ"6DRܕ8`3IUC2MmH<$B!#d gk3XB0'`1Jtl1rX_EKhcP;XL2ʒ,NJm$DqL?*Jpݚ5M8]e'%s8[An'uj@~EQY3ufta:k~.BqQ,tOq|N]iqx0ҀD(>g7U]CH]H~<uqI.vźa4.j8 뀁tXš%rNY"4N4{7Yͻ*؅^󰬙Ƴpɸ\d)aII w9oUb\ 81=0Z-n2IG~p;9!rݛKo19rLC@I]bV{rE [47S_ٸ},J] ^h:F.̺NغOE|]7/w%nmY>*2'I Jͤiv_&vAG-!Gip)c]eL p-Iˀ=qI>ʙqlzl}Cq\wf;ci扆-B{'AWZx@FI-'HtEE"{ -|&̳ .lٍY+<慟/[nyZ$Dі*fL9H*'<lH'x8Lj"wWNݪ IP{Ow NK+uV̭4f"P*|zOkXT6[n-:ֆ:j\]Cm=%%yqkIZ/#PN\PgK ' ւ ozɲNrVЄuQӤ I:=CDT.E}o9=Z%z`@.A)2S@&T0EC9̤%nkUc53#ۈ~SӦO b64;gJ.VdҴ;q4yp&Y\E(kOM5AJ /i |]T7;WvVedgٟ_K0/iNצm|I"6EU X[\SfNgI{F+D#> Ǡ41_&k.[Epn+lSeʖm,]@q s`7iNgm7::zJOe&VxҎI;Jzx7fp@% ZV^ tXIt(Z'EDU"d*Q`UQ2`R$i* b$e2q!Y ?n\T=*[_i㋗іxYڠcMpX@ %LZB=ioהOQs?s0Fk2_MsozU4nx?)n]S7.p遻w./}]S%_95k eV7#5y~R4?]xqh):E`R<$2bA;&Cr1H ΢U9 G#Ǡl=˪m}*evͅ{B{2aɄVrs0m{aՍ.&kN$Rr])BH%> )x&ɚʉۭɲֺHov^_#xxU(E-Lۛ+O-V C],,B*)"3J\o.o&5gkzo18~DeؿnPGpWUeVB}%^k]a{ҴIӲ3}0M v2ǹUŐԗQvF:!Җ᣺@) #\Ig$su}SZ]z W7t-h%IkZ)CU-&yl"tJTăP%FNV7]ԬvøOzR|۶kEkmUAŸTQ$TCԿ1_w|ydT\F{_Z'꬐ptr5i ײ &͞tʃTKBha|@1UV& XOeҢ R/o}{cL 1'(wC8%ՉFI!9<7VڱS;N L;1Yi|$ϝrTZ ()8><\UQ$QJ4Z:5qƞw83I\oYKH-( \~RMg (*m! *wpz嫳B@,Eb'Tr_WO58?̸$^Ͳ416K}3@LP YAP9}]t+ c4 XEe2@JW[J2ћCQ_rd/kU_3J\fuQu=)^{*8+= Ut^epUżOkG!!!rPиt|S5mx1蕈PUlꮬd΃u.&7i(`Y;NBKȍMHUYr٬t"BO;O^y/I儕mɖŻK8۪i{&O`k:~(ڣKiihx,T -R3~7=ڀw#K!9gϲzLyQkнQyS21@h$k$>0nY)x-sܝgCP:,tfsQoB3;۬>rųմk~З̥J&(m,-)GD+D.&/0dN(0O幐Q2Q!)j8A8BYx'pb OHUɺ,5qT]x-*WX"-0uNgj9TL$Аsaڠ2əȹqT;*E "k.3!)wLjqY+/o5y'xxk *˼GlCݕ6}e?L>1q .OHi !RӄrWkˈDvb<0xzNG+9qo$: u{E!P`:44.qWHV<:ɩQ9qN8OV3_ ?;97t)Jd@B"' GRA'- k#CrOh[sV =Qc[#WĺZGCV kˮ8Ȏ.x&f)Fwq P4Xm%Zxٹa$&)k@ 2ED Aq'6p6O}gg=S'tq9U}zkM:=!w<_;D!59z8h ^ a5Ӹ%U n-> or| '<|=i1ka(VB_4}7aR;]h^BS b%6 9y~y_k܎n_;dQETX@@T\  =W9|%B(~QO? 
*~q2qћm@WU3 MޫH/)Jkni^ rVo|T9a6EoZEpMCҋ^Y,1;|!6rD ^1r\%j-vA_ [t~6]h%%/!/ ~nx}_ T Sz(mbnt\%E`RSoA߯hg49V=|ٚr7o c9L$gTV3 e電@W1w#?EA.4_R:"WNqG.r M08t#И![Gwr7#hܻnu3 56)'gd8&MLlJfǼƝn*+} OSYDlY&;6[zb8!juhݮqFm/,wZ;9λqO,B}{Bup`<]ޚ̠@ %YHٿAszͅ^仵B2rN])3hcf4nQEfl=w , @MH u.v m>&RJ%‹r:ӣ(vڥoӝ7v0!?%'P|a]Ln>yUL>F#䞯l8Wʶ*[T%$XQ/U=yixw(~g:&MaûEm9O]*Z/RKu3QUjqF6^#E!7ߡZ!wL,Rx-0ѢA*ɚ*qlN}P۹K;pٹZӤ'<0Zop5G]T= M5{*GljǠa]>BKÏї+u3PWȻ{q f+_^.y$".ܯ{/JͣVwC4?f8p|폏V9>;kӆ+ZW=leNY:z{Uwr/x*s"T;޻kU6C+۴9Kȋey58{(PjA TUb2Vc^s*$kbޜZe:ŻmCЯdW؊kt4[!dGWrc藓p֊[+aՖ}A B*޲l',7`$ඒfk$WRo[n]0~;uoaH=jVtd Y{,K ' Ck~E^KmJ-I*̇2OsBΙ4)RexIsT9+'.RJ&e*O eLu֜~4adl [5eo9r8XT"4-4q$gAR *s 5m=| N[ j"':Mq )x  >|,WBMqi[l*:(^h50#Tv^hqi "~cop)%)}p+$-MD`IFJPQ +#s4a %qc"60+kÝg) O;t*_{[0"2 +W>iЈ:Fuj|wlC&u> \0bas 9 A啷1Gh*!dt6iZ;+ n?#&jUȈ#@زوĊ='Q|\ebO4yE52krI@*㜰^EaOBCT7[A<%X1i))9Qxw$n IJ!K:`z#@ضHQkRpuh)E 0JQg(ך1|!h֢i)K`r@"f$pQ! ǹ*D&LQ  ,BX`m-!0سjKk(x#=OIlZ60T=NrкdJ)mp<`-3t6FnUdmQLȽ2@t=oJ@udP8(xksAuɑ(.XheN+2*Q[n` k 8uҴQ lgdGN 噴UAŒ5PbUhrJ!ɠ-]Ӡs@hclLzbd4TąU E+}klh33昜*zK0JJu X-Ұ1kP sncduUc(EH.9msC@P*ؒ*` m)")zg4E?!oVSd \"+dP4WL%Xwl`@ \% .~Hc`P*fR_HSo)è|"I\:H'E $;&\72wbF ɸzR !X !`GdEhH}iUkI#s5FaA{ŘE10ts\"1(l1IeSI!0δr*Y2@VId*Y){h*3jƀ(Q.0QQzA(X5h*VRP6 ׿8U ;ve"J.-%u$$ExMڈ &1bqѳV+D4 5J! 
RNUnʰkQMk?f^`1A x3V MegU/F%Vy' Wc@JRv[ "1ȾAfت3&]\Oge,jm%\V01mx2AkiGH%TԟU[БJ'.[$ZʕUzTc AMj 5:".Q%(|"èiYQ`rQR0~cy EduF/qkJXG'R% fcv/tX^gHYtg&0 ֨H `f)T[ jTrV-u-=Xm^ Jw afs؀$>o3&J X]ҹd]K~c- yye:4=oʇi˲ʹ2V,ԵEnBzˁTE\1 0=MKQh.֐D]k Uئr8JV,0z;p310"g 3|9p7 JxKTM !u>fTs\@7(⒵Z6l!&\[M:_MC,cdiP _#!2.P,9tQ@0Z`4rMTFSq1yTvիnxq>a*5غA/-kmh%X\/OD{fafk)tjTxY'}]ܗ2nov=!=N9RgS j/(48%S/^ + %ǨqOJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%ǫ@Q\F ՋWJAw}J Ł )H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@H{ ~@J =3`@0+}(J X+yJ X)$%Ǩ2NJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%ǫB@Q\F $1b_VJFJQ "%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R}DJ[Zoћ˺մ^_67/6N|(/gu;@1g@`&%/|1 _Vc0v>?`CI׳_\+͖] 軣1s Eː˟NGw _VzꏳDot'ݣZG5-Y]#hΖKNޜQnqFuu4\8йvn X!J˼]fƟG~5*YV|~DkdtVҏt|=Iuf6h.v$Le&叚VT'5lia/PHW>zWSQ69TN(ZW_߲VrJT7'j-hoQbYH/~JFT}RG>[SM+hd[& Iaj_@^\I땠}JRVDIrUS7J5(Ң:uDOq'#w>o1V+փK)7rm*'9 Y)hGu[TF%曨Gs_JD[kLk}?%my_ejK)wwd6^vdd _m1ÚQ3狇܆%Y'  rB1AzVimUBC%4g[Zv!:V{hrvZwZ9YC7 (9aw59=~rvy0ɉsÕ[gFqlɩk=9\\d.,#G?M}Zܼzx}h᭖>v[4N}[q([l/y~qP6t׮w% mx{reJ;p~^cN<+,^Ŏ)61F`>!~yb Cyu//sFr<;lmw)4 /z:d?wkb^/o~>^M?Wm[7NGHzFוRc^;+ufOJ-Eix *%83ӜO;ƛ9ʵwym'3(7^DvT,O`P>aj9VWJ)zYM=LN>K$:3?AőգpWbq⨽E'Wݭ&[ť>C[,-/+gn\#y}W@ x$rM Q4ş$__Ӿqqq\ξg'[qy-8wq&Ún\R}J-p`7Hb_Lm=sHJ \?x^`zs77nޘZ_>9oƧ''9ǽ)&3a@ k-,c \zI1m=)8JEf}Emw-_kc6ZS}_2_#2vڼ|BmHC\_vx1NG>溦I o7:@ZRGxoΤ~y,TqAV[d7?+\9n>Wo %$"ce>=8sgq.m[*{a̭Q7yeJծd29z^jLÒg1 (twx:^\++\{J#&vXv(泌ht[xOξLoӸ]Gۧp] gp+j?Zݷ^ⵍvkmL<0aZښ5)?̺<ĺ ֫'\uN5pd%i0I]Vѥ7(8AN̵ʲgh2Uk=7@*[)ԁO.>o/m˼rnl{g]T9.\#1'lhiM%I}&?[%Rk*!zn22U^x͕ '"'ax}p'kOާ8iv/,< ,b1O 8̯p/esjA޴7f&&&dtXZe*WSZC~?DflFecl/ Z,A>ڼֹmf޷XL'\.0Zh-5@7.ZJ2_.c\E/G .{- Oƅ 4ٛ^dwYhml^cBx]Ux;w5$a)s8f%ouf|oTp9䎸d9DwiMY]sY2om:)mw >nht{etӍ?=]1ô&jȆF6?`'KX1PjyoytyB&{|gpfxv\rvҘfss&~/&':]hV1l~26Y=mGa1wk3Ka?aiWjA}?AEؔbK i]2@H2^pyH"~'y<:cݔ( ]\K ci7/g׿/\휽޷}- J}4n>D  OPaA@T Q#LR9g0x/uJsb*$Pd~ 0<0|v۠QYRDkbLV` jZiog!r6ԩ ܪ,MB( 8 ,LrJa4!WfȵBHR s ՖsDRfxNj!}b+ bOcnGPYq؟{E]~q>mgiSUAjrbZ sl02(^CLL)H< c' ֐eeE5q{B@V*<![Ў%Cj%1jɭD؀/_EA}5aj>`x`8ޖȗw˿t 
нkiiZ\~6A7mz0KDfFj:ֵ%ޕ1݌xs{}9h-)+/}@]a{aj[Mu쇵$Bϻ4ˈm=s?}}@DQU_qDl2N־D\eѯKӿ)ve%(L<=KׁGeF1誟+)dLW}oӚ'}JxD9G6kq3aw.W1w]JDWzC'1QsCsteլ/HG?W;(b6o70>^&gwNe杊_N3~MI;`IYJƏWJ 3;SC?|uWt5B?ػ,ߗ -mO]F z8zHѕsYLp=`}|%6訜#rV8>çNLLbJO·#~SҧzjkKUgUg wۙ }P*9ICf3eFz?3Z-Vѽ` R͇]NX~u}zbЂ'.{t&'ڨQ#&|T^ !"k"CstN.(65hƨ,L%*8!&Р)uV\JdI ee\ң2"V[aDrt#b~NjRr*.ʸ\pqw,I;%"rI,O6,sX7X?<%⡨'?U*9jS}})At6( }c,YOh;=m>xdb 5zaBI[Ml$VFOEzm )I>Xѓϡ<.rN>qG-(BP4H̙Z0#sm]k9Tonqi\&9\??!J)y\r_fc'nnBU]bF d_CL0*RcZoeMM2 ⼉,Fd׃_Ȟ'HqsA9&x&A\s JKe%Dε"nn>ovNu]Y&fܧg ]@،BIڍhYd4&1Jg$脉+oLd:jXvQzzVg.VSujr[lW++M~Y,v3'U0[e@hkQ ys JYfu⒟ӉFlײPkU48F8!x$$NT @9&!Ɍ=`2d!x$ʢgw31l ]N267 !o/VTλ6 ?DQ8*(C7 w{I:qOeyΌX1ՠU*9VHu@|Tywmgܭ uF]iMuvC(&TD V5 % 4FXlϝEIk),1 5Ge[μδTcQcFcM9 V1m2Z8&0Aer4DEϘ TsD5پՖsD%/s\ qﰥBfp8Ok_K3]D#oeQv>QWՊ۸dodȒgy9zL>z8(KI4D [Msߞzvt.G'R17ة@礋j<&e@. XH1-JX #wTwLd%phI!5)2uD,xYYϪ-H=ksR3zffϠKǨ '!;+&RI&$% XP1QV&Rߊ@}NƐdJt/sLJp2HK,6ǑPdV=n&4I+=e^3fi!ˡItG! :B:i83~z*qV˪\QZF21Pf҆I~ ΐМ9$[%_9 'nؔ \gS|-C]P2[2cIH..IA8om|kGVI[B%њ+dl]>{h6o,e!{9qʆ&rP,Hc &N3*%MX8o';EoS4mM`"V'RK(vL_o#7; wpzt6ڌñB@RD]+H &(ɔC ΁^?dӌqdZATh͙8m|N9յ/\,Gn!c!c{pdɖ>꽛Vfۤ\TlEzhyg!x1y3E{8C9-].yLhjʃA)E Y&T0, 2: 1:ID1HKBSJ8 ZTIk^vbz\ V ]s-M š$2eAXl NrNU IHU^TC屙P4(5">;̩%afH"zrS2>:4ZT¹MHs,<"ڤQjLV>r=PG@$قBBY,! p`(Dl98RK +23\H%LCG?c#Aq'tF p2ٲd%YZBhe y=3 TNi54;Y,tqqzqaҳjN书Merǎ'dIשѥe4 &UJJ;ѼYXMWTeJdJV9G|C@X40fּiCNÆ*=~43k2`50cK+NA5 r&FwEmR(9qӠL)p-V1ֽY2u:"FR g\2qtΎNe )gGoń䘆cbxrEZH6n鸽.Bf9~i=#Yl&[LslrS{̤?.kR4/MG䠃QrȁƿY{3buٖ} P¦/̖>5jI3p:Z:r_ s1du>*o1C{Dv䓋'ꌌ@9YvriLD)$cϐK ',$"IueQqVЄѐRyE~Xe'``N@-"hLQ^vUVV}Bk2SCYIːwGHP63%cI&Tĵ\WV*r{y*FCM! /n/_ThR8ί:SոyAzP9_}U-<OWiOKM>̮Rf W jTUO ؆x>O8-{tgaM7Ӟ_̙~BPX#Ms^n$dIުf޻4T5*'ѧB{^Dq54"MGS(^! 
aLpDq0ԧ#jcWW~赊gJ΋q9v7)u$61lB%fɡe.$kH"9dMw+Nv<4r/~u#V%f YW&op3)q$$^.ųjbL2竖.qޭ"OTƱ(&3q4~H!/??> gMy*K21 -0s鞵yݗY(gDՈ{d'Cۇ,HؔOHpBABiPՁnEMO.)XΞ>Ř lum@|Hev <+y@5t5j lPF^xdAL'J5)b֑[NwzZ,WdəEr ~֞jRIJ(E%眍j%!FI<'OJx=EA>-[֠yn}{]w,^ JA&,*w?>ͮ T"P] MNr糲8 dJe71+IF<CGG"hvj;(^;e)'񀎺DGfKtDb.&~> ѫa5ApPD^ Q*.Knvi#V=<PR-p&oщpƸo̔( N2m{lE)3E5M|;Z@#Cda{7JӢ沭&k;y-f'&(mՠ綩@4{|ˬh<.{~jHBf et6/*\ri!Tz~qu 5nz<ck3%Biu`E(a]Oc(EKh`\y(ƒHobWa] ,o5t6Lx58uXcU7EAU>bqZQ*$HkY;t9ωC aY}Yk>MKy`* Z&"#7߻GK(rrdDͣxdlNix0:@nDz?FcAFOgJmG7&~4J u7Yux6ѩ[;RD p(^ xBj['k#JK|s;{xNYbѥ`D)W%pMNԵHF>{ ꤱ0kJ!B QȎhO t_KͲ#oGDf#_f Fio ָh)4"@pv ,*60:-w\ 4shRsM3`BYaDM2U Z]ėE˳ǚm.tt%O8RY1.thhh7Wg\C\-kz1iIi^:dZ{΂` UhUJvdZ2ӑ&[0¤T)S19'-lj઀TiRS`Q1 Lhq28pqNcXyRȠe &%˲rPL J 70c)d!XmiK maP2m[(2#χPʨ[sVtXY7C7aБ? N׾bH_"%8TRؙ5Ti26V r ֑tҁGJ  U;)NiI9nXT57;H; eՆ+mebE ]Vk힕uEx_R]u =#iBիf@!ѿ&伖H LRD ࠄjNci_dY\r#;^ B9TE5M&236~W;Kk{1=͘u48U !ؼ(D0D!1Mfv3:^Ӆ76ϛБf%SdJW=$ V*#h`! (yCs˜6z_fVcQBg9՜@$rAV 2*e(6!+C.NKOUg4PH}辬(kIv u<o 7Lt#+PZV1hFyeD;H6VE 6HA/fsP6DD —9t8$ճETe` x(%gsl{pH[vVhs8zNwVE̓;J "4)Yͱk?m,tf 3#2a,*fa!dE(3 β] ˉ6łk?J!gѝ8MUJ6:&D[vv/ң*^"ƂZF7k@ Q4/^m:hIYT&ZJmsyX?h֞y<]: I0>s#uuT$KFQ]L;nNN"ip$ti6r&74Nd4 ն[SQ5(U/ yHY=yp4vM@Ƥg,io؈-g' 4 <%2`rh[SP/O(7"Fh![^:(Jj]d*DePˌF-Bܰ$-F*׼|;dI9>$N9?O.|62o;^Ef;C4ϋurrls3W/~~|;]D--e?*%I v߮}랼{O V rN<:jNi'ahTJ[׬* qQeQ9}/hK^?U=y=mB⃇󏁪o=xN@f)upr/VjUtݓSymcPa4n5/"ѨSfGDޞ__NhV 5SevzTj}CtPt='ͧF>*.ézO \_٠-BmaO1i[1jn:{j n @+G,iNxp9M%S7G.ـkT9U܇(;ƈdj D~Q+J>Tp%k6(yNvUMLZ 9`nz{Zq[dL=^ᴭiE&I5M {$~y ϖ{jmbҿI$z@p`n5}߆+%~MW|>h. 
>Ӧ^ȍtݓwi'R@SMWծf1!vF CSтo랼{OLηY9nMX࿷gm7zP^ ,F5v6UOsψ?!(IE#KbXJw[IšD' Q!nْX\rnǷ~=<{y4^eWfbYV<[.ƿ/' %k{ͧ2mwmga~篷6ǵ*;le7VeqB5WmJ,vk ͍9oq--.ljq+ş(_[:Y wg_մE'_wW˞{iV\t"0no24 5PWV: Ѓ=7eۂqs/no޾qhG|乂Oo튷uVc\EZғsc|bD_A?],tkO!Kӳ̎!~߷͓>Yӱ'{ȍ_6s8|?ȇb"d$gVFdIٽת Gri'`V>vM/X,ƕ,4vnipc0K7HeȌf1T!SK$yvl?*r>./d&AD5WI2soHx*%.u%2NIPcM{OĨ6=%Ec3sъ aE#5,$6(Đ4BE48R×obI+W5)Fu-J̟~<+^óJԘpk#Ht\*qeTʨĕQ+[%VZ(2)J40GR:㩤(iӤ8-sg9IK#B56\ݢx z}V_+Gş>:M!]DZeγMa8ʃ_gY5t^bX:^8#Y ᥽wYqG *ʼ6x>x YvlOrқZ#wB A͈}ϻzeVlU?ޞvvcЪYmL7ȲHX)z?Z`I22VD*ue;/eH%*hPPAk"},ѐG+Yņ^:KWza * u`t6Kt䧥IӖ,o?|HKs>{냮WyV(AJ0+ Rdyf[kk.`B?9/Ibpn 1gJbcJmYJeEbN\̖~p}*N/ίiš }һOEvoXQo+OSqPڭ%6}{;/L׫ɿ;M/D/] yv;^.0m`ǂJ Y7C4j4[il6A^#Z \b^!,aFmk"rT6B5Ң?k=!U1gPLk55m>2[OFK)vWB!>pua8cE>nݻ x~3ZæRJ`v #EDA/t~y ?e*9/,M$"dǑ#S@=F,@.JDC?:%i>N>ԋR5)1/oersU)(-R ܉ 11 )t6.7~JJ) B F1>=!Ęu)ɜЎ5Hy/l!cXE3QnZEܞrpU"\p/| %Fňy4M9ȁ4 1Z<8y6#2.GlQ|_ݰ&2=TrƘ"Fי@M/b=bK7L0n }6d+UjRs̠7m(D8lz)6ɱmJ^M2lХ)`$}dɬ=b;h"2kdhJqp{[>\@lO=~D)+etk3>17SB"#x~[%\̌+b|$[qMk1<K#qBSQ$]qjh,HHM뇮gTQX+\xprEJ:8):v+/YМ yF+EJ{O_ xcK.8(ȸ Ls@mU Vs9wR cz(2-"I$ }Z@GZ[qyADne{̳qI|w)S9M>Gw1m,&qR& (V$ ,)Iq%Nu- Z~l%UdWw.R95u_q 4c2ڱ`k(@rk v 8w{FsoŶjF@UNIfXYqB){HNC K 4)Me7FAR$h %:)1[qyAɴR2L3uxx%/Ip+ev}GkK>$+02Giw<;A[clO}jC` o"VdNq/ĿqFzE8;.#3_ޑ9.VI%Cf(T4O'%ҊH4E%0<)|0/44|-4M_2S:YJsJI.f9Q\"'GRp"zY8ݳ 4(R% ),2Lف$M!JIp'Ғ:V1zk h^3[w!$6.)}(ZLU !=* VStԉݷh?{Wƍ C8P}QDܒ}鉶57D%Y$Y$@frS*ow(a|봆_FhG#Twؚl=$BMm^` 2)1T1ǶiG'edQSߗcXĤ {.,9e*Tu4Og?ϰUq(`%U8G.ld{I8FieHڔ jWյ̯rڔRpY3yg۴m~EqJS5u|:Z'700o\Gߋ#3? f{8wEUOKAOd ܈<}0lϋ8T"PbeOzJsI1ď :ɿ`=h]KMRV"sRX*(ڍ[dvnc~mc/Xl68I'EYs.inp؋ưC KzC7I{ ߗ\رn&T|"e$y" ޵3igVK=ԡ/ C xS7ikIt!OykE VŠ8?,NPPEߜ0$acĻѮ۠u3Ɓ_&<2"R{, Q#ԏI f Q[0{T }҃$*$FIDa(-ah(%íĽaHA,nMY 3њ~Xc!4x9Nf&kqʸ&鎴9%b,(O'(e*,5B29 8wݍXxg[[m"4n$,kJlOgVfW5魾\Q~[IکRM$׊N=\(EC,oM~ZSm:b lC&46ͬ{n5he!6Hje X(v:w(De B ĉ1`=E"=v6hͼ;B"ť8R00;_#I|{}hLEF2J^,(#VV8BA=רX{~{7B*z"yr+G2LߚrQ ?7ݪjrstSh^. 
scg zR^r5|[qnPDs#x;{WҼ5MoZsScZXP4g<ӽE g2[yi'4"*qQFٻGǿ6̛$ÀE4Ev"#RB7i#,[ zKsu14x ,Զ$oX2S8:g(Wpqʱ RJcT56?_&ș8tXz[Dkn>^O んD#.[K AP5yjz0 SͼC ;lyXXgΓ Qhs0yz^N+K>dh ]'hBJѦze; `$:yQonBVl"p>s| <Ol G?QL% SEAOzJ)"нď :ɿ`=h]KM9)Gqx:.vN2揽 ϋٮm'pV O=?g}x?y^<7"G`?MI[Pd0ݧtĀzoftsB }Zl܂칠{tFIX=akK'|ߢ+᣼E$t7Sj?j7n}(4kz-'{ĜS_s%M9A"&rTXHg(8ݧ.#l2MypoV;g 㫮%NMEuƊ2źhl2)wQ; ^gMsRoqioMx ,JC:L_ 쟨Dgf&/|n4?Nu<:JGE rK2`dIS Xdu`zGGs hxJM,Vk'$>Og<`/WD,w+V;[^ xUxS(.:T쁋z~[DTĵjgj0aB&R"DDS:#B$UQ2 NP4+ŎCMqKJЭJ fݪ8͂C39=K`=FX! $RDp})F2)I0*xmom1nmN(L"bԙ"-T*E q#1=S(aʌ24T"1hJ)`d;qWᆠb!~#hށ18DK *i*!Q`OlÄj44"!CB^D<}*l³r`F'85SY׉̍ $Bx>FbR6⢠0 jWM[+H}{zZ[PTٴX )(E]#DH DzD*,KըoX!@=aJ0Y N=p`IxDT ') HX&i@(66Ak%d(SF AZmy*'WJbluwc|9P8k(wV[ڼkAm;2$)P{(xA26~mXX U"foj2FK~,uf0-–=xL4KJRAFZa1WpT a 9!/Cr _s:4~HHJ02,Xt@b8-0;2/8h(l#r&7D[;m;B1 kQ/I sZ#mX80[uZ2 5}/[,؉Վymvb#\1epQqbc>^)Y1PE$WZJ(5%4mk&uKF2BJCQ$NDP*q'M$T "fyl3сx_=^LwoRQdSH2(!BY%B<*_BTO`(^Pl ^s\P|ʎ6Eˁh8~No`nŰu|kӸp>"4EAątWM 0x%C>|q+$!,} |zd5Ǟ USMIҴ8cYfuUW^3)f<<-rق jS2wۄl;#qORs[[#CCbj ]H1_N;`R0OQI Z5#maMmNi21 VOI`h/4Ik\FV)ea('2눼o/M \>X έ vl2cs DK82D*S?HFTZɮBk!б3"vDς{9J(R˶jNzfDI{d4`?1}Ur3LBy^ M85KQvjr cARRB|yP0hj>[KE*1cY9lA Z3&.}Jkh)L͚M6i+ob}|u.*C^m:'h:i>v)rSe!YxOɼ$s }qwjZ*M.x-@שM c]Hԡg/VZ*mWCZw볚*h釫 9RCQUdhۨ {ݺӅ-.IX9  T R .B&(4xb_Cc/ ͓WcrREAy6OpԶ9ڱ̴Xu6.H2| Ia2&TrT 4dmV'+*IhLSE$FRh0?r!HʟOrŠ o'v{z+6*̇q6Ls +w``|N:I~AZh9[ C55D\FvƔaAkh\^~ªԢ.[qEO4p4XF 2q %m'' ^l6+K͊f6Viǂ!EH۠~;,54şlZ2[fLR02pI ;ѺQAU.ӇDKĂQQC{2Đ6}F6C]NY-2pHYYn15D;4݊UI!Sz aUwQKx07e"O7KUiձhgY'cԐ};ְj\Ccs?~=v\*]CيiFDϏO ztnsj%A@Z08(?|E꣐r4 C \21^̧ ^[,pgRnXkCT1K'5'VrgY#Fx@oE~29M54ofaC~*/B,EqC:._q9M2<ŷq18gnkCkzVPq2r1:c/iVZ]d_{E;BXTFʟ?˖)ITZ`}ŅUF @2AlS.tb;6x{V "[׊*@𐃠 [K 򪎩Y%r'`z4zu "OzNH3f4[ GN@&\V(zn0='1|u\_C*+1=gC|VCi\nUA6`jK&m,ݷ>#"΄ԠpS8O%OCƫ4y(4rq JװUiKt+!!^_},4!-:S 2]%s,gtOкAUEA +(逹'{eڰ2Oۏr}׆i9KzLa|opo 8j=x$3'hXpdfYy&11zxN4>LY?doy<l~U$8{/.sO^wlp82x_7{٧ěžEmv)]PTåb{[8[ޜ|7?א3oׁ>ܖO-73>W; }GglGbaFD6[PyUةG++VZQ~ϾB½/0 ے4։2 }VB`jnUؼ.,V|hy()4(;,{K>.^ߙa_/XcU]E j-^ z3vt4cW[JU 1'Nm{ :M,a\ޥZf.ƻ:4ޔ\y``ժ!T ώVʳSdspÚlzDb3 
Rn:ǧʻF$[F#]To.lEܦX视vI9*[R]ЕeUzVue]Q0M|9 q_ä]_.y.6갆JPKݧ< }`ևጋh:i=ո(rR\21EVqݘ"bd_ Zw )WM0:˂Ɩae>]ڗK?Y,Չz1Mn면 JFϏ Z[Ѩr U+}Z[EߢjE +gQJPH̦ 8 )E=;Q4i~[ϙܲ d2J{6QlfbCv;n=C^8el&ǏsVin-fu\,+zSp!{ KeKzUf)+v}9qjoE'R->Y'أyV'׺D-7h '='`3 ];(P9PyWܪCi khU$F;"9l#>o~n[Mzřpcfn` qT4d8kh䓇#֊HH /ЩdXN{  5okI|dB$VIWV5\N^/g,4չAK+]2逫:Ak7_TA  wYL!q.afe%8T ?Z`Dž 1"-Hڐ v<CrYQaWql$kh8*U(^PL.g𵿢lҢWi7>In: q+ȨtH퇪\i.g"UFXPb~G"5hc^T3#4챁ekh89I ˀ&xP~{|Y6h#_9 Eޤ>4F'fqVNؐ;%:QPjţF"Wϝ[j$gJY[K[_;J.[pHǥM#~vWN9l:M扩z2f+x0ۏr.|׆7i^ΆE^jo-@Sg[f̨^8>G>ێ _ُ?nBRr/}9&yL~Z;|FL+|LѨ{ǷO,jq!QG ky|Yfʭ^SYg~=|[w^uxǶ#e{MaO"6KV@[(2+ǒtMqdyKТAcB;P&Or2'Qir&8c1+0sR.h0160Ăzg۬֘ڌOoShXcsTy`L_ߍBRu/tvP*T/AqR#6p. LZdkU4:YyLbr6 뙛R77'>1ձ~dA<&dE>t\!qmN%W@uok/s-L!l{[-)nIiךqlތ*ͱ@X)A3Qm ;ݢREnGwojuװonXXi8५[n*Zi=%(}hn H\=tts`ybaJ-nUZSzyܒ\9 /o#Iק~PE/Ң{iq, `ߙ4hj#m0ו_|TJA LB6Mu"D@Ò0}d\8d}t+cVHn"ʴH1b+97$el+5вm%7-{e _%4q6@/J5AitaG(}Oiu |ד DQoam-"* :QuA:? E@z}j8DUjEL;-?G0??؄b\A6sҝvDt92M2JD*"EYg%%б3"vITӽŀR爲䈺AX~CpVXk12,^I[V@;U)J7+ $5oH<.2D/jE*Ai1y3B9IHjfx( _^xݺGЏ-͸Fƹ[eUsv9N)jE1ǗW.gӷe=UP5FZl@NӝFwz״VaXq$E|tGwRaѝʝƘ ɝy5yBjpGwztS}w.zQ,r!,6ͩxx|iIAuo~_Mv bEg=m#:\m khT6yhZ#`3(NFBBG:zDk ( VW8zcm /+6Rہ+ %XpBi #^;p"eO;F껿Aɕ_Ө%Ψ,xwNi{pzIJ㛑;bqEo C=eJb&/T@ I2Lp_a5[P!\L>MM lUaQ-x9=l;`>%-QFt =%JXKȁԍq0s.TGNIp9zG^L8gȿsX_G`T !ty4rGG@jIcw47><>GyMѦs+b}U*Fhsv c*t/zN+>tP{~3 1Ϥ2Ӭ x0Mرa=$0}0 ƈo15#bpMM0R(V`uc]GismG Q3pDyݔb3A>C)JNfKv_,IIŧo#"6 zoJo0n ̄ѵIrk=OEMd1Wdd߼tPG闻OY\ޚ6yn@geȝA1}ƷFAoӷ@I2N6,qdxcVqM'p{Vze%YG-SS46 )w} KJ}E2RIZ*8ٻGn$W˰G0H60 0;02̺TT]ݽAJJI)T]4 j)Hq3noJCKo2kj;7]K&7DtjCx^drM;9]abuq`JezE7!|Vf'=e^zუ&t*&':'u|o>hoV{GT[ЎK)Zu}pT}J>V:>L60puMl4ZT/wVmTh)$ۧC ()ގ _ zPq=ju m;vp; {.lY jäؗdF]_Xs^Vw> id?ِo8Ipn[^b~Ğ~NKT:eX! 
@V+9.hc+T}"57\<2%qJPJ@o8_%:YQ/j#E;1R[o%Ph$rf)C\nf˹1D9Pتhy{]|@+kiy"&T(@wll7 uwnQոY㧶}=9퐺!_J2fyLgw ɏk >~w0=evQ,Rܺl sTy|t §:U1_!`n%0)È(:KIZ$jM|O2EETơV-ǹ%zuXj-elIU/Wtq,JU @IEhKEgy4<6 & itۤ!,.hjO3~BW7|pBוfN+6-;=[OW 8Hi%Մ`"I`ܞCT_*w񜕴+౨*[~$:Vwh'VL[<.vquu%mGZ#n&ߤr5BgkǁÕ;?l/ܗՏԀBN8~X/lMe0Ű64I{|GdQ9/64X )ru!4ڴLbwA#ϜE[ԛs{"'6jП)\ю%kyZSP,ECuhf%j!-ktc P8ru/18,h΋F+Swѡ>qe=V:1a{CMO2Ex+()-fr8yq/"䮆f7i0L`&8H^iyKl.V0s:qH[i=$RRGJ3LH{,V*cpc]^%ν;,\@֠p%3vnZBBo":íOo1í7 ыLpK+6%WO2tpkmP:9lERƈVqi_qg-8._F}+{^[;l$b[a]B2ZsAb>3 " 2jo*#îwhg]H;k6۶ES.î v> }g ]@Byb/vf)HԾa f=l>F*% X\ߐ2io Ƀ;!6 hnUyRd,KQ8'?)a#Rٷ O1҂rAoY6P+6i&d6Ax*C4zgsF^jT"Mo.[unVC6e6;qj!?%w4*#ZI5۱j7j(oERnf;NuV$-6cðD5f MTOcm΀ESrl&HfDŽmkyn#D}ۆLgY=`^Օy:Vϒ/Lip$J+GQ6+ ,ŦYK>lPR }9~ie.C{ id[G4xB{x.A>zKU~<MishF o{?3j(j%rg۟D9zvWןW\'Р>Ǵe?'qe9tZ@aG0&VޘA:]~ʁC^ 򖹛V-it+Hic=PG _~+rƪό@< `9u5ѩhl-.1; sjk~3J3KdCTvh>'979+[3 /7}lph\UJl g66eYd쮡Rׂvjܚ r`Μy63Xwd`J:KÝ.PrϜyxl,C5,L?F$y(!\A%4к`T2B㙁]^#e5c;v+7ɜ쵆.,\7R'6ߋgoO|q1)C53&~[;S]/:co!=9S1vu\cm 1UqnyFsL5Tw>{Uړ[{38nnZϜf|uiVXfdΚ[ mxI:NU=#ZH i$F{Tp53tBA&zX l7˓CWZ;ʳ;[ v|N$u sH yH^8^qkؚ1I2϶X_x,bDс;ok6t 4_X:I"ځW@<~PCkh.L)BCFJaE[[R+։MaJ9ƒc, 8cy1bcX$XXMs1cٮ9ԧ?ۀ pf XB`[*X4Usj'%hڝAn6HժV$6I5 Ì1 84X↱~GkkY^QW0U,T5+X%Ǫ a-4Z#zyK.u3[%eۢuhھ^ /jum/A.<\]" 8Mpl Ea<}'KhM  Yۺ鶧nbC q䫅ԭ(q&]KG\J-4>xs,Gt Vk:?QK@VN.[Y;%<Үߌi؂ K^UZѽ!A.kObI7ʪB1^*mo,4o 5)„&,RmQ6RiNGmḿj^× ]\2f ;qn*Nj%K+}M#+Ҍ0c(=t%BcǜU.OTkHpY@-!X$:{Tw'MGBIqoIB$am5:f/s{!6b9d1Es /\(sVN٥2zalw AZ2#Eɔၴ}] q~:7yxanf@=^xV A㎐fbwG3t%zS3Õg3_9XʙM_}Nc0hP|/e}y,H!:h='}8H%W.с%Mn}eݩ2յ+ؤ:#UѫRf`"J-DJ9FQ {#*d%b݌B%Jj^. Shni*;jk'ƌb?FXt)"wGԟo_.\ߨL?.~d=mW[r?c蟱\X.g\tuɱ9^ *(וQuQB NH!Q*:=Wq7W_Pè>AQyZ:uZLr=Q5F/͙?$x>=;sʷwmmH~1/E s 4DMJd}/m]nwd[LؖȺW!/rl,1TD%T8q4ɍVj<\%j1zWA`UZ*#. 2\V6jcm6QJu SZ! 
XtoZ.sC]ɜ TdͭȪ9`8]׾d7c4P3W+$SKӷo:teFV N.BnƟ_tt$DTbX8a1 10oz /Hϧ!گuJS(5zу$ )Gb L#^..7c7v1So .'7[fO$=c =( 5WSHp^([TT>΀Mc'':d39@ڮr!zż4iu:eSu2cXggI*h?1T[BW:Ve)lGyHnjFfzڥ*Y&9`f BtPs=݌q96 L=:WG#Gc!{#ދdjXMyD&i5y7/=ć<3:!wdUL_g1-;Y>Y+x_,o.8ZCn7a&iϠJPɡb=IWtO|>m|d3Έ<.H|Zan2;2M 7d&7d&^fb'ZCr#,Z/ULx[&+Ʈwa GG1DdU"kꪊRccߜr,$G /fnl6n:  *z'I R>w ޞ3SkP=v7,yѤB6i`@reo :PB>ahHW/wWKE<Ѕ*BT1ͅ0g暥JzC1V^~?E-=Ypm!.Oh%OFY ,}$m@Hռf`2A=V~gN٭D|=9=c$\\6SZV uR`wT^2'UŧѤ䥌Kʕd>HPGv4ݖר1kџ|z!{ };O Q>u( ) C6Diʒ&z_ʛ1h67˟+b>Ls~C?GEiqgQ3wi=ͱ9ZJ- IW(L&A*m#sW˶p9Ujr00`Y䗩o6v~j~Ĕ;|g"-O@[щ= Xe*`je!j6?5)c7!8l~m7Ea vz ;Txg']nf+es|G]P n+q}Hn#Myٴcx߂jD´,Qyɘ??3~.k 6?RnE! 1oMd.d GF*6zxԳzuke-nEhԵm-c<ˍ7Lں wMSK8Ejʳ_YJ;t.-'[ۺ=F~Z;[QhoV(4'; m0BE(wly XG{fv)Y%v>jܥsVEw>#c i]}fRd Z a..hdV7mL|fы䎿جѼR 5C-K"IT#{+cL0$ggMl6m:n;~4;lf8FpoA:!FwA\(m^rIR+֒+ȱouϱ?\1Zc;h[8880:6x3ߢ5Y C$k8|R@i(X #DŅP`,YnFZ'6!zj{y21VBEt:m6C<zM<=܋e* qgy+#X4{ag x=!IU]@i+Hё@\ֽԫV}U4n$dɮȆj,`Q߷Uds[;(2XHWޖ8O,!Pcѩx7b..qb,Š8euEpG,nѠrң cWR`~ZK^PIIvz~&Q >㋸6z2Z9 n4*@D 7[=\2PRC989,m 3ȁ|zى0 \I&;gY5p3 v堤zG>T9Y! * pmF%`ܰv.ٚnGIaUU6b*`ƍPɀ6X@*5q^@d/p^ eWc[/ƭW&D.޻n_`mH:deGF"H!>x% ,ot6ο2%?~տ_lb8F.coN}fcEjNi`tg28E^K58fUV9SU>µP{$mCPgg`|hƌhG¡]DZjt 6 b/ZWT-bDmbj6 TcN]#AgTp 9%v.cA [ni&0ϕąJ&\I\ , p7N4iҟt7-,@"ņOK Bsਞ9$؛ o•MF2Gv⦑R<'0ix>CQ Nj@PPlRc*ȩci Z{K__PThSppBe4dJ\6]/oq"a]QJ XO1dZ2u-}]77׆ZJtb͕@;-JZeI|Y'ZzY5Z7_Y"P8]~Z@  Np5ZR%e5q GrWt~VN ђ':˛(G2t{m S;Ɲ?grt^Hl6t+[G5Lad! |u{\ܧ{'Y( )*TZ,>5ƚAc-J*}&ޖFRbdX Nts1^[ (H"WwILsvvlJxI J*gWKSf#zg2jjaE"MC=]}ـ I֘".y>zmnl/>]ur%@y2qHGъu&e2m 0!舦$]ԂN|СO$XdNÀg' |l. SŬFteCfot)f 40{4B_?]ё#5̿Ζ/Xg 8}F?prD5\')#3ϱ[No4]DiW{NB] &O -yP/77FmQeT7GZzvc~di,R -2K[k)8@Yf1jl^<8|Wgv t2򓍕hl{{*l`=^Њ$zri 90J^{ʌ7We*v!^oFt!cK'""}=wWn]iVN gK%[u7fVf1H^AbU dIyJhE1HAG8r B:J kp KX hm3 X qia%)X Rln}S,m)!Y *;[I:Nb)Ka;sd\*RT;"U6+kN`|5_nRTmo )TuCRV bUqj'Z>><*]ONdaB!!ӹI; zi cNcQH߄%2UN|CR9:`uM -#Oh5u"#o4bs. 
:tvtw?ɣ1,"70"_'Vl(AmcC`'_.t %c];.L7%qe[܌Eia94xpi@{ǕFAمay eI@ .~ֵ֑&=ڤyCV\F"8V<-C' \YEjɕ]phvҀBǎ7d'Vp>*ip}9;5yJ:i,pXpXR>^<$(Lv>)RhY\MmȡMtx{Nnrܖ9_E2ʞ:\WJLc$$eDpKB'JRVjǣY` 75j_irH^FDT1EnsFXao&pdDk\@{]s 5^f;rIIl [8`vJ7.c{Jd^+e闂R"Q[+A~9H(J׾wmZIdG9N$dbZͱ\VVL0f x1{fԂRŒK>EgFAZ)!E~I /U9 $ڳY@$MDȪ0G^HVY$Dcj\4(tGO8"Ht# +k1dtD<)!PH 2ڕaɂS^lO$Y`=\̒!#PZơ"^Q +"T010Wk S}h4 jk*E I3ʱ1e^I r! Hp R1;;?ua҆PNNj\(]Yh<"cPƕIM&n5)) }Jf[ f(34",KIaR^$ k}PŲ`a/L-,\X2hk"#7^؇Z^x-Pk8;9P̫ %2¹` fYtehmZg3xvȶVbxs[q,@1񢎂׬lc]N+jVPe4jp$uEEVmi|L :!Qh*l30Ԇr  l/a 7!5&S+'(;R7Q7oXlCE<%1zBɢbA @±0QM{F& Xj/U[KهUdQ,.6ឆnlvo _߽ٮkS07g*,,xY>ƶу̭r wӗ}x՟\ʥ ,Sɲ+7>m?:^ĭ sd[]IgiK))&%*AYJFRAGbD;x:YuQn:qy6g{&.LٕxxOQlw>|6 /X#t$=ZEϿ3?=ǫ6 yCu]r32tlޗ&OAd[!l+JQe(]1?UoVsX}WoןY6<}B"ƛibCqbFbݠҰ FK ahESsuDCuJb$[SH ͆"J!RIQN-Jt/Z[5Q oe4}y pMӄEED4egfpTIHlxUʆU CcZ+ &_DK|85 !-U fmnmjv[~t~A/mM?bޡ: =P3ޢ _{jcc0 @%g3͹1K?UEioT,܌]2OK2Tx ч4s:Ԫ!4iegZW-Ъ4 W~?mLX [}'` C>My6eZEIqLvb0:LX-7KJV2NY/)Ry!+0n.Y,u4?.Db{U"v#Dž zL:ȋEo;lzdzYY߂bE"ٴ?iyd]1h]ûY#Bca ZL4 ZZMpE1U ݑzV&hu7,wK,#KÇ>*augAuBH#P:aՐ)f.˵8^Ei.uiޫ֎ n;o.;.Q~w$:?O>t,ELyed|tqQ-WgtdDd܉U^MDPh*%ƹ";P 8*D L#7MOPR[}_8j%9Q-rKdut1(a.pC=Ý2ʌz2|aaut-|Dx4r*l<" 84B}:hՁ -b:dk*,J c>)"@.:!]^"ڀ^݁/m.י{W1P'4NyOo]F>3X_7yL+[4;`6 {ο EK V"y~5*[9B ^kK~u j[.ʙO`&FI۶eE&, \&ye$e4Z,!״TNU$R2}_˖[*4,Yse[sܳy肎64Iy*1$G7WfK-P3[(1f-:ݬߞuѧ!*raxrr !Y}e! 

3d T!)8Rq@e?OJ$f)"Kq$9B8ryzz=z0岭v_ƪb>֋-;ud& U@d SX^Y=f^z{]rۻ;oq=21;ۗcH&i HY a #0rSw=񢻶!Xbbkw (by"" U,pJ`h~rXaKu46EW==&yrV%-R<6ZshQ~a!*3ivEW**'2ST\)-u%rq>i6^6j:Z{bD!cqoJ62JKsTܘҤf4O>~X].UQaD dNV bnXAyM0'z\qTS|:wݳ}Ϸ_5d(*re,ݛ~/=ŗ"DQдϐFRDBU\dYZni5ij{)ſS`'ӇFT*I&va4m UQWݗ`&!:T3a HJ̄sl;e<4Ibd%( eBB4Sa0׊Z)ɖNJ%\"g`ZM>/J*,Av؞!lClaoIqG%v2 j9Fx?Mȃ#c9 a t dPn"Xz:1:vҚ}"`ߕ3\CCvؕ-QR=6u.è$`% @\uUQ7zXYP4TetV κJWS6 xGGP6zEAqX'@12V!vtnn" H's`W\C8 _Z!q.1ފ,tTۑ&&”k9kH#kiCxHӶ0EzaT kH HG7*ٛM'rM{C'$mNJ75\qE|9˂ ! փ"s \ PZZ"Kf-%/|:\ (~QTxVa-&3IȚ/|hMpVm\ I@m$,A_iQ (m@kD:W7u<2Ch P`E"Rτ8àb7b0&:-!v'vGjFܰ#^k{%2EzE}=~Bu: =H{1@'<}灋gzřwSyFTN1Fbkeh0wo` e:0WH66-`i&Ѳ!`ofOڌt ܷ#-"Xh >#Lu~96WZ®څD QLµXWٓ\H݋i֙ ?(xb̤ CMo i R["K!Nx[<#޼7{a/_*~wQhg.)Wm,GVxs\W=\s!e۸"E|? If#I_VIv$%x$EIӠfws!Ǵ ԥ!.St-¢ktBnZUד,R-=;F1s9ǎ;X QTXL>\iAkZxzx~H";o!ժQ\q @^<ҋ`Ф 2pu^mb{z=Fuo֩PP#xJ,1S9%_[9ScL ͝2\gVrPsz~a.BN@ÓD Uodz~yԙG-|^?mwڋZkYe1UkvA'8R䩙x4pߔ$с0A')AqYV}vpwN]}|rlWɡ|{-z1a*ۃˢ@ UE{pUrLTILw|ӽ)/&ƚu`dv R{!ټtI"ų.߂XO&+472@sDF6;^z@l ]2+ JwAN-GDJ)!Op({ye%FTh)2$3Ԉݴl0ZFL7] ϡ~3vdEi'cW6Muzb.Ď?S >"0EviT} $b8Muemr.f0+J]L&&[n &MY>*lGOFPA 'L{oOM;\fe}(aN(֜;:$2]R202$8ό^P ]Z|ee0̌P6˧g4DSN+A4Q(ՏIGhp fIr Y~Q2UGM?Aۥl⍫GφpLR{ן< .Bg=n}/g=={R %U:x{B}'Kx*eOOS}Y0$;ਨ%URP&*0BeA._+~a2KsggfL v J|*ef6K§ŽZ^xYz_'E?~U| :]d`QōXNiPLP.+bSg6GaP>vH'yWc%eg^eXT²VUKٟ6WLN@QɪQn*r~>z7P3x݆nLҿyE?Wo?,J7$XVއ˓8zK 硳X:_GP0 VsI燕WOبB@t~^]=$=VwŋЖ30 Dًqʹl7F hxgs ^9^ˢM(Q)7i(sH(B@@>LuFK8t9XP*lW4b0;z}M/|<?&7゙8;e݊/aƞ_y_^ =Ski>&Ry/ף|1iE]_ž+2K|,go&$ϙu6w~p]N?2eA1Dfsk2FfCJM+vZ0` qzmc&FÆ)AL ^i O? 
]9jMp57nGRE¾[0o %"1I}CD=H@[8SYtLeљʪT F]VoZ; {,|[D!oBs!}VLf{1oٳAR x=&G4d-^)ȿ{!#/$NKZb,^ԩޡw_ww;ںw${vmiNWA011gXWַk}}|73v6>8؆nN2CW9{- 5+}cp4 p1 (1QZKsIq9;$5,J_g n;co;6jj7@`0ZHPev{꽋j%%*ە?1|V[y8wŮ1$c쀨Wº eu-Cy j1.ӖRw@V" 9o#V261i7S kZ&1׹Ԝ\wsw8]L`IwhD88ȱØ@GVF S>'GɽiAzGӤ@ޓu\T>;Y>1jA.#WgY4"84g9jGר|mVm;/ɊҨhNSrP\_twā.u=qB%1 G;!(9DŽ"31[4i^oPݴFZi /쾅jKaTjunwk'`RzU֑` )eP7LJR8ݤ./ :4Jzĝ:`jU@rmZP۴w_9{PY$VM#"HCܨ^7)KYROUszL1 ba ^lTC4ᠭԓ˙db& ށnXѳFb=aFOi d f-j[ +<8BY#d8Bn'n؃cg/{=҆j8ճKUÌF}UrP&ƶ*:)TlC bPJbPZubP-%öJxmu݁:*d˨.s=`Xx$1Ė_0VGH0z`BRZ<߻9)AXn{dÇϞd2ex{5E6L1^3x_𲁁 >ɽtsj$Pw~NMkͿOh΍D1\#EPY$V@o~n'Oۭrov=,^gL_'`}u[y˸g1ƱUB8 c8W ks.&,'07.HhM1EEX>UuT|Z~YOsP\F}YB_SE/9bP `lytuQq9f"'a|8 ""C4[6~#͓v+Oynyc ۭ`ɱD2(kk |,1s(!ذ8E\G~;F]._&gL!5vw4:mPK!kdkj!,!lF|@S"@&ǠR>1 yL_r 0RW\`+mq1C@^{k I @VOx>ڜ-=r '`"_h2cߛ'l枈yJbAߞ̋M9ݨjݩl]OaCMk<#Ԗjb0Xqv`b:Sq)5C4}jNW("|2 ͂i9+~mb1jRc0]z&l99ibm;+ ҉W0A؛ðX,b߶cLpӜ}S PJF !֙)oUiט`{ܖR?!z}) e2! %aJ:؉ɨ hvL2k!u~^I/ iK*K*jIh%D;DWFi\BLXWb*th52zš,X:M99ߵ\+`frپ037 7IVD$qup2IW =ȝWZ5-̬ӹbh#G = 3P[(9VZ֠S 5JsBP3FXIZL܄JV'm?l46M_Mr M1:#rLƴ6H M\Rnѭ]47q\n.mB+LҶt| :-t.BFBfX`j8fY-ic*e|d$'}!L)ł"V&Iu a ̎brw9M%#9R1Eb:!y$I@ >} PpM[hdl`Nܐ ɓ@Byȧ1/6Vd{;g0f!%ƣcd~k') 3sg,x$g܀V59ܫ LqE 0:ImL 7)D}k','|I{rle r"7Nl,׸ N, T^M6=23$36g=d3B NYp $qP&"ZX$XFN( @oR$3⍲ 6zE[#T֓%<|ɖ` i.@@[*XTNiDspCT10Ieҩ ˁ L  s4] tI4Y]#X\j]QA IֻDbN49ҲKDlDJ4+ށ8b* eTt4sJ7R(Gh{̜ 5ZΔZ29Bnb,dEw>;\꺐{OV2C`nGۑ$5ͦʴ" MimЈT('22SdBkV*SWRJl*,G2@=BHK+t } oݢ4r$FQJ4fYm̩ xcrigoыObYyjnj5 n M;'@ԹW}Eb:|gv-f6?}jzZA+)C-&usz& .=qŷiRwȸu5вɹ5~ȓ1_>]_L.bnmKW&i~^'Tz";*75ܸgF(}~{c^ʯhiL/V5suy[FV?A^^ۘ >ys|u RsXwyCR4X|jT~qOyz-ۢHT6C\qLZ5ʄDrhpfaWieO݈@vyئ|&[j¡58xF1=Ծ/O֬t<\f3Zj‹SqOM-cXˊ!=j\ќĕjzy=!CnPѠmmǺlV}9~)z8O"^)Gof-}47"-%6?M[-' (훓R- :xzsL6}[Ba7L1ġD&51LޢkW<|X|q~.(?QwDήOO:i5G>Nb)4SH3=b~dE+XuiԒ|5 +6V`!+w ];bt!R媟*me_ي+ .l4Ѳl _n򕚑b_rka_g *̷>$Z٭~Ox}LkSm1;M^(q;ZjݺmLGucv&]{u:%܄%8~ E7B}Dp;eE'ҽA}K 2 j=עcMafG[0IuyKSO!^\; o"  7ܙ_ ]H oR-FhZnZeȟ EIs/w=V2O0ǡ 0*/ Ϋ~Ī@d'qo̬ʶ55o~T5k{Br?)7.~{9KLB=҈Ckq@\#5Td/umCZ뗠ì{܍OyX=IP=f$Q%PmGpn4bC`q @^c ecIq׮^mOu/zz3龜QI+ f宮9ԭAcp)> 
S;s6v䅆]3؆Y'Wc|@ܗ)J[&|AS.-sǿ$<=v^<u;7:v}s冩~([kVNjr+ 20{5:1az8(y-\|!'#P=_c:C{4vm7T*QԶۑyi%XLM{G2 J1Oe4)c}g7>svxު̟ 8 1oW- J&?4?>xag1ֿ2%8C NP<:`/IjhpԀLjZ1Z体V&24!bpG🷃)IyO4A|.'D$Y7< O?YKOYzZ=KΪB) R9M2D:!t!)ģp YXmi:\QjH tuif}i*1I2ɐgEj:\ PIrEգ)[[b/V7N~iF;ӻxr7w4a}`E.5*@~}RA6 O78?Us 9/C:֑1x~e<O *' }?zgZ,dt\e僷휇2L~0or&[4s^Ǧ;$?uX2qލY`rJz~xgpluy`һ=FJCW"(.*+DBb cp 1*!Y8mj:% w "2)9 mGm-h# N*^Pn2RuRR4# 6 &¶i\{Ba^+B_NnLru)ǵV]JET 4gU*ox!)iM\I̐v]fQE7OL=St 2 J2K~ç_KiH!b8\/lBOYZ9!%R`Aɠ="LE T2Y%AH ñǮ34Qr:|=qVm]*ԥ5!=Fd)yMWIF֩7B +jΞ$_&n9 hԶY+i`sm횆E"`PpĬ2b\;{ץP9ޟG+I.WgYo/͋w'oqf7YY6W, &`"ٵnx~Ά_9n؁cY,E#yS7bƻOp=pGCjޕq$Bb0[dyή3/<[m$[bF.^,fUlі-YȈHjEiX;0zvfݦM{'KjS: ؤrM^"]-ɷxz_)7wnjgu)g:Ob;/D &!8%mߌBz'i-qrҍ_}IäbM?&`8SzAI)++.FN G»I<9_{tlJbiOǧ`6>]|L߿{%{%z9O70{,?1>cb8-SbeW% ;Ӌt2,W n dtxFCDS!V+[ѻ6iV7ω)ns\y:Kvplj_߭( \ %U+:`V&L9Z:i]Č+0UᕎRWV0%#ݺ]hʽr[ ˚KS{J}~<yokN)ʎy ctL]B9L9L9L9g[,"11`4W,Rg, va6R%5pQ˰ J*VP:|LƗ#YygF^`e\4ѧ5&kr܄Nymv_@>L]'qz-xgF7Y ϚPv^;.eDpgF@oK6QJr~1)X睛}}l70hUb*8AUkh@AYmcsz/6vÔ=^:OT[joMR.W i{:ߘ;uvCY 7?b/^PBD8[Rh: R&势_ց4r_#9aZJg=~bI*d it'+ը&'G6 Z-ImhnF#^2saKcJmb!R̚l#9o)Iv}k]H9v_7W_5%cDޞdJ3 H0i:2ˏRecB }tyoF9X Ir0Nw:[yt,bX8 ~eZj5nu'wMvj[a,w|^}qBz<og#qv,rSN"O4qז3)YWȣsn av4ߙ3/dz}+f|X2>5E \V|U% p׵[ޟl|dDe~w.>rlAFThOƂj*!n]2>nCfshۗ!`'{v±m[Lq%GLPR}Н1ɹn*X%D#%qocbhG09H#=;4rZiEk!1L>> R0|$Ba%CU2HPܣ,|Ltnώ|S %@ S6Mɏ&mivwf7oTX&x8ixq!o5 cѱBpx-|͗|'M̙*D*Fx t2(FTT[X]^yMr}]VX()pƉԌu [~lc֚wmvd+[;YR'w3YvsҤ ͞2HNUR:И 4ؚFڨHss,7{4DIlIzf݃W>n9,:Y`·W$7hkFUΉhS> .*9uXn.`#ʀ0F% Bj'J-DE C(?X:Nid ` by z[6^ּb $HJ*м'QNTK#u1h \*#C .da#"&j8a3H~3@+R:*L%fK} GyAZY bwxiwx-AZ68 :h%Fшa5VS V_$ m@EiK[ fɵS4D~Rsx%\$)7L aJ |Ka=szf w 0ZqLS-JX:P tt(MPjN,zNwMb篾A<'uUmc;yG-k%k):|MG DvjZ_\˨>xaqGHK(^͛j(<$f܌EM 't9*ȸ|}yf*9wsOpTHi)G2/[Y8w}ImIJrydL[\Z|;'U:6HŇٙ#~9!|oŧ"p4N 4!\r5Τ>C@(<cM A'6($qTEC ~Pjwl eA-T(#3X)#̱M{t%£OVEò/#iV :)G'N- Pd Awt.t S#Ŏb8[EAP@g "J=w^!$['A6eZ맱f5LpvզIX0N]SII=, )pPH(r-Vge~rgVocw1X,vip1.Qq )-(RջۖheVҲVvJ!|U^:z\ ,`Tcsf9:7m `ǢG#UeLڀy50#2QrGe1N` BKK?lBqB1@r\8Vjm*F & UR/b8n#c8&LڏvF_Jm)8?FXz{W|^ rsǫ ChU3x2gǧ鑡 
W_OտskƓMxwt'7і$ZcSۃ?c%A`bVcKqR" v֬QX@d9唙>:Z iВ!Ddڲ Av.%gRc0*՚>\O=tnfR״ IʍfZSwj.FnVWy%<*{S1.ޯ̪JǘQ&>?G'OZ2ϔ }ZfC`T-+!,;,Dqe&=<:LjamIBq-]b`(r#BGaKY#c41=inFxF EtoH/_6gKT*iCP ՛#DAmo'\sq$T6^pR%L\364lykITDT%'NmךW$?T[X缣V z-|)դZA̮==/+f=[ ?#Z|b)i|WeiXt9+e:w#exsY^? %;g= ^/'4h<]DRnIq 8g?QӾV Wリѻn>fJ]0yJ-֑]iSmcT缣T=^T䒋 Ww$j sQj = ۈ{d$T2$x6GgOHn& k@::}w9J)8qt}DBw)$ZkKp&k*Sp<2흴48 .[9 wZ8hZvj(gW29IJ̩I JHo47F#uNS/2ZÂ$atHS wϮhr(Y%@!m4hf(JکϮ`rveu`XwqV]@ѓXwKD$4ilRk;WfD\Tce_/ iIf V5?2Ƣ*KA^xjͲ^kv|@s\O}u^~\Z7]ʥ aE=VopwgOdQL_es,1 tt|f/1]-^߹|$ -c1Ptݏ a*w =Ve{Wy7䬇ػ\pޙ+uؙpA:DHM<ͫZlvrų+' E(oֻ7Qu+ų+]ms$*s6aFT?=fmnF/ݽ:_Uޗ7_.`YҒj5/C༐rJHbFwBzm(է1qoٞw1mTqn ?b}VZt5F{K3.YQgNPq=mL8{ӽY/F[{xFy,-L׵;p`JS(s@"vEH#)1/甋ia_N3P%^cs;dh9eo߽;e9{r!v"qTd GEt{o&"4h]܌s=>)XTb0\ݛ@7%M0=+$\w~#[?сFfO@ɻRCQ%3;HɆ+5U@Q\ڊ(Q<ue"\[(.Ş~:1EqCNڡX#*nt5*whތՔ>Tn~\nNÀ)Wzo,= =8[²{GGlr U zcF,P?ONfP MX7zN([y )/R9ˍ`4[$Mg4%nN蝸$ 9_M0w8^St!} w7BcʷEKC̚,p)0+k^2cZLka1BV"fyV3O#/n:5g3 9%)$ W+Ő(DZBQG`:j4=eNnhw:&GJʞ}N0 X$_ߣOWIT.Ɔѝ=?=|/}aL0j.`L/1k˄䊞C@Y9_ U6|Ӈϣbyݕaav0 7xy4+c_ޖώS"D*:M 3ĕE ["FǵDžqY)V`m&R^(ΐʉko]xRTPb,bTȝ34UB%K/ș*i$Ggٲal$s5dߡvhFyp Z*g8"'$rEam!+'MVNw@3=Y):O՞w1a1sDrBU@jw1p]Fq\HmuIdsB8/K>A|C-N`uv-I`k&t[DǶUr R`pQ6bncArLp gu΃ϫxma(V•kah's̨@$;kG KB;gLC2f~:}(yM_t37 f#iuW F7h~7_~*KP}˾/Ty31x8A<J}BQhl%gUN s{Jů~a$wy|{9+_+Ni\J9bRDBlx,Vl &؉,h6&,8O\p /$GS0GwT8}<~';AhʄR3Vyq!kʔ~GPZ5NFLǥx>6>Uw|~x,5 ntX"{BdIEl0/f;ZbyD%F_ezF`p<Ӗ!3ra, av&BVJhDG)QhČn{DV/ U?c󎯈2Y92`fӧWf,IHZѢؚtD͢寉!g#z-7=Wg"Xg P.B3f8=Ǹ#鈟?ڟr^ӸШC..z Ru1Ihv8HpY#8kT7_hX%1B;\vuMGeEÞJZͽN:L6ʟi3Gp(H)m"#3f ϋ S.%Νtي"H sG=w23F˕u %R￰A]Xp~v27F be<ێIS*ENc!5תۼz =ʮ"9]H9] gYLRa3}rJXfhi4񮩶|GD͊;MGZא1SN/(<܇T.\Dž֌ӎ4Y+s#): [ʅKX|cEQnI!JPF{r};\+=Ƣ.gsjQӜ "CG9 +XmŋRK+w{ ͌oE~FXEc)xp1I 6(ʝV!gXS.mru.1;$%! +61mI!=|ce! 
`.0^E^8!ԯ y.$L{2 {QPt+" Uh%;Q!XoSInmpN]NcK0Db)br+RZ$FuNj2%PK׮#l K7i0Z.4FƔΥ0Z¹{bCzKL5bX";gx ͯ\mafC3JdδCMY%F%0z5"Vy7a C!In#J9LZ#9Qx܂Dp.@H^ m8nS-6z\EӃia8¦/|U+ox)Twϯ*,xB2"xcȞDDhVݫ?jG|IƏO1\8+ tS[]~aׇY1|_0l+ZB!5A?LckݥE: A4ո^*kYkP:J֠55Naw@q syԐ U)U|KVĪQP>dÚ6X!Bd~xZC8ݞC"1rrB\¾hr+xk υ&ǫD֋T6%~e pse$RBS.J5}St &$E4 )0tѭmT?Ré܁<ԍ )6]K)W<~QDʶ^EwCap (zmj Bm"AyI&6 ҵmo]S )}f[-0}V8ܡg}8 y79[w§Z;aQM jhGi3r'GȦ H(b&qőfpn7Vpx h]{}XRw0ɸZ O &Gcu ϳ'W)7Ft(Vʳ'{9u~sf< qz=uiGid>4WbգVT%^9@î6#Sgc;,+!+k&a~Wq"`CEu fHR FofAW $M{פsj$ᕃWhv4Ɲjx5B,##lQ2!V} V:[+߽ QP 'n$!~ZY@|'va cn揷cg&O]AO97ε7W`+^GסQfJ3+܉U2:5LzfsFhef4}Yz6SkT1 2֢yDJ-hoV$ٞwB[bڨ=ڒ^?im9"qz͞3AƵ͟5֊v9EfM 9vn狴Dxu}xzyg(-=ZֱW upKfݳSY'KD.bH @9;%9o )f˝KxҁwW&hi|яxvFONcv -Ffgzw3\ H77 R^(ΐʉ )Yyt Iea%"Fl nOI8^d>XjdKY)d@W'xsdTi]o#7WY,Oaq{ d?\_Q֖'l~E=,YlCDA2[fWbX5c{ )yjHqfo3)T?wXj/JzryLKŻ'_'4|ea:݌b= ^2DV-0f9.h!4b8QLɈVnM5FjR{0ޜ؄tĩO~r c Cp:jĭdtnv㉜QX}^&F Gj8$hHMZҞ@&N`\%& EO4"(hbte L.h_iZTbW9uZ MAth~Nнy-bjŢ<ԝJS%=7" p&}ZYҹMГ\(inh>'IZkƲ,DWb,a.]N@8s,ώ-Fs`Q4cۂm3_hI\ARpt9lU3|$*FHIl$ѽun;T|%_;R _ M7ٲiɣ.w>=R(>=ٚ{q44L.~n'w 2+|?]" \a(P`d^CJϊcÀ'gi[Fb5x*6 "~ (Ж`JNN_M.)!kO͸sR (!NF^wx.^E].P #櫅JBIKG Ng&'`fHtǶDchŠ:KVL#ataݗ7ţ]׀Ni_.E qGqqgK)UN""km3}.veU#ud"O½䈹5{ю1(;? ˼-9? -Į*IlMu`cd,Ezi QP^izgf2JZUJ FK}+-5ћ4h"zz;,(ݟ*ivf)ivfդdt_ʁ*弧$("1R<8kbe +&:H8-dr?oFJ|wpь=8 y|S hi0 ?\0  bz^ʰ!L 5B}pWXjC9@8j[ 0 ܰz0WUmq@9UpziP -pT.#b`Hjk:P#tEQC=6_h xe?Lf0ê7[PX[y>0㐑;(]`l8~ e j77`ր+C4jFKSIM3-W:E*Z+j%:NP=ZJ$awKzٺgS_8|A8"8-YTLDo.2e *rD'Q )ƦE$!+H-e× XܸVthr 5"A?- Co( QY"Kt#I/Uc24^&sph^Y'wRJVD3s1DpA=is 5܇̻wRJ97A~yWߤozne=Z @r޲wNe=ݸ]E@yb菟_Yy@9?_6oۇxsxlBJTO ;Z#A+iZj9e'( 3$кNڡ vÂgSj*&- qQvx0~}G>~O Tx L? ;͚pY>iU(#3ztl>o)3 •#O-V`4N_945]IJy [.묇OzYKX K6 ªKf=|,4Պg0@4gzB-D\J5kԆS-mdPiWAn pd#_F5[Lݟ ;<On^m'*Lpfl.(c;bÃ=]<ݻ#p=#n"k)! 
IùT\.Ɯj09?ˌWmxb@\.М6QKuZ8^ԅ7t5 .f7\A`~VͰcbmu8d儳xg#]pі8䫤_D0|I /|,pL" ς ֋ *Q$WoVoH&@V% `ҀR P xzy@>̧61of=10,W [ p.Xw7r ۛNShe:"Ƽ4_ o3*jQ+T*H$-˒rB[e &]Uv ?mrHZ߸i4mK#ra/'p,}ܭ/i$`*ŕW5H* 'bFY ]<HƷu@kWxꥧ 0iXI1v-SYi OOd4awqKTy1o!د=%҈gK?yB[}޻F[ |&)HvF8hhhVuN~g:-dt~FnpBH…BIQ]J-S\[H48f` Kg=8ԁ1>Qv  7A h 0إ edJɗo}4~=?^|=n>+H,~0,0ΓAU$=dݒ, 'g5՝ՙuK,lVF}wDV ;02TBXʤc Ȁo0+;|;g;#\4h'bS'M4@NdӒ]vI$hL##S2 G)Xu4a#1⣽3r xD^ Ƹ ٞg`=sK )ɔ>Z D^-NQ2&l ĄEpC$d!;OU̒qJ#(HQCyᢶMnJ}tDw<kZ~lxMZH"y"VM#õʚr7Nœ݅hFYksYo dY}gy)P7 YߕY^G֬d$U-WEU R+V_?*Tl */YHVō6hxgUV0F B[iVvEljtȨa^e ja ,:bqNG0~AKumw[fc^⹪ObNV5})qc4I1v16OlHbCȱŠl Z#S̠/]y{t-I})ǫkI:(g[Ĥۗy@kuFXj6*ޭO~.ZۓV &!o2UM&ը x73JqE(b( O/V=p)Rӏ^_WkXY619FȦZ /TN~Rx+ S5.ߟۏYry}urmeDl'亚vʳ;;g)D5$:61&?2:9f0;FI'|^USǕ 3K_y6ń$I='hO][+#/,/ʯUxy} 䠋?9 Mxf%؍q ƱRN#P&4iDV)md آ-K%5Lw0DuZm`m:j5^<|+/hJ=E7_uN֘qaآK)Sg2uz?^N"#0Qh:9c'Né?y:bKex>=Z^âjP~\^a4xv@x0'tQ1/nP~1>eF{@T܀o&n˳S3Z;%ܳ@Sg={jy#MXK ֳc n*jWGt=]ќ[a p9\fG!K&SpI$KDyNmwVcM,P#1>oWi&eibZs|MnL*]yObiZI+~:%K0$9B|]j[ RJ/"$Aq3s!^.ps p wV*4[i>dFK6rOg> R[*Ddh KFŗ(2b%G2`- DLq (W))zWn7j6[%mnq >ۋ~Y-2GNz#ӭ(.l{4y"pΗS,V`BQ拒 k<0gU7#ʛ̹5}D1yJqB"&1ɅLƔ8pbۥSQ aIpG9u]5+?%Fzyt%rɕD( _6\kF IVwX >*j]ܘw5Tkxa+vN) JstkT<OhijмAz vE/v|"EkM}m.v] ;֘cs=GdW-sDdsvdvd]%8t!=G~AadCM8nF=I9'cWBagOO݇Da:L1L_7;5|2\Iht^*&/z<PLA`SL%,3OiuLJkynr T'͉䶵GJH~,bdьNRJ/U9 ،?vkr+Cv!TvA.Td&rsi\ F~A"1MdN b:|~"FZaIdk%;^ޢ y@$V]Djyg{><ǔ.*X|a c t?u/bT6DUSv IqEƘ5;qlP\ |gsL-Xȣ5&)4ⵗWpKS\D~DՂ nL}+1tv{ҵiaz&[oEɋ7)SEx-[}3 HLj'mhB(#KZR2_.6ӊB#LMh3D|viACPLH+>ϗ6>!i, ȖZ8f%Ky/ *Y]0&ha&ryD5e̘Un(QeL+\Y9b;Nl}u"rΆBY^*Z(IF VW'>ס56<|T}^T+f`ԣ:6(ܦ7-M6D6JylvLL]`MG׶;V uΠ\.JQ^h`/H2F Zse }$ފh%G3R%q= sH<7sc{ʌs_dƕ>Sz̀~@[0.q؈hk8.ưak}T+mpɆˊ\a3w ̕e 2}W\  F{I;;^1g1(stX?=ZAONsxg *sUvEelUbX60}#"E{}ǤB!g8#BI>h|"@I#JGf3|$e'`.`)㸒\QGtL %E{ދJ#+ت+G(.l%^4,_^dwW芾5uWϹ "4wa &Xg/?b{L(xuv#c(ws4biXzl?TzS[F ǩ?a9@"ͣUj!^ RHZ^}8ٚ#O@ψ,<+ #â邕HWcJ@ϑ L'\k>p5;K3& E1$&=/>Sz-Ne$c{ŏNӀY>fV`Xttckw:2 LDo7Fmy3> h0c(j ch$GS/lf J4 Zs*}ExCaQG'|C/3gWfa1XNǀzQ8>b$|ɿ?TכܞE+b߮./\xa%luO jƝ2Ƥ<9'K'FO~]SV[,A' ;6OO mI] oy÷mmzljRsuqE2lFD+ix-u19{wZ;FVN6E aRA 
^2D4eLsO@<I=`J;g&d} |%\1 S_pIOn%1 {a ʔ \*lp`#g FD;Ngkk%^hDBI7u/QH|wggb\ ̦τ[Y1dɀ8̙Sd>H1;ƤJJ(_dK,Bc\&LAK](頥.~f MO8b_j l@ WZ[j k7?Kԓ("o 5ٜ/?_KPA]5wͻ7z|QE]ϿۛJ|8r%fm'͟æ8F(l;|gR"Pr+,lRN#+ 158 q¤-/P|8(@*t y(a`J'`GN$c5C"E(DQ@Ġ'&Pkt7 gJa\SW3ñ rFCOIs) ?‚u%#d n3?j%i\&0#%Q /,a4 k5LM2эrx8ms G zX+̪}JP .a|iցC یBsi*-ޥDF6 Mbj UZ8w&:`{H aӀH$=qjnYBz@%ˇV廉CJހZ0Du:Jq8^IҕaST8ʤGTE)h5s*!I)p~zf{Ey]TwQ[Lp9@ 24C*JV*|ZOoD6Q:I0:ssJ;f`qށI:p0rGG6Jl&.l-,,e-t[pn37~-qX_!V5.hLƃ2 p)!űG׃,,D}%@SQ_l y8t ʵSE uJj*!#([#`Ti )*D.'KW)Q0{4t:o аZM(&*P"TwmO_֔> mP76EOK,) fIiR$e)\>}fN%.'(ٲ46k)Y)rCQ j}r<|ĄpmDr ='5!,wzs`w {w[D?]&Yʽxݨt&[s~(Fo/.wX"tsrj{=U 8 =QʕYT{wsV^F |5t,A[B=;/(MHj:B!MfRvx_: l@Mcu4BWN R.Pv\ɄrE9g<v3 Zo8'_ \5MLdXI- Z5mwj_-ꂃ62`MRzǓ +J0#Y0I} @ڮlj:ޙJB3"fe`) h̞gXI}i#ahM4S߆j` L1funҊRTIJ1|>#HZb:0a4^lZwS|NuLܰmŬPblΕ2# +t.EBm8l(5W+OdJ@kT.:3daљ&l&&3 ֎v_ǴgnY;G'uџd> aos 8{a64tT Y8/۴ z=iL ѳ7D#"E2-;9![<݌; ɧַ);+]+,9BY80F# . V9t_;7U(5nPydW!өv>Uft_rO$#lAA" .Q.yJͧkT ;^pGұ0\Fc7Zw/E0flqSGHջQԾ뚐-7 cŀjƟr)qK LR n_@jo_j*耗0R,N̈́hLnׂw_9fٚ Zo]?L?EdTjhQ%"IԌe3ey\ oْ! aT&4'KkcƗUp2`yB=?{wbfClʶA'@UT{M3:@<˵-XQ/S)jWh|c?<|ɾ0nV!TԎjy7 pKU!8/Sd[ldC\4[J`(1%]-W5s}4 ~Jׂ_ sB*K|`fTЖhؓrg]b8%x+s~83az$yzk8{0E)=щ {1G)D+<5y \}!rxYމHY5QpBnZ6}%bpjjQ|ЁHrL, ]0.BN} RD71|C6[>CJ˄6lWG݋HG1i"r߷|9NjPEdZ'zpZ s*KQ7'be\wM [ f(-BDԓdq*R+Y*^!|j4 v3a-,ݛPj"WgT-{&P>sx٧2ig3VAoq?MI'KC&6{lޑ~ڌ_a!?0 vL%yaQ$[ uU: `ge;OOLGTqȻ40>_Fs0^((饱H-a?ps*t>Z Lv~npwzw&հy}o h;8 I=C/4A+}i(ivl3Fzsi&/ThϱǮ{c|\^Ï=n8t4ZsNϵ;{}GWͅw*;:7ˋ۟./o\^Ň+O> uvv׿ܚ;okt_vd;:? aL}LhGF|Sďuiz|yNb.ci ~M[3bG&e?.`v ϡ?&c{vG}[٣_ϿACkQ<,AfA(>w*Q`w-~WU"DezJ3Ȫi^a EIsF_DDmn+]TI&nBQ)4v'&}<<7<ꗰ7_?7VMv}Ҳ@syՒwvّ%F`x26%0u'߿ð9;Hw\CjL_x2~ K-p?wv+ cZ\zrPIw;Aנ:ҟ\1>s3?LvSd6|Ϩ pP}lmĻF~_3Qx {/ gAz_}__`+T:lE`NEY-K$菃Nlq]O׬kb7k r]#oq%ZB-)]ْ^KpA T;PPL[Z?2Ҧ:.4Wx`p?,ߜtq~_ހ[1яM;^__y}O߼8wsٯcm*u{suzͯ翟wӏ2ިJy~܏΁?VeѝSzl7B!Y sqR3[-E̲B)GH ~PdIQPЦ 9]dG첌(2, ZDjOK!+w) t=syM19Rȼk4݃oZ\:3΁s *Re(-XT昹ssp]8@./?p_BJI[!W|fm1ٖ<(&E: zSw;f,9>**FLB3n .whyޡu@)짋ءӆ2. 
O% (Oȼ沵^NnMRڰڞ_޲w/hofڶd1066S}k]KxpveCmHU~2yi#U?짼4)\,/ b,0fپqu3ziyj%V$cI9Z0s\볖J|PG\7*ԭP^x`Xis6[2 gf\e&(PVq4Q*]8vT\Ы@KQ^(?>uZ+ܶb޻5cY=to|Ҧ>{2[49,1{{| LFG qT6Tc=ۭ0tpFnV X( Qϗ)Z/SW6' %6 - ,WO|DC5DˆEZ(٥r-8fȡ8ޅA  Ǩ3No|H "JExoyQC"JEZdcDu;~cizf]ekz<uB!pzdgӕݘWpt{p;+*_wkQ&cȧh_= a ǤÝaTF $ߝ.Gd $GJG<{=Ei[ūp>eX4t8*4끎L׆704 ¼{fN~"{_q аWwxWp`e +&t.0 >L\pí^C&4:uꀇ -P5ĤK"a4v2inʤU`MIOa`xFp+tH@ i-H<]ͤm0GLـxI#JlFzrgZQnIX]ʕZ)Q 4uyeK;BH R8aVTrRHF6ª6qFmrQ#B2%%`cY@#?KČ/[G[Jz:_"uX~?yJkQh=V3xR)Bi6sB߽_楼\9#n%(-:߾=Z +Z ip/(A8~6qIFHiO(޿X3Sާ o~sd2n)Ʃ>Ubq}N<ʮc8M3voӂvHIbsMBw5^ڡKܜkS;tAmP\qn6C]LmeS)j}ũg*{mA .#:h9M̬5B(NF-A8XξznkT.I0TL-h~9lmt%"oo9s^׊6+9/bh!|j:7`[) }ҹ}_\S}Dq'ɭARugqtQNݗ 8ٮ[2V(Qz錢ŁkF5wl*H15nTGU/3 b` 9K5Y3` #.Yx-ڮ{̨]M"^ڕ = +Vw[)|3[qR mяEQ\% dtUr_A!8ޝ ̘T1Oe,gPk( Q=qmԖY7.8l#~ To_:Oh=!6f+6T?=Q 6W 쬾/W^|\ /2҈bgMN=˓ 8Cp~lєf8[v/?*̋'ɺ/>AT'M`(g)ѵ}8~s<@\#~p[v-g˽OK Ru9/740ޤT;En+2r*+8,X_IaC-S9"齉j":0$XQx O6/)=&Xݥ47+]U%ʏ$\)%V6њEMDU@MAMfbt)&T{wzҮ U,a"BD&jodWg0N^p|g/Ӱvt@u9 &<|6mJ?p+).%dzNJ?rh(=G_ϖ.ݍf{9q{1!VI"5Cm[5gN<M&<=2>-neʰ0V57790&aLQnr uՙ#f=<=vaF7Ѣ8|?_~= ZZ{;zsTG|| 3yUWn3ỳu<6M LiGR զ/߶{W3Vh>k!C hk+u C*fXwqEo&KNtKV(Ԛ“QT?DON{[ hƜΑ龜 V.N1!o1DbL-iIzx_U$9 (e(D$h- @Dm$Q@3J(]߆r/z=ʥP?OFp\mϊ;GӫX{&uh%םI1^;܃e5 c h={[ݵiLe%[.\).)p-eOW$W\WrOj@/Д"x\ FKTq5cѡN9%FA?Dv5zJz[CC%Ҁp47!@'$[,#*B/Be,QysԊ { -wNǠ'L0ȝV1N9(F9ŨM܀`:*c)ʲx: Pj '{e,DaQ%" f=bp24 .e/$֦Dy!'V8?\ U{=YbCDZqncJ폏rmWf67AYi8e Zr;CX1Rn ٓy9|~H빣6o=mmgK?7tN!Tb{xqE}q{ |7@P/V-J2hԏ'uj5";zrU|{3hlJ6FӣOoHAKel}]М!. 
7U/dy&m0#UT&aA2(;ړnl^RJD-A`5F)I4=T9q3'㚘(΋㜢K߆:E-lFr&!I+$lFnV{~T9QGU=Q}G Ж Vhp`}&GwEː$^}8 Z݀ "NX+CqV ձ-paDC E&:J1ya#)49ǂLrČ5cYcm͘`E 9ٔ`dF2/^jJk!v럑\BIsvIl13wwv FR㬨Ҕkb lW+il<»Jop]m)TgDత@s?\f\&_s}(_+Bz%$w|]vE'l;n0y|:w]AouOfzX9)$}C)N+e=  Zq 7M޵*x%:%@,9kۻߐ6HO)cҡM-Vrl"Xbp0I=q2*o:@0y =YG2,$АԣhWEY"NI&Ki.@ *Tzƴ7(V(M{aKST(ԋ(Q0_W4A nѥCkPnSgGFfC6xR4[YkWqv6x g#(Rr B ,{)r,E_ZLiS+r/RGٶهz Z+]*$[]$ :L>~{i*hSԱђUS@H]5|0!PN[h@=`jo pzWsM͍UJZ'\JHMHbJdnbvT 5Kֱiv -:`lN.sCxE Fj4ʅ ULvS=@!Rf!Jk"jdJ'Nӆ)JshUyh*4#]B]BV48 NArj@kMq $o>GjfB;k}ms&W9m8$k=5ť=QYՄKA?%%\My__][oF+B^fq(؇ 85<Ֆ#3#ɯj4fș氛lj臬W^K6Z&5v3lurTFkA6(Q(w=a[Zvݼj3quMyc*= 519,[65dMXx;ih5 6{gƦ*ϡ{GcTkYH"z}|၏6<ц>Y)*  bZ{2EfK^vvN%'4BL:z^^~|g-l {~{[x'$ro3+uy!)Fssi8Ԅ 2%+@?0'gvf&:n4 >YhK4\f^Z$,`'j/XBxwo;k~"rul~,Q-߯Uz}.(.%*Gtk$ջ ; r8Hr(%hIPDrIejSLqU$]4UR4*(cK8-cKՆP:}ϵd@+ZElE[֔ &/5#;`PcKcPvhN6'hsQDd2BksrY$EP@TH2>}Zģs[+!S)R_l`>ʌR.`[\攥ׅo} 9%^i'ԍ *{=Jw ׌a5g8M)DwcɌBy~7)!u|n 2%-^?Rl WT؛XSL+Իžǘ>\Z UItd a9JJnX@qAp/v?m|o+@a"v Ԇ3N6ً{\8A*DFo+D M>Җ8#T;^ĒPGu S4i{22Ud' w'2QTа8$}!y$&U"2cp!^Rf4qY&7*7)0Ǥ*hjZ4Sļρ9-$ rtr^n9vbV_ۍ]U9\^ %?G#fi7X$P5ў@zuXեR^0:G詚-p:FڈHIלZNW7BPݹ8\+]~^xX8CM0QrZ8r5N=ޓӚC@NN@5w3^.6$ow}Vҟ:0Q\ʹvYt#}{;11(OGMnGR\- N:ixwe|uٟgȄݯsܕGp//gW//>\Zl~Բb*~Kɴ ъWOўtz f2!gDU hO fӳCD! 
7 bA+r>|Y {Βs614@$9X6<:laC J5tQz n73\ˣV8\':Cp;dbC<~D wGI aOu 1F!))\{UɠQx"6AFl`  Z@9D~a^1@Q^Pv(d0wv-t zai >a  ) ųT&r^J~DTi.R<9pP{4;jjV9rassysPn=H@pGrF:Ja L)Iek7KY uY`yAx¤&I'yKʼn TOHU ] G#;߳rN&]fo Jbv'I 걦_wf"|q@p 9YƑ"9Tϻ_TH|^l/܅@$1~:ݶMKK-T ͩNW&Hvԋt( J//8ϢѯnNt ҖNt._6;0 c:-ߘ2Dz@0G7yv0\(=·_oדk&[-cd9,:6$ZK5 JK6ZR^DWUO\@@Og lI2xV2i%ZX o5PH"NZGS~>#d2ڭTT.՚9P<9w#M]aPĜWpT aBb8`鸊,20ƋZ^"NoFt13N:q-۷$Lt߽{HﮯwO~gnJ~ג 廎kjmwssc/UʡHYfzI1TCUw9,pBNo$3/,S&V)/bv; +]d!/>lU.p}efhV!Y h@+sޖ̣f@g!|oQbC^ٸkVZNoL19_{gFlZ,3q~c7&/-9p:[-يuDg27$HFLUPg}$ T0+8V53""mb]hj.8.80%aFÁ3[ynJ0΃cʞN6< $& g~ޛg?$${Jarx4[0Զ{GYWZfH&cY:O)8 h$t% 6ZJ-X JDFZ |Y R5GP_9R"Ei*L?"$0Vs [)RNOsO+(@d﬇y9Zw=O'_2YV,%TH2UD J**W e1,ҚqaӒS1kʲ2ZJC@ݾrk-xNuE%8)EAW2xŽU.`jƛ XpRF-t L@tu|u| #B2=?d֕*YE0!(/D}RИQ 4dǍbʊ?6%D)M x#4M Č椤ݫvVO8|k@@FevvA}oC!r^ZJX'\4c` "Ns xY $ 8L3 e!g鋳[Y84|TJb0bI9|u8$@ z ۩V?㢴̎uę.vF7|KQ0Qy~F#P*Cc@VK;ݙ~`( gx,lZr5q!/""2:5)z9mpD. a*ɡPT&Qm'h:q.N@Iwxwb5 E8Om=т?pkp=26AzuQEe *owz!Шi`IN:W;|JMc܁vōQѶwYE6Ν|F |o”K#H,s\'J\窂%:ơU+(>,C}=Fڍ*BEBJ gx1ZmZo9-5ЎW#wXtEJ_+iӂݹ;do9496w~C*(3׎|8΂B4O8.RJƥ·~]mi7~|9O 2\z?UGu$Uiq/6v4j?IO{RӞ'i/WYcҧ=Hk}"~b*T`F^xj"L #{a&o>f6μ/8L<?f{fMƩ.ʨ4u j)bz;A؞uvnܓڍ{܍{Bu eNSЉ-ŴIS #V%{sY8* "Kml BT:rZvA;> O %Do9߅R%좋9+(e*t :0[VcWCέX}sMZw<dQJmVݢڪ$-ǟ`9]g+J\!%K,\8P1*} Z$8s'52SC^#ZUI f$L`n.EIvcೲY 0ixVG0DJ$PI2% VHju *GX;* s$M5 7ҪΈNg/aܴ F(t Өy굙FS(cЬy%XQ"'v[>%ңJ1>'/V" _QPmO <-CU.scEdw,V%ɅV(̒QGCc,e"&X%9kH`u. 
+08l }09dSpAy}(-+-(ApJZRY)p-qfAcУ}$sȥGؽJ{lr:✉xQu??A}` ;xj1Og5/86b/ZO,O' ȡSdWa(hy-vl+UJ96_}٘fֱBni&c{iFq}M%R<\A\{(C%5hhJ T$lj1ͦ7)\Qs?Z=;3?N̽ViND!.]^g}RߣCEဣ6)_Wҝ{s/ՒpC_J#]H;^ `?/l0 I5X{v >}4#z)9P^bcBFB$b,،PxS<8ͯ%M.]#p_[4eKvûe`\k,E /lɑ`mFhm$յ ͕čUၖ#u8[j;Il6I2&%>1^JDpVg]ce"K"6|)!`)t.Z e>L0k+@2ENjUZ rU,>Hy,@$p*+"]h5H) $s^J`c!I0X :v{Z%zz$|FB `݈}PrE:*eK?|H;0>C-`G?e?}}d |1~qijeP@_M&hŻd֖8`oی7{vdlɸר욣t}9/ >:#7pH얅Nϋ.],\Y-|܎LrG*R`r4M_OcN=K?pH)b"7&{Lӕہ8<~н?L ԋqOڳ4h{L;ʬi %Ryϸʎu*ASA5RC}{hwΉnǻap\NR<2 Vk٧|M-o>G~VC;jb`~p2AB9Vn-3s+Xvt朳2z^V./4/gх<> ѣ4MF\,Iq=gD7QV)Εbo/m(gAyUb@g,݂$}d֩x2p9*lBcw_͒)9) $Y^YܠaM@)NAa4 {XZcCm/)-IH5)Z%X%(!( QrLƃGʄ`ȥfKb@,kS=.~q_2'w;`}qy@@3k[순"ǹb^׃Opi ɿzg˖lWsE55B'YkoVӳwoM ' &.U6f[qN9igZ*Nyp<7r1HOf;Zdrqe?[k&%vQFMfYp`G@@ɎVxt&00 1%ZVFfsyav 5([ bt"k* &z~~Y'pߞ}v[nn;uPg_jԂczo[aՎTS4H- uӲjWm$ȪVOcVGiqPoTB.9O8UN9|8XM8jiw CZ3[}rIxcKĻkZ$k%CwXθ[l`Y~Ç 2ax@u,bʶy܇= GSQ Q)F^Ztkغ&hZȺ&hಚ[?Tʠ VYs+Z۲yޗ>"(w[Rkq]4b5wf'u8V|HrޗKYe`ɠ4g+=![G=!감U{><)p5wT1>Wvg"F#:3@sp(n iVRV1NQFANg/麰48  u~Fbz *e +9xGZf 3kx .<^WX2&hO ƳۚN S^{D6|}Uz1sz9v!~ܡL(sݙʄa=p0Dqwb?e^}eGi~ w/=j| ` O;aS |f Q i -dFklnءPS8 t.>}-r'/,Yy@ w4lo*'^fDݨ EK^ހ|cpWV׫r{$? V>⛏KKzUxqjsG<{(Ezһ6PQE;%/ʵDMB.~fJ,'!Y5/_ &N`I61 }UӖ|Q-}LOtpNԃO V9ߛ0ů:+/VtzM wyȴ`÷! 
Wx&`KYvũeJ;] NőD^'Ҟ/$^4;\:'FsCp9sJAYYA%o:'I(y91EpYaTb- mZ+M~g aה NeQCX]VO  K<{x7N0c62/z`*{[glt:ǹ_O ؖ^qOÚI ֨%8X r:}eIBP򎳗Ad6%dćJ1]a~ %p@ J X_+ ~1,~K k~qxKǔT_e<~_X녷eLA2IVt uDAc.!J>K0&+`wŕ)nt]iѶyُk9>|ee/ x.f፱(XÿC'1{Q?7THXw IxţH 9`\abA[7J 3g$CdVsl/1Hg5Y)G׿r$Mn0XZy4Tz#<~dzD 1iѮIxX̦e/4 &JИb^2!x¤,5,VJf_w (fh]x;=7:2\>Z܂,}5=wJ~RO,&iaUd}aUQN.Rs6} D'<3EKG%8BYͬ, ׫y}XNE P&0Fv%8X8 S ći@&"9oEa%QL p߯_눼ӻ_IP]s?sf/MOa[o>t/![57~`ypW&oacV4Mbk>KWX\a@1X餥'R3XY4vL*uCR'[+[,aRpRʆVr4yCE\fTp1LrBYLb;ye-)kƷFzLYy,90ĉg-OƧL,MM;ΕALp4SUsϕ\3rIy"}㱛}egzZ7n}ge.I DJhX ]鞋zx-٨Eg.i53 Jֳ~.V&S\hyȗ2> xۮrsz9;/hgtM,Jvs ǞLF3ȬD*sD9$NƣO%04xtdQS.<85Bp kμF,֚Ȃ,GJ[KX`^%PG"C@T:(+cA- %1 {'AM &r(ߘux" %}g.,eveO_o z&&mX)#EJ7|-~ & H.oĜndAaaxJ8^}y6 |GxGwsoPt<4o:''-L NKK2'YpWTBǙ1DqSZ*3Gq !TDF6(Jk]!x9F0>5.VWI?_dj%_g`rBr.DF->G$3CL.1l[8gB[Y IeB`!鴀E=rHRi7EBe#J+lZ3;Nm F`UCG$ݷBbGL^fvQh[K?0; 75>ܿC+Xk,(#>~:fGHQ/?Fd|LlNLm0f!HK\=erz7MvFzʷ pp*hUz! ݲ C@[m> !*ZG5uۙUw;IcА3ڋt,&V`Kʮ 3{?}?JTafnũF<;]67WL(#h2f*]݌`HQ֝WyZo,fL26'{*V奔D$mx@YU i5I\3Dl]#cYq\R\)[)K.#h%fPL׎W٩A6<1"f qw; 9F#kd}XO4\M~)K

<\`?8o NvYwSWg5iRC`͂=42VHk!([x?yGQqL6G8j*Z(Zz qP*ำR{a繶+LI64>mkkX\z ǑL}BMҘ791&ałW`_'S`*jxZkIbJ@}g^R;Q:0 EvTvw;4B)6>}JH$RaKI,!ʍ"$Ӟl^a|Lݬ?'Ց1 w rw.ϥFx Inf ~4q[]s7@sR |r0s΄&)0[S(i7:ń5?,>-*L5V7a9u[#}ImD/<#ff%AXbk0#=%$PPTE![8Yw71q6Ku*ŪE3MgUC~ 5lnv.@ P=<6~͈p&#D&sT]baÖ'LA+)Tae p:mFEl`fr8  -$&+d%ljomSbY0h9ܯB(uԻc z7gB!:Yo2DfAS)mƞM O9ěN8-THJX@VX('QC*bvt*^Z@GX[ @wMX-~$ tʦ`0 `X4KBL`8BG:XD"NjRD$p'RIz)ccA!aJS%рp\ KFEQ Ny¬mc Q3CesFz~sqj7kEPyAёe]7F@k:[SGsu`<2 ަnjQ4gzYxxLj!ר#lVEQ^&,Ux'·Z {۔34둶6<:6a$IvpT[<es7/\bzؐ>z )͆kINrSa*mqw VU_Ê`}diKqFL?@{~b30O뺟eulf\|1ٍؘ4ͳpXfy=_õ@(C f⠼?\V/>4ERהޢ-/}Wx&b7WLD5[f^̕8>GEf䶦+S~kvc^ucѪ+߇luu2U22>s9:_j.'&+w?*Mf`"x, uXKE?Z!]?UEG)?bzXNb~;٤NR'ޥ$|X&Nym|-3_b ^du`?c4v;<$SzvYHUBbT]=ׂ?dݳX'JgJ)gQk:Ґ߹T(u+uìFfsBVAꔾu;2,[@s[;W")nH|vBÏX{=CyoPj; PND^8Śp?bEy5fJxE>"N :a-Rk!R=#QKSFc1H>;לC`;oY7R^mTOJ 5:!(/b=HH ՘";PWTc )0؁yup&DQxQEb^h(jJ. {@A q.*L=H"qEǂ BnR GAL1Q7HD5UrxVg1_.ec9!`#1a* pXDQz^"g62o VFCp"0M $^!8B3ԩ2h/xt H<,DZpDsubD({6> Bտ4C{tlWc(k87 [*T)ߋ>7i[d)͂I;."7')/x+ 6]5[V0xpxy4< D46YERu<_|N :Z*6܊Z;e#t1}7p9 51p5VqgQ5^i $X}ǯZwF-@x{ CMP ǯ/S WA4=lv7:1R:Z\II΁59L$wΆa<`0FJCXY+l әZ.yhhuYX', Djn5hV{ΣPHT-RS,ii2Y` 0B`QZ+,uNip+VtZhwi eqh x,8bˇZ: U ǦFUxV[R{PfI?LC$*KʘMrVJ <(EHH.qΏ Q+ך`$u@~,!g=z'2LM=X =Z':mJ'e)%DCmЄ؂jKwBZ"-' ״ߊcBgY |Aye8fc-- 3k:hQS#Q8ˡT%&?!5C IKV@ty'>hKC@Fv"&^#@eݬ-y+ KAzBI)yƹadG(!Bpǃ]=I ;v$f=]ucm}zzIp;}14.8N5N4x:x⃢jxtp5ah<2&FSTj_zC  ǧ+j)2Q0 }نuܐa%ͶNs(w XX̐\z)NHyYۦmq擭4ï޿!M\= saV[-XŔ{+tɠ_ن/^on5w:aV EY_ۻ%j;e) \߆g6ujb˛Own5xD ;A8A"Vap~(RTEee'/ hfEW Cc~yv j,D~[$ZkE7.EѲy=zC,.Őo&] #bC9\tv\L45BDs J OݬJDPX[nWtAŗk16cưƔ|)J 8C|s#(BH$HʧJK6a: `.YtS ZfFqAQ#@q;iD&hHx(v2FkEN1Mr6ZV60ƱHqYh,xB(Z39,fgm|KKCZZ >Weh|cEs2]?eS>u-Jva8[7{dwo!4S?OR.~~"?[QqU=Ub>_,-^d(}JbZO k*_gV 6%Pn#OLK>N)SM*80w<@f)uU& /^ü^rW͂I ssBa$ݧ7Wg_bvc/6HE|ywJtU~ӶF=JFIkqbZՏIqȦ#09o Xi$&-hP>*Bk#XK.EEgu8Cy'6MdZZP Ǟi6}1 FiXQ b y1lA!zcx5I+7f]$U9it{yc?N~"Â7rE1%Tqa/g>:a 'SqNuZE*z]_E7-|mt q1Zj&DKEp42$Vsk]N6u vdmvn|wg*4NTB6/e.7f_m6iWgw>,}|)&HA>Ӏ02Tc#F[qtr~r` $]sϕTȰ`c!"ଵNAz^1"!~oާς5 "Lws!I{ bNsoAÑdHαVVDޱK9-ڟd \NLN0"kP 6 
MA4|䤓r%s(qFEC7((,2)ȀVGV䴂@/mxa4ރ_\PtbKbtI`nR`Q˴3˕ig6 a>A ASSKpnNT"`kTe$q+"ؐw=]X#1npL^[^ZKZWKI.-j/]#UKKHEBV.exoYN-;*V$My#+or ")Uᔕ7% i(iD5j#pΐR}>OVzC!R=tSTry3wA~Mv"ʢdGlaC e@T2YRaÎ X$ \/Dhɔ,0 !Z:0 ](2PPRl-*Bu퍴򏤁#{|sNkn(0jc^@R%`|`p-1+eEȄ|{sG]l6Q_W@T! kKvV@F^Yg#I\Wn(])6jemˠdc8V8ia2 RQoKDaT*}bW)O'|_jndjl0Vq#&a:}7}rˣ1!] 0s{|ea?OAB\%aku&G;KD^8`ċ`,\>܅yE˒,7,:X[.)6Y 贒ݲ_4ջua!߸lJ%ڔDt]Vw_^au7D-gx+sƵ6Ta(1 $KjH6tF#Jz=30֖lX *ظ[r$'ׯ BkFX~纀\)\|INLJiO–gZ. c{WS:̲W_+A!)?:oJjFį6Enq7Fv&;Rx  n?tg2B@>[)[:C#Ȃ` e9ehx˹F+RS'k͌@ENud؛q1 >-aҔgp?Exx)h-Y{[t g e~ΚQtH ›V%۞.x~w]<=>}ZaU@Q6B$@2ޓ .f' ]Im\&jPp1o8-L +ü{ e1F[yM_/}LQ1 ?FRc/@﹑:yʼnd18>rK#|u@hWgůsOZ{z3OUp8\>:\z#k&F;G%D Yg )O1)lsxBJn՛V?܆) {j˪~8u$U)'v'U)߬˷I[bz 8e5gA.eMBx+e=ILE8MC Lθpϐ=w?i9 0wŃq`yCqߧ]),雂3 F&e+Bδs+BΤ9ks ~^VN/}@lTo)i]k0qJ{7.\o;d^Z',X^=.wQ˂69t1(.= ܒaA:uby;BvBw"=Η=WLXWNN{Ť=0B'GMaŃO ă%%06[Ms4Q4;Jl#IёߡK =rR#huP\}wf ؼ)'|b`/>ڸ\f ԍ%L)[;] qNv;U VbY(oh>8f96lД<שaΨɭ kk aD[]2֐$QPY*n7]Mʮ kI` cGjF^8̐և[1-vD"`sN>dWl'<f<ҾΏ$ֻ|M:xg n&Y ..u=港&M$8(Y7F/ kio5.¸`ƱBc# Ḥ"g1]-܁8u=>ڰ3U;IxOAk֥Q5,ꖢ )FA^0 -pm,ɲTıLy$ŐAbżPdĮ491!Kn+~%n) eōV'F6~g&66..& L`)V ,- 4Lr -Hhg0 ^l RSEZԭuk߁I3L3/H,Ջ5m[A$\2Q׊ "W"-/梭7J!Z VwM)$h`{BhŇiT͞kDeYV焢bfxsJg7܃aZR kJI9\ku%a*s@PXLl#$iu_9PiB*J&y""IK F"- VXHg8JWV.5|q;Iu| 0?*JUCΈe1ޏ@J!5L)vnMo슝)xw$|v^ea+uRG[8j4R4 }OZY7Oi/v\4}K9 g2 T|1NOg.n}mzŪ[v$r-;nٯ~Wn-ߨ?M+=s!rba \@xbc U;<|-xcw}|Z_>V_vjDkڻS"lTWn1Rn5JTiZ5)OMṗHA+)TqW*][o#+ټ_l"M=BlGggR-[.fZzVUXU,VaZ]UZq#)p\bF>3v,=vH(GĜ1o?=JԎhވ UJ QU55 ck FT<5hQ*$-(_W;goϔ$ɘ  )Lqmߛ3l؛K 01TC72{ Ӣ)3ϽON_2|~s9'qs'BӝizOݧ_1*RcYëRh(b:],Gx#> O4ct!S˧/qb99d|&%=ۦ.V62I;"2ִ@5vK FtRh݆Ke< ݒ'rH;ZxQ֊j,y*ՊF &ȸ}FJ v\n@ULVTd*tr*b>u6:yPIdAZ:9y:9sM)GnQөĈNmۀBnn9$䝋hgr 2QE Wmt' 2q~G/-FƘL q&S?>}W궹L x٫ݮwt9X%[zW'%GC*uC59{{CӨ{sᗿiYXLoof?]v l(vsR(Ȝn",G=2suaP^4]$Wo$pDIe:go!0%{aj#\%`12094s2&f :K $t p%ҝR ݩtZelܔɕau-e&)D—<8VM'J:1qʥ^ El;, !!* uD" F猈wb#@J*yQj-Ѣ9cwHa1F$ v}\kߣuBH{8ٰ~Kg%5GPSFߠ.P_=޽J:y߭vFz_ zVz;x!=v.B0sRfTϲ ?AclA, K_camLr/6Qd]iz3<i`IO l&擦?Y/fϫ=/|aɖj_M<*z=v#,$ÅT$K^IdĝjeGFґ 0VZEck+AH)o`\5V3m@|H6_}ՙ$ cIOg ,KnkRI*&ITFN323>( &P\)[ZۊU*FcV 
LE-[)1$:MUo۝l@:&/ɰȋ@ia*D5CT[&)VZfL8X!Wsk#y J@ EPWu&3$|">V&+ΘhEq]OQF--i}O}HF) ޝ(1iԓ{?-q2=<[,'CwT8r%Rſ|%>1 9\Ch{^,/-uԨhòց {s#fO>ȨNJlk@^q d~ ƘϏT)nBB Di.dj-ӆQ-ʖʱ,k;xDRڰTaiB_OtkS8W`eEvS`b]B}ΉB`g@R'S۰dG4yp-B}bBpUEr#ȡδKI@ 뗯 I9  xJAX̣~}d>J@ߙMoٌieX1c;'cCT&%{>F og{n1vmG!Yqn~cFZ2V 7J:ȍK_٢.qA{0XءtԞGm 9¤X{:gyLgm8ovMcq-8eyĹld|xT_'J=LИ6,%)nJ!0(u݅om g%OӜ^L@ etJ8-E o!f\IA"ҩ)T3nP-5aJ"Jn@2%# {l;?:S}{Wj:YE'%8cLd1Հ#MwhزdϺsf6!L8ܭQc2DyҲV3H wD! fA˸gCj`D) 2)@34K'8ؒ2 lS& e:_NDNTwD3񩺛TJ9zU-ۆR$5ggL>4hJ*\3ǮleDLh!-Xq(r8HeyYv6h(W+2[Fxadnw[|-k1~FX/6kC({g`V%@m3Zݝ̉Ew=z1'Ѝ:r}8={tڗ gݨVDk`jZ&d@ZJ*p2 J1@VW EN1==ۑk%)l4nP'/?[`s<ߓ9a{4E`lSC>_|m/{;x ,ԓ#)"̾[,9[WWWgmd*/MR&?;[ 3(|Z!tvw߫׀<3 tOr0Txo;GJr>{ւԄL,iG)Tr$•+b ­!H*kFp u @-4#Z!\+v@ aHy6dQd٬T h>xaQE=}.>ۆ`hk!'D)gYTi ؍CY(uBìZmƔH9Bj:bPdM,[@ ĝ|.?̌I!d9>^tZǮ'"TBxYBA8 ! VYZ q (ҢuE UZ%0gֲ"aVH-F88XA&pMWg 1\;]+G1;y~㳍{|t;D%챘rh&q}QY+O㝆(1bx|]B"@~ /n j㍂y#X`( ky Z֋(֥>XZ/=KU s2 w囧sq? ʷਔ2Hoe'tv?KKo-+Á}ZZ=#ac]+C(%axc/|ܦvD0|Z4Ns/r\^rV%ps'p uθ1L4M0Dԭ?Z1~/73X}dNČOjZn}zFUbM ^;|r4|=?G|:] &O?4#Uvrш_ y"$S ޮnQhT bD'u6m@[n<-sM)I)iURIVI-X(dy,܊[sřM@ag[\=9ͧws ת&ᜣWLpz'A"$cC(P2ZH tiʍa^>YçNjg^/N,vsyEͯzk?(Y^}0ywN&N>r[yCɇy&WǼ!}s3@q?W SpTF;_[ V܇9*m뺢5k{8X8F1UN~X VPťPClC԰2pIPTqE!DgUl D9)rrb(G\3B&HJj 9֒B nTk2#WmdKאX[)gdP= *l,7hɄD$gH@u $ŰoP^8U>7>fᜭ&rs8݊YpGVOi9w5! 
?W>YO_O=c??㦡gD|eb MƲT7 6xp\o./|\ /OwwӇۛK[8Y՟;wji<=<=hs!@ JF R9![/%JE2^wGZG7&:z4]AKSNfzts~6UWiYnnWM1^׀a<6e5 `0(=26S* RZm8[Sg8;ic RgѲ7-bB##d&[ZIwƵ8d=&{ 9QݲOD?̖zW>_.1 Շ9oɚ7?j~G8L8l0"Wnk3qЂ'{ -@A"ܿцr'ccI` 5^R (!C#e^A(t 3` NiK⿒S>[ ( TawLR ` !%;\8˞r5s &t3Ij;pXv,/k;$:"1ݤ.;\رorH;™WpLk%r,$@bjiKs BUg\p9ܒ` 09rd] L p9r\ `qz1>[gښ6_Qe';T!e{6SqI&lf)JCRV&%@lr}.ݧt&("8't:'8't9$,L8 8ۙ0;8AI.q9Nn&P]6m7*!&Bw{+zȆX?|ٗLvdꁻ{{mfݺ?QYi2|l4)ꨓS4|QηzG>4o@bnvji4YcQtۧb& CEGU.p#P^1xȓ!I.I{uYPt1h 2^^lY3Z{3.䐡<&k\l?W٠a Pwv\*MRb,|d`l<3q%8Ս[DdEWcjn%SZwFẸ\obuF~WoVY76_|1@*2$UrTI*ڶxWfEǯ_0؊" CiČm[Oq-t~3\vےEZ\0S0r ihQ,U*Rn e!,9 F-EZny NE"<-$yx'|8[4H|OѐqDZhK1bigGXyƸDpzȉj/辉1FX ֳ/`ϦyRignL T)s^LK^Q*kuxQ߿"|xzz}CL)o矪 櫶̗/c̡@M+Je*a-cS0gt%"E[hDn>C`Mf"geP+w{ mFymż )HOYR,lwH~VtGQ]4aTFzR 5fE3YT}R^ATH.7"1~Z;6۟&-QEzW\O\elYoJ^p̨"=N%c҄b ݭOe*5 uE2C#p*Kу2~fF>1de$ф1&)@i(0 ,VBOQQ RTOQaV$ISp90wN)xgTe٭8[C,-\2$姠S0s ,X+ -2ՕȧW= BۂyK T,X0D)p'2a~ĕV~LI_6,¥ (Lb58y{;sZ2"2pB Ǿ="%~BFJ`c->ARָG3V9Ar4&DrPt'j1.Oz4y m-#&DP祃.`h I0hoOQ:bGVK?ٮH*o"(Q.FDbzqHp;qR"a[S(Y!@B TiC-la)*oKnyLB5'>;bʜ5p8pMhQ` ,*4yKKvg: ?ڟKijjqNQIuIV_ф,l-.C*{^49NH5DgRg[A۪3C $>2HtFd:+}7c%6z]s5\ɭ"b0nELpmmI1H9f֍6 vaJvJ-C ?J1i+ *K/a1k&<`V2cvjcǐDw]ߛPd @Y &hQ"I\9u@KIPI iTH%|KdCtRE4IB13{갥`X8/6N%J+: -E{I>\P{qޔ#)zb7&pJljES%D aO`P AИV"+#|w69@nÌQl]2W_#Kpśzs%" qx+e_Ӌ՝^[ï~\xK(-~_\ˋˋr2_,x}7Nf_oqs?Lb_d>;7p|7Q}(6˷/S&#%0ҽ)0)T2cHHZj)gIJ+0["*✸MGzȮDuSr* Wc k#/K VRp;Jq^"O ÅܵF[~O `太!38RhGJ+5*˜F ^btR']+UDr Y$NhU9.)ٜw֥a ekIlFONj5G_ˊ1r2/w*)V9̆uxT + d\lɞ 5uĸB0 B֢RHw6/xzn{t>)nLb"``JaBBJ1Ó98#/zj)AmקjlE@ ɠ0QDeWJE @0G8 6iΨ&/z &a:=h @2*Z :FRȸT UfCkfOGg 32m㩉n!,Jw 2N'[#W`$mR` QFįJ/_|_~e~։;I \Ѵl[=\]޽ֿo}3;2B9&\td44Dz7GPg|kv\#b6:{;xa̘+iqU_=/S N8Tp=ȏ?z&uDxFɀbʑ&KP6bhh>>0L7_37<;}1:yA5Dl0 G 9.V8BQ@0 #ԛHѼDTPga#Պ}#Mk"ڼh #:±Vw@S<>B bkdA~ˑvVA1M!ЙFipXp}tФG2,ؘ´t{ .9Ga+-֘´q&]I]_=8~X `8F/&3zp#_c01~-Ճ8 Zz+Ճ{A8xc|;v'ՃG8Z+zk)Do?6&Pcfޮ}+CQK-^!9tWHZoV ?&$SzDεʠdshõ8} AzsE*'`NF !Y"ՇhP}h]/&X|9ϻخv9/G9zrùZS-yI&}.NI;Ix9>Z]h&'{p'} lr2^RDpTU>Ie \Nz Qnǀr,t $/A%]AyRcCw1%<$騖صQR-WS͐ԴW5v/s`qYD`T:Ƴ.:т SDM6mFQ (C\ * :Dv#8$;MSj.DMł<):ԕ 
~M,&oy>QMwS;4p,VmO-#F1o{1Fn:U:KTp[wI}`B]U5OWV_}ZnB q01YEJJ:/͹ ¤FAZց^V=C8}2#Ϳ*i!EݔƀL5Z'^>a6GT^(?ԴJ0)\3atd)ɑh%'%8wT6}SS-1Ө|Tk=JoٜfUJسՂk$"jqdJ%6b8P ҅R~/`W5?AMJ1"dP (xGחo9FФk5YɇQ'D/A߼{rMܙF/wZ6vi*I{^mʽI;:Y1b֭ş7zѬg8 d~8Qr@R2PsŎ^, r^}rMQpkcyr`5) e3$0ӵ %Y WS 2%o>'8 gR1ɽlc"php[շgc9CRZVH׌^m'M!SEsLXT.!l ?ivΗDv)ͻC?x&-%;fxv02ϯ:x]p 0wx~;MgLryQ*THF3i,N$4605PxId;&)UFpr>Mycn17ٻmpܕ2_<}O3M~i@;Yr%9iYPLIL $%5dbKo]`w1dkŜ!o^?~> 3Mh0YHMä* jlXx<| ) і/'w/ACDIߌ&j )jO!gT h &h+=ӪHndHi#Y6FQ9ZJ:6!Jq<@Od)J c,/7f4^b`;nJ& %)S@pθ[cpTb')~Mnmrl`ܔmU .MS<.'2m5WL86ǹfP)Xy vf&(v:)]5:y3^IQ?:  C,aWvE֕:4Z&:#``FsӚpnjGlGQjM2V=y5C8dqF6L"̽$բ>( ?_&Ό]Cøzo@0>BD=r4$X] ;T:>gOg?x©xGWg,WE0P),dZ,ߣ &6KQġ{Xpl2S<5^J=eBZNWW>8j19ՙ#DPsn9=Xyo3F0tY m1 zK[s8<'G 2bpyَpEk=E?:[uR$q%mHbSesL"eM8đDEkyŠ%Oh[#[OҰYk P+6Z%@pC~j!m/tLyaς }Y,n᭟#y[pxٮ80Q׳''V^=~;;Oa*vd;9`<;@(S 9q9ΜϹ!6yA0cL !˕JID_bH=8$A<CԮ% T[|FU[a!GwP]NN-&m VZhDe$𖁨z`Vo4A\z7&e h!6e*w?J*Ayʳ/< )e<'0l|_u,gS/U)IC5&c(g%RhR!,5CP$w ,GZ\3k9Ue 3H2/(hQ V bN Rj2Z Zp'pž a _mks¼W 0Xce`)h$FbVaQ zGkE`M1[YdP,$tH]F Ī, ʂ`2t 앵(\/zr~;oo6&R|`%Y9" !*wt!ΓyPz1 ˪~@dqff~x.iv'-@5/+.&E')3+o ErOh:V4%,)SΧ,6k_aAj5OJNT2J/AȜ!mp sV#ҜjX:ZQ*,4JjBwH^I9*>JA-RJMY`̌Sؓ BQ ̘Ю'(ňªvG-rlqB&d-<,zY&.k,腡5d%S| ]FPCtYP 6jM9A9A5˟  2(ʂ+*vWN{xHYpѹn޳'x+, qyMY!#]JVTiYSn_VT3:kDdR#VnL3irj}sTZ\T1P-J7ywyxȅk@0\tw*6/DB㔴~#w'}>CLB;LOtSY⬻Ksj״ m#rOP!\vh@ʩ3DZRy woTz[ZPdz5"q"=skr޹ (!㗂 ˧{b5m%7ਖJ9zsβǎ1i9V9D0抝y7*q'W=;+蛑\{f5Glm;wFnbv:z?enBecSJ7@ B/[x݁ BY;<J<ڿMKC itL3'`S=s6 ڙL$-I3JSg. 
@kFrBھ!TZ!TFi!T+ł)űYS +zkDrxWo߿sMч]vT0TTLz̗*Zȓ@iG]~AܜP%8R((OT.AO>WOF1GSLȯ`Ei<))GBS̓gYgA93~ gd/f)b蟟xlZʌs5m:¥;{A-jP5?vvuE%yvCG<|W'~6KY4I\mH&Rq[ @ӫRcd cr~Q]o_~ƚ]= CP\k0_2wU|& u5F֭ʺ mc: GGK^&7]|tDv0X#ÉR\s4?u?Wǒ0x>r拳Rzl&e9C5L,,9?O`b>75~Ue$3(^v)qPbPGtbhNwC[|Yڭ E4FBg݄> Aщv;64/_vkCB>s)'h7pL9hX NlMN޴[|dMֆ|"zL Pυv3^YqG7߃ͦc׋ew58,`SINL`F9PM6eHfB#lƇ.#rj_Q|?N;g?-Ezv(v?WīU(V߮xHuyD|=rQԧzoVSWTd=_Ev͊7Q =}&#I\NfV:k bi/WEG㮄sAqGPEfJՙvF` ѕ(E]fjڕNM`%,(rK0߹<_n@.m/*@Jۖ/&Zh2_ ¥\d]D-_ -u?=M4 tb}R%BўYXR\)SK: )Ue?ބ_!%^Nxdeyb')~ Q7AnDݔ%jfS׹Dk˥B;i3c csPn5aN~ yW`}Nބ"j؎Up"Cl+t-VA\[x5ʣ;?^l *]|7_mU8ZεSA "#3^5w<*Wjx2Rʫ(Mi:a,Kwh!Z2_7f^V^ BpLgAL-; 3꣙]Gժ뫠jFZ=8zEJ|ΤSZ)$D grKMMba/5R4FD!AwDEDK\Z,$^<Y ΅̜X FQeG B9ߚX:hwty#HG?3Wa1M|YРGRowk3j9S%?u("8eǔ(X.=ߚ*Ol:]w7EmSؠtIbC71]>1kO-PSxzga- "OJ Lgޥaj))ރIRE=}<ω@׬ݶً ,$GH(d2_xsoB_bU]:, Cz܋AR~9((Vo]LFco-[;R(ւ.) xg#[^Ad)tLVO+{2xUCX@7i]\ЌR&a:5\^"R{QSʮvy}uy U{_mPC^L[̜̭ԂZrToA(ch0.&p̤RaD&Sgm%3>74˔䔚 ,0 &YjDe e9*,rO9L=`Wx'HaC~D?rpF9œ@JrR8̱s}" 2SlprXs(#"G`juf LY.@d"SMd]ȇ&;.c.D{XBց8}5lLkjG'm5 CƟܣP+{B@=F#WgXjaEYghvEK\ fTJ,LkHJ*ъtbS^g7dPL4t12(oG75YR7AUDz& UPQVkr咂)< gx"dւe 8jD?W!5)'C\Iܥ޸ķ:`|eJ۠D!K4tZ!)ѡ6 \)爭T3l?Y:eR=tHCP=?g5ŝ0$Q2ۍ5ȼ;-}gwy_-qƐSeM5޵>q#/,)vRv~rkؼHs|[_c/ yp(1Q"9nt7FsYOtC'-U}Fgc |2}7v;.I]p_㽷6ȝḇ ~$`p`֧p0Y}@:h{j>;>|&?t.EZ5͡?=yOf5Ć٣6=c΅^~~s&{R >mO~MC: Op#pYhu5 Q V.TakULh[D_{A z8/GZu (Ijk2k@jJ ~YB}Sn։r 2U1Mm̦L'rʄ;>jU IE_Ն(N0םZn=LP&Ri4˴ qX.ZgT 0ӌVvDka DO %H"kC-T%ieYB- J͙S(&=DG3=ޟ9VFN@˫C΃;|E>`ާ!2[3<s}o\\ z5~dTY['|c'W05$|X#Mt6_ƇE~fwB3W?r9;B\ n^ Q;l=3ؙd 鯋;l+ lW{9LPz=K!E&Z%$I4 0%];pJw}]}{pN9j-L^_|.5uM54vAku 6%֫rn= -Zَ )_9^ PZwȅď=2?;Ȏ|Ľ Gι#GP|Q˂ 47^?gY b$j\q*>@S(ku.!~hgC!1Lq5V2'D 9.i5ëoFb}벐uc(QQQgi~B \= k-Nx;4FF`ͭbF9ufQŌ}wO/=ɜkZGX8Ϯ(|6{׺Jgg3 -6:-~6OAҷۯf.pMo?+g}Y4]<80Ӌ*/{wq#KAWd43xj@WcQ~p!iXlBy/k{ʭ .~^ş <`, *vMջX^bkt FZ`Suĩ,:2gpJzi3HK/D\x$cge%fBYiM>WAR;TK ;rbIu8Z]ƊHĈMK'|Gy!Pmx)Kn%bVB/D eY2 OW{Lt~p5dSgg_ &`]V~7~yWAQ07q7-x%F0vurEgfM*5ڦkW0r}Lzy-_B%VMW.dJ~،* Eɏb#:u:/<3VݚW.˔RUhYAfiC+L-!MHiW}0o?ihS(xNzB߻"CQ&#ȔΊ y N{3B} +c,sHF=q*BU8PTL)Y&ӎҠVk-F[T]Uvt'Ti5R>;FU>!U/6P>KS> InEk@JWh 
4Uw}EF:]FF*5b;W4i=ZJVt+B~f sO~0Z>8pʉ-*8\v}q%\.1x) xuUULm>vb(x=^gNjI%4!f{4/^Z K+pJ 83`l>v)67? Z,NyHyA߼U'"慖}ŀtz:tQ?qlj}K oEbĭ==3EOx9Qv7IFW(3}Kt R)/._g&p}G!&~ YhZ0Բi@dݑcᦕ2~Y1o &I}oR3A2ݪ9NR?7`#;Kn IQ' 2ԧR̓f5x51S1YJC D3)&)0et$)Ugj2LEbF(d͚ s3?ϧFJI}Lkr˧nQ,b',>oz%rrW_w_MEK>m|뱻{4ڄ!Q$ԥl<͗}‰N3 MSM2,k_ww| pm(iQ3ՏR 8 Vo~}ר*ḣQC򋃥,knJC( NajXd8F"Eʘݭ2x.K?5p Xhel1w)Y4i:AuQMtLvc(R<" 4V*(XcSF@t!G]j Vp(eE-J ZqSQ JEGU"X>'FWzjdr]hEN:djPMk#_?b3i\V#t+Z%;4Ov̾87gYM$pJмlΦ TS??6*v'ӿ8FL89+M-P#1,O[=]>AͣYt)0BO`N f U? a߫Pla3|s{FVQp BW̿~?pa8ѼwF.(:R@i0WVAɭ Z'erJd04=b'~ߟ߇_ͼ6G/Fxt  Ok\6 oM+ɸP/;oQ0,ֳaf^'z"C*g>JS1=lOu!R-(;\Y)BB:^64nT|a%ogv?ZxVa<ϕjY_U RmCwz>U~^0L8zO#D>.#Q^˙w='Fcq@, e~'$+%Ô&N$lRK3#7q7+ͥț7V >wΖ*9466̀'R$|y^`[ҔJP.atK"xbb DG98E@ ҡ$_?Ep=R/c6Kbe$ g I cMiƽvhM&,S؅ق1򘚛W}X{gc%0H$#QbC˭R#4Y* B j24q UYj$͌QP:6an:&x{1Gu A7t9]4eKUg3b@S;ß)LR̟zaIڊKl=nόR1z=\;*IcQ Q0jG{R]LYGf"o͵ڔ5%fƴr0CH=hU*"`|Ck#"# ?vN)Ywl9ǤE_\͖9\~݌\qR1Ol>>|pY C>+v0FvbpV\ 򹜴5)D&nwft-tV}\ ka O,k^Ye˨ռ|:?_jŨ*M+Owd4T GCY?޳aW xs.np e:w0fAm#I۝8, ;qZujQ/8sjaq&1nf}-K)NxF AJ[~}}8ȾmWW|k8tV֎6:jEV%כTkZKV1?k+/waoݡUezoipZu}2)Ok;Yܸ"tN.U] BʏÀ_ V'3uxL&4&~.l.c9MЄZT "FADTDGT!(P020u^`הQb)V2WTbK 0'L8)E9qRWV?~sҔ3.7Yg;ӳ.+ cʬah#ͪٯ#Aγ_=6ۏv1z~Ng:V8Ac2K_1K^D;Љ@rz)p?[UުOhwBު W.Oa4<|~qջfol+%_7q{h]Łۛv=2<~c]|"S55};2aWmЊM'i+w KҎK鬻x7ү&M-Vݙ~{uX3iYS>y}Ӳw&J FA 4t KXf)U( }>9u͕GItgonehER*$ul|Kh _er`NAIÔ·ihbASwzXf:ӑƑ}'> jJeIdUɢi(أA~'=TLvKIP'm;]Ϟv}m,ٲpvqV0^̭(*a|:#6eS"j+}ǐ}>%')3s:Q*>~X;U31-.4/P-'O{E y05@,Wh뒻8۽xh.xo4| *h e-`Ó"(1hE t_/\[9e .ОZC0& oQbxΟLlDb3! 
2;rO?fGs{ Cb`"B c$KE$ly4dH-GԲp]'m#;y;f+pE.=Xwv/vGJ$BZ>2z7BA|k8?0F:6Jmh"#MGh1hD12aP"NSbE@kPHQ2"/QJ2o'Q‡O$6ЄAZ>m,gKR`޽y;nr0Q0Q5"WγqpDIҞ^ mvsh ȯv}}x 0M8sy{VҔ2>7%DTHE:?dsN4ֈ3٣4 BsP3oƽ?@BpbE!9g_({ޕ*9dygRȚs[SuK`6a0>srSUz+G}JAtcxj=AM+: SNBWͦP$xCEdJC{f\\r҃7Y+ l\@ޚje *o#;n`Ìa:AsikcWwH:2gםm7o󋳬_3sTI &ڭs@;㿏Y*Qv*|%|G8*vB \7fu weϮZ+$ɒzRztoCugDi&쯯\vF/Y S Z7WcW+OU|#1K,a"ĒE1pϷO+.4xGxr[ZTF 层8"./H (0!1!I`xJ9R+_oZe Fgmp q* 1 '`cMdy=G~ʛ*$JU .jibO˙5!4WtqǷO+Y2WlRJ,+Y.>?n)' n L= I)Z#(Ĉ$dc Za*beZ%)(Cnnars)VL:=疋ꃐR( $G,˵TȜ^?)}eA:)-XI)vyRTk󮝤"?)^*aIiJ̛#R3wAH)S)+?)*d>E2JtRM2 s | 3lfu?#Q't[0pg¹G⧮f'nix8;;:3IҾdjC{ Чp>r/y'S/1Ol[ffa:݂s&v KRԖRL &@4%V"MPC d,(tLC@(X{4sa>&DTՔzv<Ғ!o G`C#C4Jo6ؓ` Kfj :0#>L+0EFXbCSCDpi H0$Ađ4Aǐ=#+b9W 9R".>Ɯ(b+fتEP-##:wL3d*]T:5r iSEkָOIr*e m0t)ЄRSMch Q CZI N#wUzI|nMkQ$!ǹ-!U1OYSc9L5*  o"a-uڳJ\\.P͝'nwkz4N_&fdTZ#)+O|uI|-g`v B ._:Vukѕv*HԖJ#A֖`IHӱ8Hy )JEB7t(q#t˃a<y:է=9 vqj\ h`^Qjp BF` 03`} :Yb\!Gi -s9ЅD5qmȁFc#͸a4D`2k 0|_,==3];`]3lG Sƽa&I|Q_Z&JTiEAxSIF]jsNQ{P|Q~@PsKjHmv0Z/Qkؐ$ ۆ"h66 u=O +Ov"Ss,7Pτ]8ܲU\آĕlGUm~\Si ++ޜ+OH[ s:a=%bO{Dq>I9qio()G;)ws ]+3ыQRF1$g,=91s/ g(28 ؕw5QQ`Tֆ%#M %Gtm0Dlݦh~/@(1|t @P@4_dXFd4 :xc6 !`T [ h{I:BΙ ~V!lQtf?;_HY>%2' `<(dht0iH-I( iA& ##3}m ;1‚}~-a\'I%A ?H!J*+[,(i.mhb-pi@ ⱉqS%`d};vU$QAZn5~"wef1ºW-1~~Pe}֑|&eSL\J}XN-F6bt 0w˿ݺDlJGM4zr1H1onc"V k[M%^%0%+*;KK=F-j1J<7ŠPC4ĒS#q~?썖! KjN]~q2E8ͯ^zT"*\#*iY[%19JanY6خ 39jg킒!0wɠV4j`\P:~; |(0ۯyg#W-PMrk||h,K9y34C (޽M%ھ4[K[ 9SyiD5n OEۉpwe5rs?;U1 )Q Ԉ2H-=Ϋ5nħ|(I~,h{~?-/P&i7}Mig6]&ď-iUeC\؂ ;UqgS*S>׾;eT émY~8mm $X_̭;5sUֲ^溼ͪChVKuD)IG y$b'ێM7F*.zt# o"G7hJrDClp\#NO bTƑ-ycb`QSa _7"IF7 *O-2) O+8CNZo )+/ x+3 o4 гeǹ#X|>[;F i%$Q /\:(҅J]NFwBX,,lb[GcJ`#% 0KI{c9[4/*X05L8+Jit#MđN Vgi|wAx@j061bդ}9Dz7V_!anA Ѿr;ͳԚe+-2]l%=|ekqlHS']VNmնV?٭#c*|Mf*\LJTZr&x6pK}*%HpnC.ʽAsAaH𷚅o&1$4],% W+L_? XPTwɉP]'",X fJ_ǨPA,eNpM_ODD"&^tMF=S\}zV?ް=;lF#IPЄ)I;ÂT0b;?5y +)r7w$'5 7,)uoM?vy>lVa=yzDLQK]݀7~djGV aS$x^ƭY5yY>d[_٥X!6.m8d(dfk"RIZpB2i V% 7'|~Xǖ ]6jK#YZA&`ǂ=-ܗ!jF)lz [!R9Υ%5ܜKΥ.ϧ {OwjaY$.p'7 a>Ɩe 2@g7.oZC@)-ɵ@sf0ƨU#[M9lFX"_yzC`(! 
.ԀneZ,47d%8b .5 aAck7 kdS>_cۣ"p }&]WX]0q|97 j<5\Kz>IK'dNt3ڷJZ+ȝnN+]+~o5ɞž h|mQîcpa{\y];n swOO =z)fYZ>D* kĖH'v;k|fLD0X[{̺.-"Z%=||"ڊ@F^\ #WatRJ"5ͿC +8+SK\uzfm?>$H`/*=V_?Ye N1sYx {T* _Vru?9^O"Φnn易6L&?ah ?{_<&ä_zE܎3g+ ;48"]w~l:Iz%Ȗ.K8ERb]8 z}-xZ/3Wh2c nd,Cϖjo X7`Z,޿{~zWEf1.4}carϛ!Nqj|SCsj`yk'Oކ΂ϏOg 1u|y4'ѪT E] k {` ʼnolϙA@;s_ (fb b.QLbsM7G8DI%}ĉSH5X̀w7)6  Q#\b^1 J#H348.{HOU%An>$G2UBN:IW%F 8.u3ťE PњZZRʄ=Um!ަ)fܷ#HXqGlp VB3²_z/\1TrK\bé0:)hQ!A'KY^q:\Z,/Vt9]rEaS0z(b!L֢ŀ1Jtj7LJB::!)[xQ7#WEOXy-\G\eG`#ZF3kxQdJEP(,Ssu4zYa1^Xr.ƭՊ%#n(/K+E훭)ۿͼlO,`797[c4M(t-) :kUVb)b (ɿJ:{͍v!/fZZҐkoV ",ժFF yHB` y1=%f7.gTHB#8DTa'9tљFgL#ǐ1=}S}le ;5JܩQ6ȁS!<5TcᝣhQ:F47&Ɯhƅ`{󠱓eq"MDNIܷoEqXM Tc)OdNZ.!y$ 1>%@3$o`>5|jػRў H @DGͤBVAb*!b&['k} lO0_bw~iß6)Hw j~c[_TjFMIʁ*B` z^ĒJ2gϏjYЇ%_5TYud!߹TZ}XFnN7xbS`ygޭ M4˦л%r1H1on#"N̻8$z.,;7=J$^H/e_#^JyQ%zU) $USgPٻ&m#W%wI8KVuUΩ8|KJ+,]03.HieI\oh4!!ȪNMO/j=эbPf_y=m%t$4K$)~(Pm@l לX8P`2U-'L4 {L UV*ETeV+Yvx9oسxlFM20ҎL6R0ґv@gV"Uq;1,3V5I9m-E  HdTo.: zSY&Ʀh@!HQTVIl P>Z8έ1T(iP.`mMGF܋Z-B^ܗsٗQWڲh":y<<7ܗ s,$2%Um+m.ver;9r% #Zw.2.a[$=HO?P$A/bXXҚ_x X.ǹ>"q'=DP\xkGbc\n?+ڍp-GW~`HНcۧj.VʑI[9L&V{7[93I%NjqQ/2'#>/%@yB} 7I0Gs@'@zNN`)ic'nMPn($ip)#BZYSsjp"d&iC4eqc=EOdHUcIk&h;Fc15J! MC:T~bvnd2g'1n/_ b&C9¶vz\EN8?%OP \ \l%`Kth*%=86ҥ 2gH E%8G.H-1&dqB?<4)l«6[G t^3 Nd.JFUb)BKG{- &rް`Bf\5fLå{ B7&WMFƽ~Frw_b N$gbߌ+>o7aI^ۧc0Z:zB |[SX.'9HhdCMtyRHj/Zn7j ^s@}|}}B 9|"^b'prkeq. k }څH5K}7߿{MӳEQ'Ov!onݫyo!\ϸz.#g_VIe=ןAz۞dڴ xKdQJ[X@Ɠ!t':Oo8E5=&&P9륌kBo`X34Ԅ霵d\bh_'aG5>oy$j'oSzI)(=haMa ݵWOV6#,}Tg~haְatXNxmawt~ɠ$yűn*@g*UN 9QC}n*js>eLJzl@A$%&U hGHYWYb;e<#Hִq&/>,_*J +zHܿxNЊɁ/43vE;H@ hFn+xG<6!砏zЖE AYA'Q}E5˕'45v5mbs/Zxk j4#dnUB*A*靫jW;ᣎ*륪Y{rxl)+Y3dGȵFER[e ye%4C䵬=ZÍ(!-]6?ք^-JRhɼSg#"G^M] -! 
23+PUpQu}yz[܅X&w/0 yL 'zg+k[cqC% aH;]?O#ZC:G8{Og."Txs_Uct|K$0@D@[*ǜ5WDpwu8E׻GhG dT-!]eAyfkee9n1 n Ezz?,ʽ3¶~y!u:z+q?:F +Y_M?o߿x&U,3znVۇwv/]yd$6v~{wTPmS=L elq# V [ h Ccހ4ytU0nZp6* --cw")X1b?F =8^ƣצ>G2OTm4OQ6⸪06$Ki"hHQcޝ/5g۪T0,6^]qǂn =}O, A^Ggz0%>C|~=N|ўݤ.paU>{XɥAv(JjN66㽲|nr)n0Ef(3gBv' 'ݟGed̙<!IOܹ# H}_v!%T\Jٟ'趒VSXS-$ékS=ю˯ F-̱/-:E>,Lk󆝣;.paN\mQn" Kw߬iۛuCM}# _Sb_7ߥM"Fwu}}Ǡ>rKO ߸ ogR˫o,)O z^o?l)Qڭ#e/}eJa"-g|h}Mv.7o)i[\s!_锠SAޭ9S.xgѓyzL6|*ZS g:Vʃ)Fv#m[3h[sUs%$P<_ OMUV9Vx֓ڢִb&*ak>NV!B (47"vr,lV EE(C$'c)7E"1v2$3Ng߸TePmO?YUmm4*ы2ۛ!!1SŝT7N0_t<ׯD~9NOT*bu@[L 9+Iz]})z^䣜s5殤6n8g !` :=, ؘF֍F$ښ}Ű]YtV^YzM 9̘0D̟7LY0[cVk:,%.:6B,G ?{OUXù=K2$[}jRc-2PPʗ evcbwoaa7ҳ ̖}6BՕ-Nx;_1=G zYe~}q؁`bz3 [ǃ:L&n@AT7-%RqIrq k :Wh&oS9Ct WEU{56T[n,GR']i'Zp9 wzlI?;emSM9OMZF0Qig'JOO.d ( Qs};EhG&(<[r ]8wxȵFER[e ye%4M#nJ*AS&c+>S#ә''Z*gsQ; I6*s{78Y o-9Ya8@-xdDcRt8\͕QɻM:"d/}#@ؑH4%@XaܓV!_ߣfN GuJŻ/J2E9w+^hwBCr-)N?n,:fVȃb#"Qc҅fnhW:Yk<{ ۝k>۝}&s&XE'QMkOTGUK9YfkɓICT_mRMؚg{ҳP ʘaNRa 8w52^0DiD&ՐI5DIO¶Q}I5 :kKq i< -=D&Fs3R|β\Rjjsg{6k/9:#@nq$-X[,8){fIKfvL;mr5{^s3_jSJ)<)mD)*OJZR<)*OJl):OJe*LVŗ:OJ,]=s> Wzfv_AkM+G_" X'gSK[#2; [X>N6@%'(lat,Ia[%>׾Z֢Zr1𑩕lU\fvp00m{Wk3RHRƾk5vM)SpWfN1Jk~VMf$'NɐY@( q;Q/924ȣN<öFgϑ7xL3ߐjr(``=rd2gjT]c}Mm൵2X+ XUY0sFznȢƸr-"**c+XeE"B!F9YY,i,%Ԯ$2EiDornʑ$b!|.T٨Xkh5'0yv$*kwםO`Us-]GI6\Q7KIc"dk8{.eDzsAzc9ΘzTͭ`hsA`GE)mG ` p^TA2 &qLzgy?Si`N sVZ=s)kO~gOAvjKlEC&kOø' Ԩ1#eK ⅜j?;B"`*e+NBH:Ew!rWU[Y%-*CXgqj̄QF-[*OQR!*OiFsxRjXVnz.Mu:)}귏&EK)<)a!4Amc˖|t2E N31#Ϭq&EUs x}< 6=[Ϗ,^.jG 僌{{#S}wp1L6FҭAKiK+@tLv@.b͑>d_fCK}zqoȥޫCwܜp %~O~sHfo7ߟ~Ҋf?櫿\|L?=cb֎_SR*7G~$+g+RDۇ>xd"(&euvvG 5r8e*f'+ZxwuEK%/HÖ\jSLܯ>q.*TQ̴2E>`&2ma2.IJNֻӻ\|mȧ  ),F XR䲪ߤu by4:ZA>[掜K2/wh4NޓQ'> >nM]y2/61*>7`ΌI1A ->dnώ#"iohHrPZ@,JĪ%R QUO<`\I!,콜g-Vh cύMߑVNJSwwgCP+З]4lErE3Yjik? 
:6}j('?%*WǣG%5 GJgW!8Veսv7 "흳jb0WLj|whb,%NOV-'WW]o>$ۦ&?[OwaYqj Mޗ?=ۼ &}YZ<\#"nDV.[0Рwono2:?'6x{wXj9  /g}9&hV-:lϩaʼA?ֳ#Hlae3S`JwDN`))'}X"NO@AZ{xŽA鎄&oWL?whna[ږC=ܭBiWFt6Mt ǟoĚSSMeELVl;f+7ۋ坅gUr& Vd&ggG}A4rѐ7Y (,6l\yh ʼ|NH[#7u 9L|-n׫3y+nM6w+FZ9h 8%b@2\OkA0 iYI˂kf5*-N.VZrFpaFԑ+V=biVχEcbg9P:uŬB*_M<YV\7ϔuveC yobnJQffiPUE}0I_9\@xt1UV8Bsb態j/F|^,"(3@5(${(-8DB Q{H 2rk*RuQB$ BGTLuz7-{Svu?1+}_>\fkNr -dD;;Oͥ4 kW! n'RlySzyU Kǒ[%-ntmU &b0(vߠFA8< 6E1g_.mji> d%9{n(2%L8'_;%zNjB( {"Ȕd4Xj\Q^ҚXa3!8i.îTB%=ASKGYx]//<ݧ3HjiuuǔԐr샵]DJhLxkAswysv5חk Ɋ]^ ֽxvi%^/x&/R r\H'io:ll.&!oJ;i%8zPT4*j$:SIYЊ"F9y(ΌU^ #^H{tDBC{ȍ_nZᣊd nM |bnvOI$9?R,d?%jb  ;gB$%HҖU$]bfP bk D@]CIt:QȡTh5t`vDYrh j2K%QGKvNf 1I'H[ulw}"Y~)}J $:Yɀ=T9NV 6X~b>> LU7 ,7Vd^8V:SN;]SV&N[ iJ=P3u<o0LBsF?lx$G"@H7F12^@U4sndq5@a V ޺ΕLMzvލV+ֲ$. VT"5q[R:Q ]ՂYRZs漣#?MQڪ&jKm9H"Tx(хH@7 vc7,M?z[\PÕ7'btYuAwj\tG=[$MԠ{͒RDOߔOrI#"^Ʈ r N'RM5VݼZ$L4}HTpE[2s4łRdR(@WyXy\FQVRьYag&D MNSM%Co[-U:GBb weYp)Y\4I܀#۸DkdYYHdRJ3K{,M(UUeb~UA<*1ןM-A5(k:j':Lv+MdWve{ v8#v=-Ds#fzG'DZ6wO98˟,ڠC|_ώ&a/xdٯn(́T 2cʊ`?z8=x˽B̚K.m;ac ar8XZW8n9ـ)68m-&$kJywuɦWp7ey+u#`8(z.YuLQ9kNx 29:caڱ;x$b|ʘnGvb#!}&HlFcd<&u LĔEnd!+yΠ`$Q*\e-Y.4*"N-Za aRR 9k~E]7:m."EKxyh㨍od޸5fNXG$l厚>Ҧ3 =2y߀¡/PqvvYކGGIb.~{<\F^GxjLwꯓ;m蟰->sjz\cMmO ζbɮy&Y*+el F',I %k?p8*n%z%W8 jFoq s\aZVՊ.'%5vh Ar_"PH)'#9b*Wȝ3G^Ped&7 R yb*fSNFF ]%HεGH%a@l: $Ra%*ŹpބKԺ6/J-7UjuRi"j]" i6M$!Fm6MztN*S "PuP\qFjC~BӶ8p|LmBRAR8Tq=nHgʴ>pJ3eRy2LD^Cb>̉J&qNVNZ)8X#vШP}yH2L=ڿM\.`x0|Ҙ?^,?]j3;)?m7Ub"UaGXIK߭v;>%F#(@2#v9U||H ln>#$ʼ mȒ*W4wO-%uM46@0i`.g5xm }w9?oԵi `Yݩ$71yo3"ؽT4.j;fP&vccCmjtgcۤf OCm`Ԩ1FDLb_e6fQ e~#CAmWFVJ2[x&_(UIG1%Nj6'u5Qz,dP8*+ V)Ra.Is,8H7[EipnMm9H"475dZn,O]9y(.9R,^ǩu.c)`t˟QX!7KJvqH‰ӹFĚ8g5׼5HvIfeWׂ5b/*dM2Z8$2@VeT"rnm.J`'YǬ3Zy*ySnԗP_9i,ub;ŘR*#Ky}bYJORJ# S 9D.s瘩JGK!pe匸?R0 7 N A} 5O[@RJSM!utc,Cɬt?7'9ymH:HO Hl-SǀtT?!-^yPݧ[K_} _ rWr_Wy2+<#1ouԻiUsI,3}O pqss49{qꌒ*]ɱ^Y} 1!s0lqD'{NSN|I9*>:y'g'Μ+T9KZ(A#ͱ] 1/`F>)M#irQkR͉WDB9GBrF83VN8 (x^B?M?G2F.^fJ^ViMkE@#\e.P YZ!,Bw&Ni_eR ur b~rO- 9f'OpB;xMRxBy\맯:e)g[`H0Wz .+, (ur]!H9a.,J]VL]ȱ|ȔTKsCrjМcz'KR4$4!W9?j;8C, 
SK9f܏DcTh}\9޾Xn)]2M'Tt{U;Ŭݩ1ZxyO(ya5Q;eiÍd*)>Py#/-%+!D2TKisOut*ؗt ;pxux' O>ӹ:JnŨmR%DG ,0ZT@ȁq*NN )rЅɐ}EoM1ݫbԖ8$ _DҊ\%+Y.*gJ~Tk_ ]\[o[ $f= D@a|{cz#%Li"D/,J)%JseU` " Q̹G\[k62 y!e XTF 9›`E 2b!yp1-}^aM|OO:o$+K|<^N)zMO_˯?<@ j5ì\~@!bpO|ǷV;K'7 h&Lx0̔Tш z%ka nMnoZ9@zzʼ,/&o}E9 2ڬՑ-sSmS;MzIIނMҹ4^JD|-M[o)Xi+8ED,X Y9΅mCF }pnPjTfY $GGA05o WwYTPs bT`IdRPTY ɻ{Q }H ~%Υj2j-vqRJ8(Mp6tu +@ ȿ68^stىcA G{h\E/5f>7fML>?Vj&"rq@%51Ej4/jkwcI#x ^ԎMo$a:690!- ql"돴###35S!I_նzSqGiz?P'_k-Ʌj(}U= }ࢼN{NEz9PqGGͽ~np.nЕZJFidk4,oGP{PkvrwIR a$'Ѷ ߆^{Ƃ;,{K+׋=ʋa 'Me؃pud=g~r?Wzu;Gc^)F@ NK!*L+v534xw Gl8uG 22#GREI=iod"S Fʶɍy-g ]Ya!ND'ႆ GGeE( G!@Dj#q5<ט0v΀}C{J! 81G\_@]ٸ6ېQQ蘢M{!iI͞FX_1vG=|fO7,PTXI@+oɒQZp,‹lV,YN9 ;X芫Sh=%Ď%ϩ᭘B2 /٢kHc+nآS}uչyf9#vp1K萧Zc،mEz?s"mH]Kbs!$=Xaf*m/~C*\ WԺݝ?<촘b+nPx.f(O'K+knFbK47;vس;kꖭz)Ǟ(RR"%A)Gb2Gx.j(?dy)mŠz:瑩%lz̨+.vu˺؀ʜ>3Δ^J,˶!954)t*`Bģn<AN6|M'ސlq7~I ($0b׏G(z@\bɹԂ^;iLy~kVy{dy2eS>#, ":XH }A?\8d"M| "Ak9Q Q a%W506zg% j_J$?qĩҚa:&z#ʕy@_k 97w "L2mBЌ<*QZM Q"45N53vvmLc쭜fY4э2;brehj\&=7 cQJ䨚 y{],mF[#AJ4oarindo @f|3׳Fp ։tfV›\d%<#̉Jgx#mPܝ2n؁4>C;ٗxFxxuhP)ڹSqK)&aޢ]:K)4)6VtKiaԡsc:yFAH4)]PdMպTAHiZ%N4>I1J)4)E]RҗX ')=j)0IJF)E!/Q}Nt*pR "{J"'^Adwnj/֩hMlHF7| u#"w0dܨ!я}J@rcߩYEՖp̰+5-FJh.|`$/5ɔaYB:Qf!#P:sV%+@_d[ZjŔuH-鯂HhԬ_ߨQԬoFkxvmPDTa_ѫFdJEa|ގA7U$*Lt)Xx 2H)\ `@">@Njc%mQRAXNRʊa0e hM!-4-a̅ɝ#p-ʛX0eշE]b|z3XZ $J!4 Ge>}g""YSQ?˟&ndwuߙ:$+&ɧd=jJl$Y}6_/ӻ^ON-mՆ%K*kp PX7 g֠Do|wӠTSU  .gqac[g+t^-* L(,ck}kՔx~CV};%.v̤#_[*!V;>5k)/Q}Nn!')=>)i5SY 6V)ÝtbJ.ϩ r^B\D)FɅ@/ !'j|E+먺mZ7[o.^Wk'/{ND: 3b} q"#L U8R,M;AJAjރuz%Lo?RTa'ZeZ *%>- WVoX\g'2Uv{]cD:΋4ʋXMg٘>CIs;{)!tAL.NxzY.mCZKz88݂AAbT֨/NҦ8]+pf]*NFfLdڮu1i{@N |!jz HY3 ǎvuzb]=mӫ_gWUK,ʊrgLe!A(U D:5BVzZGQiliQʢsY+}FKbT AF/X|Xg2.fcMl즎0EyE掰 iva{mW53wId:֊ ZBIEvJF+n}<)RkQ>jx7uKŏ n곥&--۾*zC$kkeIb;҇\喜hqlP٠HKA!;pl3 &*lP(6ͨr@-\AdOG:^^ƃ)NFmnGE/wԢ@pmxjfjSeR fLvlS$EJR[n4޺ouڂB>wCC/twAAGs&rctGZP|W_w)5hQieYh*  +1`az F^CG5%XX1BE}3lxQK $kr:Cz [Ӕت>dwv:'_t>LѐqtQݓ6!XG ;!aL7> , PFU54Օ0" \kߢAty$_y·|8yl"M]?^Br MMyF p&:"Pto9ߜΛU@ϛ>)MTIp<4R? 
/9R+/!*4'}GEOߪOVGOLOQRG眕}jH3UO՛==oL㑨c#Re 79V.@79)Q[jd0ѯO/R`A ;R&sx&ՑlCxWY>OՄ-W16ݖ>dx\:Im]4˧itFCtA#%MR_݆E3|JJ3#G9LˀLTH`1 |d'gvu Id=JCGV&s#)J"3y^F͗U#F׽ĈOؑ+8h33ҔLL3Fh +1IoM mqr,j9U>>6Ĭ8W1Q+:gᦡJi qVS5C8'l] ߏ Uβ=L7~[t}RB4\Zgٌx$Jlv^b-lQ?$&j-|%_bT =\o#3J%xj /@O %kL8Z T:T;dmX$*RR[I'ۍ) Q1LI$Z(Oe*\WJJ.C9bS?]՘+עp%T^`QyUY"xĬ&JXMz"=hE%7r:b*/8\Kt}ZSe&ꟷL{}݉-fi] `DvhI~xhAN5QZb4~4!jxIZ+ ]a@6 [o|rdBIX"aߤ쬉;őY5`zv4z;s'#;R">zQkQ@m]S%'*$ĸpEw<.fv'u}%CAJD2tQ/ ?uNr0Y:w8@IFK%FCux4'cIA1QU^K(+njU5rg5XmR$4~h7'm:#JTzl)]UJ;fj&UYډV5u5dB+cd($-gſ? dk"8F)8@8'êy0ˢa+Sau?'9 [s)2˃)ZsNrjNr`朓|dL ~blXh@Tj0J?ْ* 9Y\I)k6)KTZf a:LS\:pa"CVqvXORs̱8FŏV&KF1gURLH\+jer\KD8]mJe-|pXσ-_}zE~zn?m׏tionyVopbX e}ݑG1g1~o?zom\/SUxqQÖ$kzTeteHܥg+wiR}[@ G^OˮHqRb.% uQ|M~I/*!`KLZ$l%2 u W2A|P" iZ8 U"0M]HYU,vlW[+Y*[kb(+ͨՂ/vL'o(0z[Fg ͉D1Va K8p,!]Bds(}^El稔hӺ6a7s@5 62r.h'M"c?wݼM- v\TiԺRWUG^ !I2d0ȭj6:N @b)jL2*'J41 JK +36}Oύ:!PpU?/cW:p V[{qeZ~v(IaYGL)S! F2F#ʇRHYC7p͕BWՂ M ̺J1h`,O…q:|AF.T.!T {alMy/OˮF`§x//xERT eBW7 mŚ#?Q.,Q嶮Ə21/fcf雟? Hh_k0|_.VŃֳjWK+Kys!K~Rx @Don]݄vOGShُƏ?|^oNX?M۳8ȼ7 fvqz8 y/.?ZV) P<(ݲ_gPZ*1 +;q܉{)W[$FO|=Z*R;-rnQ7^~cc?@b8c8C_]H(Շ,^:OAO-I{  MZ3y$Pdӄ\>9Pt?wM)'t0km;][CϢJٺy]Tčj6l^6+`Fe=ۦm(mLcZ@d'`8j)&lLq lv/LOƜhjUy‘ThV^r+V%IP*y*#3CÜGi{lXÐ+?˸ gRz%8YjÚ*p')@刜&ySuho$Nʕ:yTFFnG,LFCe`ڍKӧ*WI_ѨkTָиFejʺcT$ QYG,)IтK`/a|咢DV!PY{![7g:]s0a4TV^"A`I@l4T|bqDz_1ݱ< EڊĨ eԗOVr9]@94s /4lP)z?s:]`zt+nctTUaCF 9)Gj:.`S: i%dq N2VX}srzḰQ#'R_.x;WvE x^?V/ o?U/ ?}Lݢ?W?_׫۫;?AruG;qwn1atA#%M3:#hǧ"Qv@UQ{8"CDgU nt<8ٕQxO:WF&gJ>/'eWjm9{ {.㥾ݲ#+KOK㪬Be厄R.㼔b9.vGHq^sSRTHf|t)٨={ zq^*WKҥZ^, xn5( Ȋ,HyFˮB֣էoonUn)=,t?w»FffW {HXо0Rp0޸Giz=߸],}YsOԅ&1\I+O'ue3YEYFD :RAWXYAZa%omآG/y<#AB+$7SCLI bHB,8G97R$P+W~]w7@G6_IYf?eGhFwL14p?ݝ00Zr:GrZ/3dfbx$PI9Gq-W~~4lO50pue!_uiհ)ym,bSt|$o4E3! o/ 41oIz:K3Te%gV*tkd!TZ mglCŎw8&b̌@͚jC1Wzm"<֘P'ڀ@S׶J`kjGFi滏`WX,8ZfT^[;XU p;1 }3[Y9}:n+N~4* WP1WR3 K>NJ jax%a-bnqv8-'F#79* ⱏYQF[&8*9B?9@NJeua"y*1PLI~rV2d3{((t^_ڰ |zuT,ۤVGŢKK2:*kTkA/ qTtt< Z a=3ImpC\ HS]JƄLmn? 
U TW9<|<1+ V.[=$BDټ(JYWhX&IϛWVP@^`DQ#"@#۰3'dcD6B:[ <+E]f -X}ңRYr|V2V)NBfJVSDImrlhY "x#dIbY9髾T{tS+plkfij'-9sWǂ4긣K6Cj2<ck$'3}j4)ВHWي'ґp%`''ґXkZ3v #\ X ZQ؀ W䜭8_p"YSmVl5|\.(#Y~: j'Ӎ=a`!8q'+j'x&KKdBk@GU@4FAŐ&(\kvL(Kxr\o22@)v2:j/"ccu : F4IXO&Dl!Ci0-FaPƷ4ba%w~n&L ZVW$F+݄hi2_>>Dm51&G7)xp*;t? j&B,n^>j. VME6I/IF%Y`IzҥI:1m cfQf/.E.ï']]Po+ǯ)Mbͨǎmɨ.flM{ժ1869 &Qfx`kxwUMz4a^$+b~o_r񗏉+| Nư(&cd/B BއD[ UJH_O|,Rіx%ZM8|%}!ziy-=zP!f2 ]?<3[T'N05] a'LLq!lgzh)WCrG8(QFGAJaI\'k7B`L^!gʰ:*qhti47‚0pE&6WdzAr 1 &&]1Z SG6>HS\{$ ]k5f|j&- TH>qbz(Xc./wN}v |^1 VOwk_ZLR}x W|}?/$ӕUlXTRww)~w(oJiF٥e娶/9dz z%ﮮһ2cO/iR>>cĴS08<}[{>RZYr#2³*6 iVk8|7g{kiZMj]k޽ݍxnAv ~fH)OCLel`gnjm_*TWggnY9豁/\n⇋ERx{w9_Ĵp\6i9~{PHa~j>{6}ev[L]1%:-0^P ϤK.n'~Z+XVCUY!Ӛ7ۯ/n绍UYx!L;¶x7#c";PQ@cw!n#sכ9B#BEcd} '$dg{ߗ滧0L hDm Qd_'Ī7,[F=7ᣟbn}H|\m˧'7ͷF:0'÷rɵInI))$ަ=i3wc`LtHa@L*O\8a)|ف^y(g41?WX-M TZDBlf~nǼoKZ}+J(TXMNhG!l&ڨ7u*e_kPLi r)ڈ&ʦ1D>62X4PHޅƱABD}@Y9$kxLAE-$/X\8Ğoˢ7>IMG.iN*}d3m6D%h`*cHY%W1C"[_{X'0[Scy]tۺ]umkQVTI l+U8Tt:D3MD׺s/@7"h?W.oc3^,ك*RSGۓÁ?Bu^JN ,Y9m#tCBn SI O䗒MNL,})AbT}kh} $'F6Â~ѥ RGh]J7L)He I 81iLv)YiP 8 <& <}Vf*iq-pGgn3q!ln4~v7o$u͒Z >z{9'6m;8|ٕb!խѨn\MAvXltm\BywmNjf+zxnyciRX|zr->M(4҇6Nŝ4v VJ߽{pե_،];iKhu X D2(1A-I<ܹ-QM:;r_Ce\AW ñse{V}ӷ"OvEى^K-NRO͎@v!e#5oWӟ$mA!I(N# &ygS° {\rJG|rǶ_ | ˁh9bKq㻳eYېwt3K{>Xv3ߜ}$_|cg~\xMjB;&|S瑏:* s69OSg'TJ]^Wf,u`}6;.P3a-6jXs*xuN&>徜eO`ln/q>_\ePmeu2 7 7AZaRkuU ͇Yǎ/0 QxFK\7797iPdeՀAek{1r>jgܗ˾$͵Rٗ?I] B/]tQ?)b-'ҍ,4B&rgٔ0$YMQ7ђ 64)DCNH1^::"л$Y(G9C"A0Rc46,4n(Fkk!4 KQ !q䢌":MЍm+v,iJmŹiGT(D 4#irV,vѮݷ"7vʾB,Z>5~8)xohI(}C4*-..iU(5䧈^Î3zIJ\b`YNY`GC nI*/W I6 FEdv7U+FA5Be3 JkrrP윭'zJlJj%w(Oco8f:k:B- 6J]iflRL~n3;H"hj#'"Z2P VQ d>en" ۫T:1zp07 E3ei!LlE,H8K(jN&F Dd+$pV4XVD$fQ @AĆUFBͤ=r^>ȴ?{WGn/wgRE}9p/AAxaZY2$%?gFHdgdwXvowX,O, qJlk -*.i3~F,QR1)nŇt֙K?ӧi3$ l1b%g ~WԒbޗZS5g Hc)&S@?szwⰕCBf ) l-ohF!dX')g#~Zq6 @3}(MPPg\.8"+!^@f_B7fI<˨gs@*ﴩpF\Oy~ƞjJ';SwpjNx/3w&oZUd=<L(=_]?|zv+cTV{bwf>+w~(fMʔ#?)OytP%py<ޙ z~۹zL sx,P.<6/TIQmRFRT ZF# f2)!88F铟o&1 
Q:bZEFI#T"9hpiCf)<;w*tGn:qoD#^Dlѭ|ݦE|j,cތf$[r\%혻2z>֕zvftCs)̄d8ePީS-rްefvž|B>|N=@Gӝd?|Cr?7S'F%Oy__WK$җq7#Ψw>}~\פ t&.ŝ|DﴟeƇ/)\M .Ի~x޼Z+`N=edL"d#&nBCHmi?e, #7rur20ͦ-d-gv"%*3UJ6Rß 6"9e!?$dH?hǙ?/6?ayguA޻A"gژZ01㜑 .atܶdGPS:ƄVKp{qWc3hr;LvdWqru).i*lֶ'%vVoT;Fbl-FCy2WjNf֨Jt,ԿXsA<6d̲((Ca)b $S!k/|Sx\B2~4* לsU mh:3iA4وhK` đT/W_tE- 1F61Fs G͞ozn h!G"t=A^Uj!o]`6`7 9i olx% ;]T&Mznr$ˣ .B%mL1: N3+3;)[aqr3nԔyIK@+ ܙB%A>$žaS5h6#*eҚ Ñ 7wvA pMP# OQ_1҄; K@G mixz^15-P-d$x2w_O,-4,ЀTд>#66{,$`^@CqSfԜm4F-Rs(۠0qc|yDBDG_l 5KEI 5k|}?s1|ǣJD`!'$2Rgtbw4Cjj?Th4wv̓Z?e?s bF#|#zTN:Vkc恳5 }ި@3um'?ğj(QҔjvQ)_RMT6T.QiB;NǛQQFdy<ҜJ~%{B,)$P*~YH XSzvw4zCT x%fcOGyq=z8{cQEQ)R|rѬ{TVeJD:(ztU3XN #r6hWs@`a 2N#^N=&\5܁iZjL  gִǭأ7QfOLZ5iwLpF #3ϋXQp"N-n}:&S"x @IaC9%314aڛ ?!շf-a,g"Ё 3^484m.(u`Txyұ[W<>De>~%o(c|`0&l  ֍cpMQGdK#1Zn:1"jqO[RCU'$҃@3Nw.SޗFO=_ C?ׄ~7>&]r3ߥVOZ긴Wi/WwkEӿ_ݖ0a;;aMh҂8ӆbRև C GDDf܈[mAկ=&Ah~Gbpɫ90G]@lHQ 8є1 Kqь9>[)0A>0./TTtzwTI+G>3/79E+'<#j %.ETb7WW5tjq52qVJO3,f:Fgžζ0v $fy@̙ra Bܸ }hٹN.iKVV!DC<:`N+z/ﶢrF$#ᝢ ?TFqlH[Ī1oZ9f=O6ޏ"kiaR.]|zv+cRӺZ~l,sYIjhzYᔃ$X)f%să9(/j7. 
U@_rM{Q@_F*34 1.}Kh+oKQQ %M8] @#*hR0 Ͻ[RKf131Ӳ,ڷX_ o͛W7ݲhRb~7x?/WݲKjG˼A:.qk^wޝL<69L`}fɳ;vsL_/E\>Yz5n`MDhO95Qt4[\eXyW4ct.fGz]ʧS8Esbju:(nGEH8ꋧ2OeJ5= ΚI= M&Y^j3{f$,TR{'a-1ԫ{k@ 4~H`v R 17*#ԟ?,ayёgu>m i}.XE1^pj4ld;q83Z,q:<8&PS:ƄVKLR_J|z})/4/e=em)],A)>}Y'KA?2ŷL)T /yʼnCyĆu\aI.̧v!_ w<9c%[mfjI6.Xn5Q89^ *k̐&6O3)/_}viF6De4GPuN@=@)uBM\"LUlޏ S%)+l+HEl׮vDs[)/P[ι8E1&R{$SP_tr;u)M)%yjJkDf{{(mo5P3p['8NN'RNS0TIu#x.YwғR4)8qG!iR@PYJOZJѤI)ZJ룐ҧBm;q)_r@m+.OYJ!1V)SY;a#HYC 5m `ϣ^+Gjp@Nz/#U(eWX{$M7DΆʻ[L=|.|쑖]* !"spQQQ( vҁ55 R:0ZEB3S͙GxJ\kT~Joμoԉ!Y:aKF-{NjL9vE.Ԥ+gTeHEJ*_orbr$郉tĥbV/Q7)%\EE#K}ak(+ ZQW&/ :k{VOY{="Jӌl)s3/EϿ/)_$1j;^ت ۭMnt5`ON6'U~@M~7/o56lJ!NC~kzOhF>F'=1iz@km6Ni&U'ܨ \i.X̭heeT֪Z,|T:(뢩NGzI),یy&hj6x4|K^iCulUqg+^zT DDPD3ՊR/ :`@lQp0H<<(o9LY?"M$]uG} GP;sd#=uޞ^$@pN\" 83*P+)?Q͝"蔥 x8#E*WDVa 9› W-w]]}]=USz~W4cViʓJW ]?/1j])~J؛f?]棿_~/o_fm*) 7k|wuᦜ1.zi-jq~VTOg"u zmZ|^w>ŵjĶWsNc<^3~oߥ/&EVFY/Ԯo4̷L\.#{&ݼ}5 (v=ݧ>}48$@KNfWscQa*j8gǠ_tnN [YFv}6ꐇUo}F̸?c/k.kOCpN)eף7xM"uPfx*2{ j^<׃(Ld61aa2#\J Ri!jod1~zuf+l<@XIlZSaPYYĺ1_0&=O-~>b,W:\PWO`LTڇwtdseٹ`)D0UGS2ivJ-ϽdYlĞDH ]"g1u@?Vqehg{Fk8^_&LslW=f`&- NWBo*DvčI[Pcbk5TusA'{N{NYaGs07XsC6G"6jΣ|N;E#L(zaRB\(J;G Bؔӹ`A2w^5%&7ޢ*+=O=ʼn ?an$|R6m j,3Ta0ѝsӖRiR ѥ`BJ }֥-HiRT;m R& hyiK)4)](\6P[g]zRJsh=]an}0?neղG'h8(R~ [>a]gFa K%1,J%Qx z<$\bJk4H7O;@S|$m4ֱ/DX㔙o`ۨ05؛)o fNgT hc Ece ǼpȾUn %aQZUa#<GKs|I,C'br=u%myMz8z$Д85cyg,}*'{>3 H=?v{L$̀`ļCiroƽ>(䐎`DT.Z!i Tʛ2Bl^VEW<]?J,[qe%*,s>m`A(!DӘ}R{'*媨VڀMyUӍi8_+ ^u͢ąxg7,5zN!=c5G=GjFvdY$qSK<@!Q{Ie8"Մ3|ѷ$3u4"~~Z=' ؜29dy0xKa|!,0Iʵ#R޲5}نxl_0'qp;^>'1\յq\=1/s$~SnBg |]ӱ վ&x Sj[h*!&fyLwh][\.9k/1S;'%(XkN~{ҋ^sGc7 9O}OGtqP*짫GW=0Ԅ%"!HѢ&D/sFTD$_dpEElsOg5`)-tG]}'2)X/i {*+cOVǁultfѱ& %( 06k Ü)]O~ҽ RTجЅd4d֖&]x >~\غ Ly~e+2JY&1h@fʳ7S&Ě-#qOA6r&E9x_iЛQsދ3 N8xPiG:erYCnωgԃ#Nt{8Oܒ-.}@u7R kyC5VQ^_hd'eǬW~YjFf6 חDq!T %G3, (D%VN AI([sFf.?Ur/zn(wow;}Ħjӥ]pk8\^uj GAY0_:7r(e_En!+%BU~$p2i/mF.H\r41ABӮx8!>ʞ8Q11rD~T,Ur"ʠt#{|Wǣt!U *18q,BБc*TI!:\n/ E5*L"fVPzx4v~$AoIhLr%keōPUS(\QbLUBX%A*-ACg4nKI| ./~싿>G#7Ւhuף8-me=0t4}(/.nAvddp} 3ej|&wq/ؑz<@~ϯn: 
QIPX_gό@j"iFv^L!4%$KyS`>F񐼹Jڬ\6{a/~_\>SnzgqvZ^_f߼~=P^}[=T$$$8K2aH9xv2 06X{_v`|5  Y3/mFáPyávJ[i1!1*y>J58PYS/SPuUoMzxWUë󕼽EԊuхU+['nv:%z$^SC\ NSoۦL7e|wU'<꟣iWo.Y)o~1uoūUtnn6yR;kZ;ٻ6r$rk"IX`f>!`h#[ْb?%; fb.֯XEVף67[>fMM첋aCRzsϽ\:Z\D&Xn6vF֔)UiDֆ"H_SaO)ݺ?vkʃi*휎"Hݚ/&j6$2%Q);a}^UÖ{/&aWljaS{J1uvc C q5L ޾")Wf2(AɝSN+Ù"מcFS=09߹IFvc%qc;ԫy"/(.r2wzE@XbeI,E'%f"=c*@*JYsWX"c Q?*Dg,"i8MMo}.(.Ud#6 !z[~$N5W@8*GHK5r˽-8E!<2X"KF W˥ Z*d\GD#82`"`Vp|XHGWbV0й|WDq.t۸|?,Sm/H"$j@ŀҒLL~$a"q&Bן κjݡĶ|ɬӯ㹬p4X %lN3 cّw\w/='u ^.8U}S :P"l^q 8GWf(I@ qqݽ5!(t%j+b#g^#|D{ U/UDQZjC*S,^6m'"V;~O)r0[01$ Dl4f2rD2B`";jr^,);Qh5!=&>6o)& BGBcTH8WZ91UǢL0ëY#6r\) .3 2F@d0Yf8W3uJ*#r@͵!k0ؽԪYsmvYشgB_ǻѼfٸ+?< ~$bs\~*ڽՐHrFo:zO͒8mN쥲bG<%3f/)Ί2i}~ETބ7[rM}%0ﰉo6yEsXg\F듹!}\TOG8KBv]DH҃I?f9V=IxrR'&H‘xYQRyPd%.ݙVMd&A͎?oZV%]UK|aY/{:z1HKAʱu $+#i.)J[!bh$Č4O 8ޗiZWuɩ|%qe߻oźԶ6CTy-c%K.;hoEDw:?76 GKf?ó~q %!tĥ~c{G!}# _Y{7hsS^`V!ooyIgl:;_kc/ ɘ`8SR 8bB3+fGޱ+&%B)ܽ1GN^kvKB-A*P>*Ź`ۏs!tNFSZENVXFLzYN^'tv mZ@r#H~3ppWyfDQu>eز:j'뉵HHZlp%TjI%R!lSUb)UZ%Vf$sq w]2 u n>n2u*3 w:e kΒnou:n=N-ˎot:;3gӘә 0^ W$::\=QG@]ݞ]c)i]=l59Tv;- :;*_s}$DϻkbOSt(fTd\m();51Q [\9cF)B=av5Em&mܮƨ0TӶԖjaJ+ Xہg& z@BBtlQ0}YV1K,/ 1T ->"4бlQls>sӯ8],n%o?~*F)B$? ̊GW=臧dfF{@fG|Uxz;no 0p3=/PCc)0B @:"h. Sa0XD=rI+$y9*G 垈ymўD6a֯\2T9Ą sk5a(!'F`Df4acl:RU{XuGQVݥT8oqV SA|;:8"bV HXI<4I,$;:%eE?\`b {+f%>7EcfVVW@! 
z6̢oY0*T $ _ti} araqz?}쨡2Kl0hrCAl,] =W m4I| yh%{ ۭ\7öqWāu,Q4 mw!55U!PMu<\&!H*P ci03a:!H`(,G6K](ғ7RGVzڅ;m^ψjU@;lT5}̃m{O`6Pw'ӯL$1;.CkPM%-Q^L|-V+__/Ga2Z*_Qc1^(=ww hkE&F"9qx{T3ՁwRV MWAt$L8d` Mw7'2FOu_&?,vz>;ޘyyӇN⃦4~n'nV|~͹0ݗDCy#a5_b@n}1 31(ţ꽛\z[\DT]I* Xd[SNwTn=b}LM5hvkCB~pm"SpUnBػ5Aalj{0ݗDYXN4}\B~p/SXv4_DYE&HB]2DJ9b *+0GbS\׏:71Ûa@;lvY(zNGxLOGjh\-5)HW=իo# v-15aq3p-ÇSHT|4iF*<~KmLyZ|zЏ~0z!8gV4~ r [+60PܧL*;x~ =  %쐸cR !?p~1lB@);(Lx 2^Frv 0Ŵ/FuqQ!rjTKA}eYOxfx[qx˘q}s{ELgvݓ/94"C)7|4D04y#"]ΧɼbG{v&!Am܋o, \یd<˽;yKYYJy}^TUJZ1J#ebSK؇IiWؤ$ńw $k )2r;&S@hn1Di-¾D rIO*zbv^hoD> MJ[čВltRaԊ\zs@k &`7إ,`NQھQԘrȆT"^ݜz0}*L>t34[Zf9`KUoj\& NM^Jw5DUQj.*0i>D8[8\*1 8"Q9hU0=/l%lOϮh)*vy4}*)Q+z|^/_텏k +)0n9~l1Cy|ƴ3Ǽ1r8 G4q4 xzي5b)Pq!SK2 x  [wx0?{Oȍ02;Sl܇b{֣ٗÞ ءQ$c@bBDTHd&2yB4[FXx__= 9]ON-j /?=.67MH% xO\M%ǹᡖJѺ6HX76"7wkqN%uw}i- U@QH HAVƼ~WN"Dy˯צ(X %P-҈{y[wuP1I+f|9DZOP.z=*jȅI`c \$1s;ix-1EKm ,r.dLpSeɘӄ)Rύ#X5.9W"<1NWڎ]E!X*G@C@U J=2r$D #(d@iK'4,=roc'ЏoK7+=z:×[=u=/1r6jK^)߬k)#SaZ`"Y028pT)95iFƅ,$+rQQ$5q8`*Bu.KfoMAADsD 9"p*M#D0%aZQCG;.($%\9a_0ޢ{v&)\dznt?U+qhZ3aN 9) B!VSB=AqHr#-H'DD(4Ycݽ/cx#o0\q0ta;Qw48(nvg1o!Iv^p&|tO "wkT@}[νil:#f/}=ǃTŦ5:>_xt (ubbN]# ~Ĺ1PF<$0F4Q;)b:<ɣRD\L)G9űy66juysӝ9ŵP'3P#/thľ+|Fd %6O 5Ԫ'jQ`IpXY>kTF'XI;Tc%"B n29eJAz:2dV2p(y(jB=ju#n]GXkTM OS u:!,;Wn5$k{s.kP-j?x-ɦ<^=Le:q˽rVy~ gWg$Z\*:t=({4_8"I D,-(]!u3HGR$M{ZH%\\pNs1oiFsăL*yf wзU3R/@Mk66dr,1)$6f@.I ?<!Hå*fDގ̱OmZ &6??>y:7e?{҃?ݿOW2w7\/???,;N]<̧ 73M#.Zlُ9p=,1Qmr^6Tl?MCfTiAt&ާNF)h!Ad Yi5@Fަ)Ha%V 5` D0n-n}Gr0H˶4Bh,Jവh-¥;Y˕ԉYa!x޵L`N(&h Lz &1tgFz(ևyL.} &BԠpq)4W=]< Mj.=C.2K\j|q\ eVPK/\z\Q`qIp)Fq\ZA-b.=k.%TӐ$R㓥'9s)n@bQ*gG3Pd ZFIQ,3rKǀZgҏ^sg sӀH0~s"4 ZK шw̒Lb @f b!+^+A# #@k\ar\jl UM|Q+1B6!LU{vl`$p >3Gu+'5„f%a*vQ qW!./ R"-:lxHa"KdbT8o(aF!DdTh6}X/YxOsq> c5M"q¬ 5J y-#pYȀ#k >R񆽥(24<7I z'^mcz6|+>gj[SDļv{|ZD IUr ơ;O ySV;il6׮ͅ:壏%74z%H?,k f@+̻/laV=uc`ֽ~>fr)4{ P@fwK8XS+..jZn8j8(Duuj/3$9vB peK^އw_="V4ӎj_pNR6PKvig~M1:)C/ ]2y/« l!v$ʔ ,΀n#j1 4{ a=H$`n!zO$XmP{x )_j&k5qMˡ P&ܭ` ?B\L#Pu, |,t߄96pmɷuN^ FJ)VP (J F(L2TjnJURt &~hr ~,n' 
]O{u>ɛmYhUVyFӢZs+kVފ{x?D${)Kex{oGKcd]LG\6q3sS!]b!W <04=N3(K<&!HG r<…*1ѳ/d~]Tp/#7!zCV}VOO\ M?©WuCt^YY}|| 3&WW^y}xlL5$^mzϞdJDX`Ʈ&j۪OS Qo jOO]ac M"wc4/]wKa;aO7U &BbtϟGZ`qE!q0rͬ%"%2bPZ8SJ%M#Ēt?=e }i dUM;; d(擰;7:зꔀ,e}׸^-Ȅ/_B.m&~9}ezWu\"eAL@7||3ouzȐV:S/=և<`NMA$2%Py_y^54[6MF Y, 4Hfxք:.`yԊq1뮄'٠ D'BYq_EYVʬ^K 3h5k $j)}PaM}u5?N|ŽR"zfO4DbFPwG2!~ޟ^` M\RD*_Ĕ"~Y7on̩ZKKp,lHO\BfI_?̢g/s Bed3P 0HY+穾E56 5Z+ T*IONԺ}NQ s$@ ֲX?N`M:*b>=DGed#5rq+b :~.Wb>`b::[D+]uA 0mCD!Br; d* LZv Gv~|~*0x I]wC6ӺZ6?"Lke 1prC->Yq׫lQ%KJ`m8kS|{ϓߞ6'f| _OӟZ=hP';t(';xEB*;Nby\(KR.wwYy^{i1tqS,]YGp̐VL"TτNM`qYy<۔/Щgb V#rCvMyHNug/JePkT]kjz)dD\{FR3|1Baܣ ?!#3A#?Ҿm] 1m[ e:}J[&|=K]<7m~q/7wbac 闧lY|L7ZUay5QxGC6[GuNEC{n9FeD$!pu)#;k y͕6:F֝157ni͇EEཷ8_i7^qv1i7+͇E@L =n2}iqz;"-=|vnQ(ov6o7Cj{v3䑚<^d"b }8d^c)vR 酔jIp뺥p;)%\K{!稾?ZR)j)҉a?%`FTܹ@z#ݨUs 8㝚54ȠXc|*P~0|)m7JV:O*wB*J.or穑OST|:zݏ):Z.OEb8 >r e<# >{+#{ nL5,̯s6Gf&8)#E|DSr=(Ť{ͨkeX;b PgXME>˒3e0)KEZW(78y[^8< =7G[U֎ P>FCd7t@:Mg!nvQ~H_[kt󹑶76o6~l]!7¹U*$-栝{ͺnMԄbE<kn:]_@U0PN^ou@ ;Gfǥ~Ž%6V}wY\R-R6>c^CQw'݃^iis^3 ]u)qlrd>\ps%l=K ;9Dy.!:%~%2pojX=|I=I4B7=Glbsؔ )x!BPӭw O!Z.B g0q}{(s^ʺ r9(ƱT\esf9.$ӆv< [!-T~E[ GZ& ,MQKs;ktK^s~ovlI`Qa6n;Spt@m/5ND $o:T`!Jd B37O* BQXHƄXs{摌 O!8ex;m-me>j%lV0N>j<(16bphWυz;ewl |0"NHm?88.EaCc^Q'ybg81(,Aq  3S F8q5`Z'FϻFeZ|UhZtTrlyyv.cFӞ6(1N4]?G+9z_y/q{)EDM{nfO@q%7`oכ[|U&o7rY79 <[Ub>bY}(n{J^C)dL)sG)3!I[281JԮĈ6%DQOp! H(L@e;pe4iBDJR13FoyNCp+CC_tOnrkKECݴPwG9/wA]jݿ~c<,^VYOxzQM/@xb@z6S6ɛ-+53G7~MA 0_Bb™ UKw||>h|` b:sAL$:c"s# 9 K/hyp AusSϷ)BZG+D&m$m.9/Vyhֈ{ecC _n5!̡jM=$%tt?(7h;-:br`; esgZaРқi5`)} Z 7ol47ˀR7Ĵa#^s(|7 [D6弈[:Yy[z{: (.kiJSrX8!)WkV)>^f5[BfJ3-8.j @{q[պ&rJ*Jj#Z <HN^FtVLgk?_nXY R dF"MSTvR]{s6**Z2z'Zg8KlmȒBJ\ݯAR@RD&AFh,3`H$#zxBTbt6F;sF"ڍb TkwƕT}^gikagp|Β\_&Sc. 
Mar 18 09:02:12 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 18 09:02:12 crc restorecon[4762]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c476,c820 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c263,c871 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc 
restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc 
restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 
09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc 
restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 09:02:12 crc 
restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 
09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 09:02:12 crc 
restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc 
restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc 
restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 09:02:12 crc restorecon[4762]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 09:02:12 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 09:02:13 crc restorecon[4762]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc 
restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc 
restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc 
restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 09:02:13 crc restorecon[4762]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 09:02:13 crc restorecon[4762]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 18 09:02:13 crc kubenswrapper[4778]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 09:02:13 crc kubenswrapper[4778]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 18 09:02:13 crc kubenswrapper[4778]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 09:02:13 crc kubenswrapper[4778]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 18 09:02:13 crc kubenswrapper[4778]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 18 09:02:13 crc kubenswrapper[4778]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.891437 4778 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896321 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896351 4778 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896361 4778 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896371 4778 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896380 4778 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896391 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896404 4778 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896415 4778 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896424 4778 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896432 4778 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896453 4778 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896462 4778 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896470 4778 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896478 4778 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896485 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896493 4778 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896501 4778 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896508 4778 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896516 4778 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896523 4778 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896531 4778 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 09:02:13 crc 
kubenswrapper[4778]: W0318 09:02:13.896539 4778 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896546 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896553 4778 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896561 4778 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896568 4778 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896576 4778 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896583 4778 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896591 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896598 4778 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896606 4778 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896613 4778 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896621 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896628 4778 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896636 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896644 4778 feature_gate.go:330] 
unrecognized feature gate: GCPClusterHostedDNS Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896652 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896660 4778 feature_gate.go:330] unrecognized feature gate: Example Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896669 4778 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896676 4778 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896684 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896691 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896699 4778 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896706 4778 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896716 4778 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896726 4778 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896736 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896744 4778 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896755 4778 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896763 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896771 4778 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896779 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896787 4778 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896794 4778 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896801 4778 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896809 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896817 4778 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896827 4778 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896837 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896846 4778 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896854 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896862 4778 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896870 4778 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896878 4778 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896887 4778 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896895 4778 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896902 4778 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896913 4778 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896922 4778 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896930 4778 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.896937 4778 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.898936 4778 flags.go:64] FLAG: --address="0.0.0.0" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.899038 4778 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901133 4778 flags.go:64] FLAG: --anonymous-auth="true" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901247 4778 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901263 4778 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901274 4778 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901289 4778 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901303 4778 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901313 4778 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901322 4778 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901340 4778 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901351 4778 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901361 4778 flags.go:64] 
FLAG: --cgroup-driver="cgroupfs" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901370 4778 flags.go:64] FLAG: --cgroup-root="" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901379 4778 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901389 4778 flags.go:64] FLAG: --client-ca-file="" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901419 4778 flags.go:64] FLAG: --cloud-config="" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901429 4778 flags.go:64] FLAG: --cloud-provider="" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901438 4778 flags.go:64] FLAG: --cluster-dns="[]" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901458 4778 flags.go:64] FLAG: --cluster-domain="" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901468 4778 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901478 4778 flags.go:64] FLAG: --config-dir="" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901487 4778 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901497 4778 flags.go:64] FLAG: --container-log-max-files="5" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901516 4778 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901526 4778 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901535 4778 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901545 4778 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901554 4778 flags.go:64] FLAG: --contention-profiling="false" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.901589 4778 flags.go:64] FLAG: 
--cpu-cfs-quota="true" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902415 4778 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902441 4778 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902455 4778 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902488 4778 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902502 4778 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902515 4778 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902526 4778 flags.go:64] FLAG: --enable-load-reader="false" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902539 4778 flags.go:64] FLAG: --enable-server="true" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902550 4778 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902568 4778 flags.go:64] FLAG: --event-burst="100" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902581 4778 flags.go:64] FLAG: --event-qps="50" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902593 4778 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902605 4778 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902617 4778 flags.go:64] FLAG: --eviction-hard="" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902632 4778 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902644 4778 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902654 4778 flags.go:64] FLAG: 
--eviction-pressure-transition-period="5m0s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902666 4778 flags.go:64] FLAG: --eviction-soft=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902677 4778 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902689 4778 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902701 4778 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902713 4778 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902724 4778 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902736 4778 flags.go:64] FLAG: --fail-swap-on="true"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902747 4778 flags.go:64] FLAG: --feature-gates=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902762 4778 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902774 4778 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902786 4778 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902813 4778 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902826 4778 flags.go:64] FLAG: --healthz-port="10248"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902838 4778 flags.go:64] FLAG: --help="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902850 4778 flags.go:64] FLAG: --hostname-override=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902861 4778 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902873 4778 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902886 4778 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902900 4778 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902911 4778 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902923 4778 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902935 4778 flags.go:64] FLAG: --image-service-endpoint=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902945 4778 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902957 4778 flags.go:64] FLAG: --kube-api-burst="100"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902968 4778 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902981 4778 flags.go:64] FLAG: --kube-api-qps="50"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.902993 4778 flags.go:64] FLAG: --kube-reserved=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903005 4778 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903016 4778 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903028 4778 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903039 4778 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903052 4778 flags.go:64] FLAG: --lock-file=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903063 4778 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903075 4778 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903087 4778 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903109 4778 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903120 4778 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903132 4778 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903143 4778 flags.go:64] FLAG: --logging-format="text"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903155 4778 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903168 4778 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903179 4778 flags.go:64] FLAG: --manifest-url=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903190 4778 flags.go:64] FLAG: --manifest-url-header=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903245 4778 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903258 4778 flags.go:64] FLAG: --max-open-files="1000000"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903273 4778 flags.go:64] FLAG: --max-pods="110"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903285 4778 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903297 4778 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903308 4778 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903321 4778 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903336 4778 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903347 4778 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903361 4778 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903393 4778 flags.go:64] FLAG: --node-status-max-images="50"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903405 4778 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903417 4778 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903428 4778 flags.go:64] FLAG: --pod-cidr=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903439 4778 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903455 4778 flags.go:64] FLAG: --pod-manifest-path=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903467 4778 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903479 4778 flags.go:64] FLAG: --pods-per-core="0"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903491 4778 flags.go:64] FLAG: --port="10250"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903502 4778 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903514 4778 flags.go:64] FLAG: --provider-id=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903526 4778 flags.go:64] FLAG: --qos-reserved=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903538 4778 flags.go:64] FLAG: --read-only-port="10255"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903549 4778 flags.go:64] FLAG: --register-node="true"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903560 4778 flags.go:64] FLAG: --register-schedulable="true"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903571 4778 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903594 4778 flags.go:64] FLAG: --registry-burst="10"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903604 4778 flags.go:64] FLAG: --registry-qps="5"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903616 4778 flags.go:64] FLAG: --reserved-cpus=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903627 4778 flags.go:64] FLAG: --reserved-memory=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903641 4778 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903654 4778 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903666 4778 flags.go:64] FLAG: --rotate-certificates="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903676 4778 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903687 4778 flags.go:64] FLAG: --runonce="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903697 4778 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903708 4778 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903721 4778 flags.go:64] FLAG: --seccomp-default="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903732 4778 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903745 4778 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903757 4778 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903770 4778 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903782 4778 flags.go:64] FLAG: --storage-driver-password="root"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903793 4778 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903804 4778 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903816 4778 flags.go:64] FLAG: --storage-driver-user="root"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903827 4778 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903838 4778 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903850 4778 flags.go:64] FLAG: --system-cgroups=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903861 4778 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903881 4778 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903893 4778 flags.go:64] FLAG: --tls-cert-file=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903904 4778 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903919 4778 flags.go:64] FLAG: --tls-min-version=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903930 4778 flags.go:64] FLAG: --tls-private-key-file=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903942 4778 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903952 4778 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903963 4778 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.903975 4778 flags.go:64] FLAG: --v="2"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.904006 4778 flags.go:64] FLAG: --version="false"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.904021 4778 flags.go:64] FLAG: --vmodule=""
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.904036 4778 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.904048 4778 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904386 4778 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904401 4778 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904415 4778 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904427 4778 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904441 4778 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904457 4778 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904468 4778 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904478 4778 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904489 4778 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904499 4778 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904513 4778 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904526 4778 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904538 4778 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904552 4778 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904564 4778 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904574 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904588 4778 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904598 4778 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904608 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904618 4778 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904628 4778 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904640 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904650 4778 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904660 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904670 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904680 4778 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904690 4778 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904699 4778 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904709 4778 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904718 4778 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904728 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904738 4778 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904748 4778 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904757 4778 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904767 4778 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904776 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904786 4778 feature_gate.go:330] unrecognized feature gate: Example
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904799 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904809 4778 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904819 4778 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904830 4778 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904840 4778 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904850 4778 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904859 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904869 4778 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904878 4778 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904888 4778 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904898 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904907 4778 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904920 4778 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904930 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904939 4778 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904949 4778 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904959 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904968 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904978 4778 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.904992 4778 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905004 4778 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905015 4778 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905025 4778 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905034 4778 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905044 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905053 4778 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905062 4778 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905072 4778 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905081 4778 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905090 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905099 4778 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905109 4778 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905118 4778 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.905127 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.906253 4778 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.920355 4778 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.920412 4778 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920569 4778 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920589 4778 feature_gate.go:330] unrecognized feature gate: Example
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920602 4778 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920614 4778 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920625 4778 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920636 4778 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920647 4778 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920659 4778 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920671 4778 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920682 4778 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920697 4778 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920713 4778 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920724 4778 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920735 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920746 4778 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920756 4778 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920768 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920778 4778 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920787 4778 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920798 4778 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920808 4778 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920817 4778 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920828 4778 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920839 4778 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920848 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920857 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920868 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920884 4778 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920898 4778 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920908 4778 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920919 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920930 4778 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920940 4778 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920951 4778 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920967 4778 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920977 4778 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920987 4778 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.920999 4778 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921009 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921019 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921029 4778 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921043 4778 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921056 4778 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921070 4778 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921084 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921122 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921134 4778 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921145 4778 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921156 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921167 4778 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921178 4778 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921250 4778 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921262 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921273 4778 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921283 4778 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921293 4778 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921303 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921313 4778 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921323 4778 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921333 4778 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921343 4778 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921353 4778 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921366 4778 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921378 4778 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921389 4778 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921399 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921408 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921417 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921428 4778 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921439 4778 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921450 4778 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.921468 4778 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921747 4778 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921769 4778 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921781 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921792 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921805 4778 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921815 4778 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921829 4778 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921841 4778 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921851 4778 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921865 4778 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921878 4778 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921888 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921898 4778 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921908 4778 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921920 4778 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921932 4778 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921942 4778 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921953 4778 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921964 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921975 4778 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921988 4778 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.921999 4778 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922010 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922020 4778 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922032 4778 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922042 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922053 4778 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922063 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922073 4778 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922083 4778 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922094 4778 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922104 4778 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922114 4778 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922124 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922137 4778 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922147 4778 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922156 4778 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922166 4778 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922177 4778 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922187 4778 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922232 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922243 4778 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922253 4778 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922265 4778 feature_gate.go:330] unrecognized feature gate: Example
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922274 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922284 4778 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922295 4778 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922304 4778 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922314 4778 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922324 4778 feature_gate.go:330] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922333 4778 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922343 4778 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922353 4778 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922363 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922373 4778 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922383 4778 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922392 4778 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922402 4778 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922411 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922423 4778 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922433 4778 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922446 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922456 4778 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922466 4778 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922477 4778 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922486 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922496 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922510 4778 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922522 4778 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922534 4778 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 09:02:13 crc kubenswrapper[4778]: W0318 09:02:13.922546 4778 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.922564 4778 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.924527 4778 server.go:940] "Client rotation is on, will bootstrap in background" Mar 18 09:02:13 crc kubenswrapper[4778]: E0318 09:02:13.931276 4778 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.936030 4778 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.936249 4778 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.938928 4778 server.go:997] "Starting client certificate rotation" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.938968 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.939246 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.970531 4778 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 09:02:13 crc kubenswrapper[4778]: I0318 09:02:13.973093 4778 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 09:02:13 crc kubenswrapper[4778]: E0318 09:02:13.973904 4778 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.70:6443: connect: connection refused" logger="UnhandledError" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.000416 4778 log.go:25] "Validated CRI v1 runtime API" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.045923 4778 log.go:25] "Validated CRI v1 image API" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.048397 4778 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.053339 4778 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-18-08-57-09-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.053402 4778 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.066554 4778 manager.go:217] Machine: {Timestamp:2026-03-18 09:02:14.064518792 +0000 UTC m=+0.639263652 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:4e5f6a1b-325c-4eb3-9961-e93f55b97b93 BootID:09c4ac70-7aed-4b4e-97f0-04cc523320b9 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:a8:27:26 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:a8:27:26 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:69:0d:ac Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:15:6e:16 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:a5:13:66 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:70:db:86 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:2a:87:12 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:5e:42:a9:33:64:59 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:86:65:56:17:3c:1f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 
Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.066750 4778 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.066846 4778 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.067471 4778 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.067715 4778 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.067768 4778 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.068871 4778 topology_manager.go:138] "Creating topology manager with none policy" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.068898 4778 container_manager_linux.go:303] "Creating device plugin manager" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.069518 4778 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.069554 4778 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.069812 4778 state_mem.go:36] "Initialized new in-memory state store" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.069923 4778 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.073456 4778 kubelet.go:418] "Attempting to sync node with API server" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.073486 4778 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.073516 4778 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.073533 4778 kubelet.go:324] "Adding apiserver pod source" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.073545 4778 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 
09:02:14.078441 4778 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.079411 4778 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 18 09:02:14 crc kubenswrapper[4778]: W0318 09:02:14.079964 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.080105 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.70:6443: connect: connection refused" logger="UnhandledError" Mar 18 09:02:14 crc kubenswrapper[4778]: W0318 09:02:14.079983 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.080256 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.70:6443: connect: connection refused" logger="UnhandledError" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.082072 4778 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 18 09:02:14 crc 
kubenswrapper[4778]: I0318 09:02:14.084387 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.084426 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.084439 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.084450 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.084466 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.084476 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.084488 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.084505 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.084517 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.084531 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.084560 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.084569 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.085649 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.087177 4778 server.go:1280] "Started kubelet" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 
09:02:14.087241 4778 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.087361 4778 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.088319 4778 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 18 09:02:14 crc systemd[1]: Started Kubernetes Kubelet. Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.100540 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.102194 4778 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.102295 4778 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.103031 4778 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.103009 4778 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.103181 4778 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.103104 4778 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.104416 4778 factory.go:55] Registering systemd factory Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.104526 4778 factory.go:221] Registration of the systemd container factory successfully Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.104607 4778 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" interval="200ms" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.104965 4778 server.go:460] "Adding debug handlers to kubelet server" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.109702 4778 factory.go:153] Registering CRI-O factory Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.109742 4778 factory.go:221] Registration of the crio container factory successfully Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.109840 4778 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 18 09:02:14 crc kubenswrapper[4778]: W0318 09:02:14.110253 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.110410 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.70:6443: connect: connection refused" logger="UnhandledError" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.109918 4778 factory.go:103] Registering Raw factory Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.110759 4778 manager.go:1196] Started watching for new ooms in manager Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.111716 4778 manager.go:319] Starting recovery of 
all containers Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.111062 4778 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.70:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189de4040018e9c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,LastTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123020 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123080 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123096 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123109 4778 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123122 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123136 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123162 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123175 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123218 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123234 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123247 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123261 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123276 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123292 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123303 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123316 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123336 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123352 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123400 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123411 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123428 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123439 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123451 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123481 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123506 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123518 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123530 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123542 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123553 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123565 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123575 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123587 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123788 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123804 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" 
seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123818 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123835 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123851 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123865 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123879 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123893 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 
09:02:14.123909 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123921 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123935 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123947 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123965 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123980 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.123994 4778 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124006 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124023 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124034 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124046 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124056 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124072 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124086 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124100 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124114 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124127 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124140 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124150 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124161 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124176 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124190 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124229 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.124241 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125261 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125286 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125301 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125314 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125327 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125338 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125351 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125363 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125374 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125385 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125399 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125410 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125425 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125461 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125479 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125492 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125504 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125519 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125532 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" 
seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125543 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125555 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125567 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125578 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125593 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125605 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125618 4778 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125632 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125643 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125655 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125666 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125679 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125695 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125710 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125722 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125735 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125800 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125814 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125827 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125842 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125853 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125875 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125891 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125904 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125919 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125932 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125947 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125961 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125976 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.125989 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126002 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" 
seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126013 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126031 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126043 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126057 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126069 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126080 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 18 09:02:14 crc 
kubenswrapper[4778]: I0318 09:02:14.126093 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126105 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126117 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126130 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126142 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126157 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126169 4778 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126182 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126211 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126259 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126275 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126287 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126301 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126315 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126325 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126335 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126346 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126357 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126367 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126376 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126386 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126398 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126408 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126420 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126430 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" 
volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126442 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126453 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126466 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126476 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126487 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126498 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126508 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126517 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126530 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126541 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126552 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126563 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126573 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126585 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126596 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126607 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126618 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126635 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126647 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126659 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126672 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126683 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126693 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126706 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126718 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126728 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126740 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126751 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126763 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126774 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126785 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126796 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126808 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126819 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126830 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126840 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" 
seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126851 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126861 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126871 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126884 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126895 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126905 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 18 09:02:14 
crc kubenswrapper[4778]: I0318 09:02:14.126917 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126927 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126939 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.126952 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129369 4778 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129402 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" 
seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129415 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129426 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129463 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129473 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129483 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129493 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129505 4778 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129540 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129551 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129561 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129572 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129582 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129592 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129605 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129616 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129627 4778 reconstruct.go:97] "Volume reconstruction finished" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.129635 4778 reconciler.go:26] "Reconciler: start to sync state" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.148133 4778 manager.go:324] Recovery completed Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.159459 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.161145 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.161184 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.161212 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.161878 4778 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.161898 
4778 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.161919 4778 state_mem.go:36] "Initialized new in-memory state store" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.176964 4778 policy_none.go:49] "None policy: Start" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.179666 4778 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.179720 4778 state_mem.go:35] "Initializing new in-memory state store" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.182459 4778 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.185796 4778 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.185838 4778 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.185875 4778 kubelet.go:2335] "Starting kubelet main sync loop" Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.185928 4778 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 18 09:02:14 crc kubenswrapper[4778]: W0318 09:02:14.187681 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.187742 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.70:6443: 
connect: connection refused" logger="UnhandledError" Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.203478 4778 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.255692 4778 manager.go:334] "Starting Device Plugin manager" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.255755 4778 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.255772 4778 server.go:79] "Starting device plugin registration server" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.256230 4778 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.256244 4778 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.256556 4778 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.256633 4778 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.256640 4778 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.265169 4778 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.286347 4778 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 18 09:02:14 crc 
kubenswrapper[4778]: I0318 09:02:14.286481 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.287929 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.287966 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.287977 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.288121 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.288429 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.288508 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.289043 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.289071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.289080 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.289173 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.289384 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.289475 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.289815 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.289842 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.289857 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.290108 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.290151 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.290172 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.290425 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.290635 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.290698 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.291110 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.291127 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.291136 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.293766 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.293876 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.293897 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.294313 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.294354 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.294376 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.294610 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: 
I0318 09:02:14.294868 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.294944 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.296481 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.296546 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.296568 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.296883 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.296955 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.297593 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.297662 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.297678 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.298334 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.298377 4778 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.298391 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.305775 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" interval="400ms" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331417 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331505 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331535 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331583 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331612 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331713 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331808 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331858 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331888 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" 
(UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331918 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331948 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331974 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.331997 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.332019 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.332041 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.359004 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.360816 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.360893 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.360910 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.360945 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.361922 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.70:6443: connect: connection refused" node="crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.433807 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434107 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434136 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434279 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434237 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434442 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434470 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434493 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434524 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434546 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434540 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434599 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 
09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434585 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434564 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434613 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434643 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434691 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434723 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod 
\"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434762 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434799 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434791 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434873 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434894 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434880 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434849 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.434980 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.435009 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.435039 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.435135 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" 
Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.435334 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.562567 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.564272 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.564529 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.564680 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.564873 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.565627 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.70:6443: connect: connection refused" node="crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.631616 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.640576 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.660988 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.669579 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.674814 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:14 crc kubenswrapper[4778]: W0318 09:02:14.688648 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-523f305bbaa14bef6a11c883b99104dd5f8bbdf21ecb2794d77cbbc112602d05 WatchSource:0}: Error finding container 523f305bbaa14bef6a11c883b99104dd5f8bbdf21ecb2794d77cbbc112602d05: Status 404 returned error can't find the container with id 523f305bbaa14bef6a11c883b99104dd5f8bbdf21ecb2794d77cbbc112602d05 Mar 18 09:02:14 crc kubenswrapper[4778]: W0318 09:02:14.691287 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-f8eee9aad49d4f5c64cf7769bb23d963c1d47bc628b49c4c2e9cc2ed6f11d9ec WatchSource:0}: Error finding container f8eee9aad49d4f5c64cf7769bb23d963c1d47bc628b49c4c2e9cc2ed6f11d9ec: Status 404 returned error can't find the container with id f8eee9aad49d4f5c64cf7769bb23d963c1d47bc628b49c4c2e9cc2ed6f11d9ec Mar 18 09:02:14 crc kubenswrapper[4778]: W0318 09:02:14.699717 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-9634940ffdccff7e80603d683b6400ede86b40098940697f2f52d7cde2168c16 WatchSource:0}: Error finding container 9634940ffdccff7e80603d683b6400ede86b40098940697f2f52d7cde2168c16: Status 404 returned error can't find the container with id 
9634940ffdccff7e80603d683b6400ede86b40098940697f2f52d7cde2168c16 Mar 18 09:02:14 crc kubenswrapper[4778]: W0318 09:02:14.705920 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-a3fb0861f8792b15e1914aacdcd3749032aeeb584bff765731cfbe226fcd9752 WatchSource:0}: Error finding container a3fb0861f8792b15e1914aacdcd3749032aeeb584bff765731cfbe226fcd9752: Status 404 returned error can't find the container with id a3fb0861f8792b15e1914aacdcd3749032aeeb584bff765731cfbe226fcd9752 Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.706359 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" interval="800ms" Mar 18 09:02:14 crc kubenswrapper[4778]: W0318 09:02:14.707781 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-47ed5e6cd5c13985cfc22ee9570b8fb0e80a38e0d6f4ce510bf9458b6aadcc5a WatchSource:0}: Error finding container 47ed5e6cd5c13985cfc22ee9570b8fb0e80a38e0d6f4ce510bf9458b6aadcc5a: Status 404 returned error can't find the container with id 47ed5e6cd5c13985cfc22ee9570b8fb0e80a38e0d6f4ce510bf9458b6aadcc5a Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.966105 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.967407 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.967444 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:14 crc 
kubenswrapper[4778]: I0318 09:02:14.967456 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:14 crc kubenswrapper[4778]: I0318 09:02:14.967491 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:02:14 crc kubenswrapper[4778]: E0318 09:02:14.967928 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.70:6443: connect: connection refused" node="crc" Mar 18 09:02:15 crc kubenswrapper[4778]: I0318 09:02:15.102280 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused Mar 18 09:02:15 crc kubenswrapper[4778]: I0318 09:02:15.190458 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a3fb0861f8792b15e1914aacdcd3749032aeeb584bff765731cfbe226fcd9752"} Mar 18 09:02:15 crc kubenswrapper[4778]: I0318 09:02:15.191463 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9634940ffdccff7e80603d683b6400ede86b40098940697f2f52d7cde2168c16"} Mar 18 09:02:15 crc kubenswrapper[4778]: I0318 09:02:15.192649 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"523f305bbaa14bef6a11c883b99104dd5f8bbdf21ecb2794d77cbbc112602d05"} Mar 18 09:02:15 crc kubenswrapper[4778]: I0318 09:02:15.193536 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f8eee9aad49d4f5c64cf7769bb23d963c1d47bc628b49c4c2e9cc2ed6f11d9ec"} Mar 18 09:02:15 crc kubenswrapper[4778]: I0318 09:02:15.194309 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"47ed5e6cd5c13985cfc22ee9570b8fb0e80a38e0d6f4ce510bf9458b6aadcc5a"} Mar 18 09:02:15 crc kubenswrapper[4778]: W0318 09:02:15.267647 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused Mar 18 09:02:15 crc kubenswrapper[4778]: E0318 09:02:15.268072 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.70:6443: connect: connection refused" logger="UnhandledError" Mar 18 09:02:15 crc kubenswrapper[4778]: W0318 09:02:15.270024 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused Mar 18 09:02:15 crc kubenswrapper[4778]: E0318 09:02:15.270120 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.70:6443: connect: connection refused" logger="UnhandledError" Mar 18 09:02:15 crc kubenswrapper[4778]: E0318 09:02:15.507018 4778 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" interval="1.6s" Mar 18 09:02:15 crc kubenswrapper[4778]: W0318 09:02:15.565176 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused Mar 18 09:02:15 crc kubenswrapper[4778]: E0318 09:02:15.565320 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.70:6443: connect: connection refused" logger="UnhandledError" Mar 18 09:02:15 crc kubenswrapper[4778]: W0318 09:02:15.573759 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused Mar 18 09:02:15 crc kubenswrapper[4778]: E0318 09:02:15.573839 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.70:6443: connect: connection refused" logger="UnhandledError" Mar 18 09:02:15 crc kubenswrapper[4778]: I0318 09:02:15.768458 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:15 crc kubenswrapper[4778]: I0318 09:02:15.770551 4778 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:15 crc kubenswrapper[4778]: I0318 09:02:15.770599 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:15 crc kubenswrapper[4778]: I0318 09:02:15.770681 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:15 crc kubenswrapper[4778]: I0318 09:02:15.770751 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:02:15 crc kubenswrapper[4778]: E0318 09:02:15.771415 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.70:6443: connect: connection refused" node="crc" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.102788 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.106266 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 09:02:16 crc kubenswrapper[4778]: E0318 09:02:16.107517 4778 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.70:6443: connect: connection refused" logger="UnhandledError" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.200787 4778 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5" exitCode=0 Mar 18 
09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.200953 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5"} Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.200971 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.202492 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.202670 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.202871 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.203482 4778 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce" exitCode=0 Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.203612 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce"} Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.203665 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.204854 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.204912 4778 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.204934 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.206698 4778 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4" exitCode=0 Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.206757 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4"} Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.206844 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.207973 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.207996 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.208006 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.210820 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88" exitCode=0 Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.210883 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88"} Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.210946 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.211927 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.211984 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.212012 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.214138 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.214978 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d"} Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.215023 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796"} Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.215045 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c06152147e770c821660f00750e87810309b3aa8dbbd6e1ee031be349b3cf44e"} Mar 18 
09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.215064 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54"} Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.215135 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.215299 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.215321 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.215330 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.216925 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.216963 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.216979 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:16 crc kubenswrapper[4778]: I0318 09:02:16.790734 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.102346 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.70:6443: 
connect: connection refused Mar 18 09:02:17 crc kubenswrapper[4778]: E0318 09:02:17.107978 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" interval="3.2s" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.227398 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae"} Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.227464 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a"} Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.227475 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea"} Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.227514 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.229079 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.229112 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.229124 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.230257 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747"} Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.230283 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.231434 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.231462 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.231473 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.235171 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b"} Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.235254 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e"} Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.235267 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5"} Mar 18 
09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.235277 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf"} Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.237885 4778 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988" exitCode=0 Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.238004 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988"} Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.238017 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.238088 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.242690 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.242739 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.242744 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.242772 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.242794 4778 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.242753 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.372075 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.373637 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.373693 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.373705 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:17 crc kubenswrapper[4778]: I0318 09:02:17.373743 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:02:17 crc kubenswrapper[4778]: E0318 09:02:17.374330 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.70:6443: connect: connection refused" node="crc" Mar 18 09:02:17 crc kubenswrapper[4778]: W0318 09:02:17.571300 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused Mar 18 09:02:17 crc kubenswrapper[4778]: E0318 09:02:17.571398 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.70:6443: connect: connection 
refused" logger="UnhandledError" Mar 18 09:02:17 crc kubenswrapper[4778]: W0318 09:02:17.740580 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.70:6443: connect: connection refused Mar 18 09:02:17 crc kubenswrapper[4778]: E0318 09:02:17.740680 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.70:6443: connect: connection refused" logger="UnhandledError" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.133527 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.243081 4778 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1" exitCode=0 Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.243180 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1"} Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.243221 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.244515 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.244550 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.244558 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.248614 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"afe66da1be5d0bac4307a8d0d57524c3eeac9e7cfc3a44cfd27bfa2874fca59c"} Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.248693 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.248717 4778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.248747 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.248744 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.248834 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.249780 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.249802 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.249812 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.249953 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.249978 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.249988 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.250870 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.250888 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.250897 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.251172 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.251233 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:18 crc kubenswrapper[4778]: I0318 09:02:18.251251 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.261216 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb"} Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.261296 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2"} Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.261320 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644"} Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.261338 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72"} Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.261339 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.261510 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.261604 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.263404 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.263457 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.263481 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.263623 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 
09:02:19.263690 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.263704 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.492731 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.800737 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.809735 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:19 crc kubenswrapper[4778]: I0318 09:02:19.919839 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.270118 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.270799 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b"} Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.270874 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.271016 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.271382 4778 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.271434 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.271448 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.271994 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.272077 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.272105 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.272509 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.272564 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.272583 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.429065 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.575130 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.576721 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.576777 4778 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.576804 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:20 crc kubenswrapper[4778]: I0318 09:02:20.576856 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.275511 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.275640 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.275800 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.277453 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.277515 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.277529 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.277572 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.277586 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.277535 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.277535 
4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.277777 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.277798 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:21 crc kubenswrapper[4778]: I0318 09:02:21.284759 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 18 09:02:22 crc kubenswrapper[4778]: I0318 09:02:22.100107 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:22 crc kubenswrapper[4778]: I0318 09:02:22.282559 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:22 crc kubenswrapper[4778]: I0318 09:02:22.282636 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:22 crc kubenswrapper[4778]: I0318 09:02:22.284424 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:22 crc kubenswrapper[4778]: I0318 09:02:22.284489 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:22 crc kubenswrapper[4778]: I0318 09:02:22.284512 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:22 crc kubenswrapper[4778]: I0318 09:02:22.284430 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:22 crc kubenswrapper[4778]: I0318 09:02:22.284749 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 09:02:22 crc kubenswrapper[4778]: I0318 09:02:22.284785 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:24 crc kubenswrapper[4778]: E0318 09:02:24.265343 4778 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 09:02:24 crc kubenswrapper[4778]: I0318 09:02:24.451126 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 09:02:24 crc kubenswrapper[4778]: I0318 09:02:24.451509 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:24 crc kubenswrapper[4778]: I0318 09:02:24.453509 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:24 crc kubenswrapper[4778]: I0318 09:02:24.453568 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:24 crc kubenswrapper[4778]: I0318 09:02:24.453586 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:25 crc kubenswrapper[4778]: I0318 09:02:25.101529 4778 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 09:02:25 crc kubenswrapper[4778]: I0318 09:02:25.101651 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get 
\"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.103056 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 18 09:02:28 crc kubenswrapper[4778]: W0318 09:02:28.116119 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.116298 4778 trace.go:236] Trace[1700483078]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Mar-2026 09:02:18.114) (total time: 10001ms): Mar 18 09:02:28 crc kubenswrapper[4778]: Trace[1700483078]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (09:02:28.116) Mar 18 09:02:28 crc kubenswrapper[4778]: Trace[1700483078]: [10.001311468s] [10.001311468s] END Mar 18 09:02:28 crc kubenswrapper[4778]: E0318 09:02:28.116335 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.127908 4778 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 
192.168.126.11:38612->192.168.126.11:17697: read: connection reset by peer" start-of-body= Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.128030 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38612->192.168.126.11:17697: read: connection reset by peer" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.144927 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.145760 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.148958 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.149009 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.149026 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.301414 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.305545 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="afe66da1be5d0bac4307a8d0d57524c3eeac9e7cfc3a44cfd27bfa2874fca59c" exitCode=255 Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.305629 4778 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"afe66da1be5d0bac4307a8d0d57524c3eeac9e7cfc3a44cfd27bfa2874fca59c"} Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.305958 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.307603 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.307662 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.307690 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.308776 4778 scope.go:117] "RemoveContainer" containerID="afe66da1be5d0bac4307a8d0d57524c3eeac9e7cfc3a44cfd27bfa2874fca59c" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.557346 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.557603 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.562683 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.562718 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.562732 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:28 crc kubenswrapper[4778]: W0318 09:02:28.606673 4778 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.606793 4778 trace.go:236] Trace[1267424385]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Mar-2026 09:02:18.604) (total time: 10002ms): Mar 18 09:02:28 crc kubenswrapper[4778]: Trace[1267424385]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (09:02:28.606) Mar 18 09:02:28 crc kubenswrapper[4778]: Trace[1267424385]: [10.002259931s] [10.002259931s] END Mar 18 09:02:28 crc kubenswrapper[4778]: E0318 09:02:28.606828 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 18 09:02:28 crc kubenswrapper[4778]: I0318 09:02:28.610672 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.315753 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.320005 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2ec4c4f24868fb18f75550635d869593b826b2bdf5c6c975c97b72521ef30263"} Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.320300 4778 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.320377 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.321795 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.321843 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.321855 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.322108 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.322312 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.322481 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.346231 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.711536 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:29Z is after 2026-02-23T05:33:13Z Mar 18 09:02:29 crc kubenswrapper[4778]: E0318 09:02:29.712410 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:29Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 18 09:02:29 crc kubenswrapper[4778]: E0318 09:02:29.720961 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:29Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 09:02:29 crc kubenswrapper[4778]: E0318 09:02:29.721108 4778 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.725479 4778 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.725691 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 09:02:29 crc 
kubenswrapper[4778]: E0318 09:02:29.729369 4778 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:29Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189de4040018e9c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,LastTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:02:29 crc kubenswrapper[4778]: W0318 09:02:29.731731 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:29Z is after 2026-02-23T05:33:13Z Mar 18 09:02:29 crc kubenswrapper[4778]: E0318 09:02:29.731813 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.747362 4778 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver 
namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.747471 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 09:02:29 crc kubenswrapper[4778]: W0318 09:02:29.750046 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:29Z is after 2026-02-23T05:33:13Z Mar 18 09:02:29 crc kubenswrapper[4778]: E0318 09:02:29.750152 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.930030 4778 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]log ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]etcd ok Mar 18 09:02:29 crc kubenswrapper[4778]: 
[+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/generic-apiserver-start-informers ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/priority-and-fairness-filter ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/start-apiextensions-informers ok Mar 18 09:02:29 crc kubenswrapper[4778]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Mar 18 09:02:29 crc kubenswrapper[4778]: [-]poststarthook/crd-informer-synced failed: reason withheld Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/start-system-namespaces-controller ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 18 09:02:29 crc kubenswrapper[4778]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 18 09:02:29 crc 
kubenswrapper[4778]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/bootstrap-controller ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/start-kube-aggregator-informers ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 18 09:02:29 crc kubenswrapper[4778]: [-]poststarthook/apiservice-registration-controller failed: reason withheld Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 18 09:02:29 crc kubenswrapper[4778]: [-]poststarthook/apiservice-discovery-controller failed: reason withheld Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]autoregister-completion ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/apiservice-openapi-controller ok Mar 18 09:02:29 crc kubenswrapper[4778]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 18 09:02:29 crc kubenswrapper[4778]: livez check failed Mar 18 09:02:29 crc kubenswrapper[4778]: I0318 09:02:29.930114 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.104434 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:30Z is after 2026-02-23T05:33:13Z Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.325444 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.327216 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.329700 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2ec4c4f24868fb18f75550635d869593b826b2bdf5c6c975c97b72521ef30263" exitCode=255 Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.329841 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.329804 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2ec4c4f24868fb18f75550635d869593b826b2bdf5c6c975c97b72521ef30263"} Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.329938 4778 scope.go:117] "RemoveContainer" containerID="afe66da1be5d0bac4307a8d0d57524c3eeac9e7cfc3a44cfd27bfa2874fca59c" Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.330235 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.330785 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.330830 4778 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.330846 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.331591 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.331623 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.331633 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:02:30 crc kubenswrapper[4778]: I0318 09:02:30.332180 4778 scope.go:117] "RemoveContainer" containerID="2ec4c4f24868fb18f75550635d869593b826b2bdf5c6c975c97b72521ef30263"
Mar 18 09:02:30 crc kubenswrapper[4778]: E0318 09:02:30.332356 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 09:02:31 crc kubenswrapper[4778]: I0318 09:02:31.104065 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:31Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:31 crc kubenswrapper[4778]: I0318 09:02:31.335551 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 18 09:02:32 crc kubenswrapper[4778]: I0318 09:02:32.106177 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:32Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:32 crc kubenswrapper[4778]: W0318 09:02:32.720357 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:32Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:32 crc kubenswrapper[4778]: E0318 09:02:32.720504 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:32Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 18 09:02:33 crc kubenswrapper[4778]: I0318 09:02:33.107704 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:33Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:33 crc kubenswrapper[4778]: I0318 09:02:33.425988 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 09:02:33 crc kubenswrapper[4778]: I0318 09:02:33.426307 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 09:02:33 crc kubenswrapper[4778]: I0318 09:02:33.428032 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:02:33 crc kubenswrapper[4778]: I0318 09:02:33.428104 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:02:33 crc kubenswrapper[4778]: I0318 09:02:33.428122 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:02:33 crc kubenswrapper[4778]: I0318 09:02:33.429046 4778 scope.go:117] "RemoveContainer" containerID="2ec4c4f24868fb18f75550635d869593b826b2bdf5c6c975c97b72521ef30263"
Mar 18 09:02:33 crc kubenswrapper[4778]: E0318 09:02:33.429314 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 09:02:34 crc kubenswrapper[4778]: I0318 09:02:34.105691 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:34Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:34 crc kubenswrapper[4778]: W0318 09:02:34.201310 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:34Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:34 crc kubenswrapper[4778]: E0318 09:02:34.201429 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:34Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 18 09:02:34 crc kubenswrapper[4778]: E0318 09:02:34.266104 4778 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 18 09:02:34 crc kubenswrapper[4778]: I0318 09:02:34.928900 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 09:02:34 crc kubenswrapper[4778]: I0318 09:02:34.929106 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 09:02:34 crc kubenswrapper[4778]: I0318 09:02:34.930587 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:02:34 crc kubenswrapper[4778]: I0318 09:02:34.930620 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:02:34 crc kubenswrapper[4778]: I0318 09:02:34.930631 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:02:34 crc kubenswrapper[4778]: I0318 09:02:34.931127 4778 scope.go:117] "RemoveContainer" containerID="2ec4c4f24868fb18f75550635d869593b826b2bdf5c6c975c97b72521ef30263"
Mar 18 09:02:34 crc kubenswrapper[4778]: E0318 09:02:34.931485 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 09:02:34 crc kubenswrapper[4778]: I0318 09:02:34.941453 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 09:02:35 crc kubenswrapper[4778]: I0318 09:02:35.101463 4778 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 09:02:35 crc kubenswrapper[4778]: I0318 09:02:35.101565 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 09:02:35 crc kubenswrapper[4778]: I0318 09:02:35.105245 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:35Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:35 crc kubenswrapper[4778]: I0318 09:02:35.351104 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 09:02:35 crc kubenswrapper[4778]: I0318 09:02:35.352610 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:02:35 crc kubenswrapper[4778]: I0318 09:02:35.352671 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:02:35 crc kubenswrapper[4778]: I0318 09:02:35.352693 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:02:35 crc kubenswrapper[4778]: I0318 09:02:35.353626 4778 scope.go:117] "RemoveContainer" containerID="2ec4c4f24868fb18f75550635d869593b826b2bdf5c6c975c97b72521ef30263"
Mar 18 09:02:35 crc kubenswrapper[4778]: E0318 09:02:35.353919 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 09:02:36 crc kubenswrapper[4778]: I0318 09:02:36.105554 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:36Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:36 crc kubenswrapper[4778]: E0318 09:02:36.116973 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:36Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 18 09:02:36 crc kubenswrapper[4778]: I0318 09:02:36.121134 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 09:02:36 crc kubenswrapper[4778]: I0318 09:02:36.123151 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:02:36 crc kubenswrapper[4778]: I0318 09:02:36.123230 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:02:36 crc kubenswrapper[4778]: I0318 09:02:36.123248 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:02:36 crc kubenswrapper[4778]: I0318 09:02:36.123291 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 09:02:36 crc kubenswrapper[4778]: E0318 09:02:36.128317 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:36Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 18 09:02:37 crc kubenswrapper[4778]: I0318 09:02:37.105361 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:37Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:37 crc kubenswrapper[4778]: W0318 09:02:37.392006 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:37Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:37 crc kubenswrapper[4778]: E0318 09:02:37.392117 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 18 09:02:37 crc kubenswrapper[4778]: I0318 09:02:37.908343 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 09:02:37 crc kubenswrapper[4778]: I0318 09:02:37.908572 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 09:02:37 crc kubenswrapper[4778]: I0318 09:02:37.910095 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:02:37 crc kubenswrapper[4778]: I0318 09:02:37.910183 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:02:37 crc kubenswrapper[4778]: I0318 09:02:37.910255 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:02:37 crc kubenswrapper[4778]: I0318 09:02:37.911106 4778 scope.go:117] "RemoveContainer" containerID="2ec4c4f24868fb18f75550635d869593b826b2bdf5c6c975c97b72521ef30263"
Mar 18 09:02:37 crc kubenswrapper[4778]: E0318 09:02:37.911505 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 09:02:38 crc kubenswrapper[4778]: I0318 09:02:38.095677 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 18 09:02:38 crc kubenswrapper[4778]: E0318 09:02:38.101878 4778 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:38Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 18 09:02:38 crc kubenswrapper[4778]: I0318 09:02:38.106463 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:38Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:38 crc kubenswrapper[4778]: W0318 09:02:38.599840 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:38Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:38 crc kubenswrapper[4778]: E0318 09:02:38.599951 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:38Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 18 09:02:39 crc kubenswrapper[4778]: I0318 09:02:39.104288 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:39Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:39 crc kubenswrapper[4778]: E0318 09:02:39.734577 4778 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:39Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189de4040018e9c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,LastTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 09:02:40 crc kubenswrapper[4778]: I0318 09:02:40.106764 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:40Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:41 crc kubenswrapper[4778]: I0318 09:02:41.106374 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:41Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:42 crc kubenswrapper[4778]: I0318 09:02:42.106400 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:42Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:42 crc kubenswrapper[4778]: W0318 09:02:42.846312 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:42Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:42 crc kubenswrapper[4778]: E0318 09:02:42.846444 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:42Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 18 09:02:43 crc kubenswrapper[4778]: I0318 09:02:43.104958 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:43Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:43 crc kubenswrapper[4778]: E0318 09:02:43.120168 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:43Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 18 09:02:43 crc kubenswrapper[4778]: I0318 09:02:43.129299 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 09:02:43 crc kubenswrapper[4778]: I0318 09:02:43.131228 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:02:43 crc kubenswrapper[4778]: I0318 09:02:43.131298 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:02:43 crc kubenswrapper[4778]: I0318 09:02:43.131313 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:02:43 crc kubenswrapper[4778]: I0318 09:02:43.131358 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 09:02:43 crc kubenswrapper[4778]: E0318 09:02:43.134618 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:43Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 18 09:02:44 crc kubenswrapper[4778]: I0318 09:02:44.107989 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:44Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:44 crc kubenswrapper[4778]: E0318 09:02:44.266338 4778 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 18 09:02:44 crc kubenswrapper[4778]: W0318 09:02:44.789109 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:44Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:44 crc kubenswrapper[4778]: E0318 09:02:44.789222 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:44Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 18 09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.100847 4778 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.100937 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.101013 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.101242 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.102805 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.102899 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.102918 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.103773 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"c06152147e770c821660f00750e87810309b3aa8dbbd6e1ee031be349b3cf44e"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 18 09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.104094 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://c06152147e770c821660f00750e87810309b3aa8dbbd6e1ee031be349b3cf44e" gracePeriod=30
Mar 18 09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.106746 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:45Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.385332 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 18 09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.385959 4778 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c06152147e770c821660f00750e87810309b3aa8dbbd6e1ee031be349b3cf44e" exitCode=255
Mar 18 09:02:45 crc kubenswrapper[4778]: I0318 09:02:45.386018 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c06152147e770c821660f00750e87810309b3aa8dbbd6e1ee031be349b3cf44e"}
Mar 18 09:02:46 crc kubenswrapper[4778]: I0318 09:02:46.106669 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:46Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:46 crc kubenswrapper[4778]: I0318 09:02:46.394665 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 18 09:02:46 crc kubenswrapper[4778]: I0318 09:02:46.395422 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5"}
Mar 18 09:02:46 crc kubenswrapper[4778]: I0318 09:02:46.395597 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 09:02:46 crc kubenswrapper[4778]: I0318 09:02:46.396829 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:02:46 crc kubenswrapper[4778]: I0318 09:02:46.396884 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:02:46 crc kubenswrapper[4778]: I0318 09:02:46.396942 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:02:46 crc kubenswrapper[4778]: I0318 09:02:46.791304 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 09:02:47 crc kubenswrapper[4778]: I0318 09:02:47.107635 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:47Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:47 crc kubenswrapper[4778]: I0318 09:02:47.398903 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 09:02:47 crc kubenswrapper[4778]: I0318 09:02:47.406423 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:02:47 crc kubenswrapper[4778]: I0318 09:02:47.406492 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:02:47 crc kubenswrapper[4778]: I0318 09:02:47.406512 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:02:48 crc kubenswrapper[4778]: I0318 09:02:48.107149 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:48Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:48 crc kubenswrapper[4778]: I0318 09:02:48.401895 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 09:02:48 crc kubenswrapper[4778]: I0318 09:02:48.403272 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:02:48 crc kubenswrapper[4778]: I0318 09:02:48.403319 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:02:48 crc kubenswrapper[4778]: I0318 09:02:48.403341 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:02:49 crc kubenswrapper[4778]: I0318 09:02:49.106121 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:49Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:49 crc kubenswrapper[4778]: E0318 09:02:49.740915 4778 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:49Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189de4040018e9c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,LastTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 09:02:50 crc kubenswrapper[4778]: I0318 09:02:50.107088 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:50Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:50 crc kubenswrapper[4778]: E0318 09:02:50.124248 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:50Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 18 09:02:50 crc kubenswrapper[4778]: I0318 09:02:50.135500 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 09:02:50 crc kubenswrapper[4778]: I0318 09:02:50.136793 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:02:50 crc kubenswrapper[4778]: I0318 09:02:50.136845 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:02:50 crc kubenswrapper[4778]: I0318 09:02:50.136856 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:02:50 crc kubenswrapper[4778]: I0318 09:02:50.136876 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 09:02:50 crc kubenswrapper[4778]: E0318 09:02:50.142515 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:50Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 18 09:02:51 crc kubenswrapper[4778]: I0318 09:02:51.104323 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:51Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:51 crc kubenswrapper[4778]: I0318 09:02:51.186310 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 09:02:51 crc kubenswrapper[4778]: I0318 09:02:51.188060 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:02:51 crc kubenswrapper[4778]: I0318 09:02:51.188130 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:02:51 crc kubenswrapper[4778]: I0318 09:02:51.188144 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:02:51 crc kubenswrapper[4778]: I0318 09:02:51.189035 4778 scope.go:117] "RemoveContainer" containerID="2ec4c4f24868fb18f75550635d869593b826b2bdf5c6c975c97b72521ef30263"
Mar 18 09:02:51 crc kubenswrapper[4778]: I0318 09:02:51.412953 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.099971 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.100239 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.101661 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.101743 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.101763 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.103564 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:52Z is after 2026-02-23T05:33:13Z
Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.421796 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.422752 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.425676 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bcc95446cf76ba859078bd26abff4e93cf29005e9f95ebef436cbcaa16e7aacc" exitCode=255
Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.425749 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"bcc95446cf76ba859078bd26abff4e93cf29005e9f95ebef436cbcaa16e7aacc"}
Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.425823 4778 scope.go:117] "RemoveContainer" containerID="2ec4c4f24868fb18f75550635d869593b826b2bdf5c6c975c97b72521ef30263"
Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.425988 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.427662 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.427732 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.427750 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:02:52 crc kubenswrapper[4778]: I0318 09:02:52.428568 4778 scope.go:117] "RemoveContainer" containerID="bcc95446cf76ba859078bd26abff4e93cf29005e9f95ebef436cbcaa16e7aacc"
Mar 18 09:02:52 crc kubenswrapper[4778]: E0318 09:02:52.428853 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 09:02:53 crc kubenswrapper[4778]: I0318 09:02:53.104968 4778 csi_plugin.go:884]
Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:53Z is after 2026-02-23T05:33:13Z Mar 18 09:02:53 crc kubenswrapper[4778]: I0318 09:02:53.426943 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:53 crc kubenswrapper[4778]: I0318 09:02:53.432138 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 09:02:53 crc kubenswrapper[4778]: I0318 09:02:53.435510 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:53 crc kubenswrapper[4778]: I0318 09:02:53.437104 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:53 crc kubenswrapper[4778]: I0318 09:02:53.437151 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:53 crc kubenswrapper[4778]: I0318 09:02:53.437165 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:53 crc kubenswrapper[4778]: I0318 09:02:53.437941 4778 scope.go:117] "RemoveContainer" containerID="bcc95446cf76ba859078bd26abff4e93cf29005e9f95ebef436cbcaa16e7aacc" Mar 18 09:02:53 crc kubenswrapper[4778]: E0318 09:02:53.438178 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 09:02:54 crc kubenswrapper[4778]: I0318 09:02:54.104861 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:54Z is after 2026-02-23T05:33:13Z Mar 18 09:02:54 crc kubenswrapper[4778]: E0318 09:02:54.266693 4778 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 09:02:55 crc kubenswrapper[4778]: I0318 09:02:55.083810 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 09:02:55 crc kubenswrapper[4778]: E0318 09:02:55.088659 4778 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 09:02:55 crc kubenswrapper[4778]: E0318 09:02:55.089913 4778 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 18 09:02:55 crc kubenswrapper[4778]: I0318 09:02:55.100627 4778 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled 
while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 09:02:55 crc kubenswrapper[4778]: I0318 09:02:55.100708 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 09:02:55 crc kubenswrapper[4778]: I0318 09:02:55.106942 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:55Z is after 2026-02-23T05:33:13Z Mar 18 09:02:56 crc kubenswrapper[4778]: I0318 09:02:56.107401 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:56Z is after 2026-02-23T05:33:13Z Mar 18 09:02:57 crc kubenswrapper[4778]: I0318 09:02:57.105079 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:57Z is after 2026-02-23T05:33:13Z Mar 18 09:02:57 crc kubenswrapper[4778]: E0318 09:02:57.130783 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:57Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 09:02:57 crc kubenswrapper[4778]: I0318 09:02:57.142910 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:57 crc kubenswrapper[4778]: I0318 09:02:57.144450 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:57 crc kubenswrapper[4778]: I0318 09:02:57.144599 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:02:57 crc kubenswrapper[4778]: I0318 09:02:57.144721 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:57 crc kubenswrapper[4778]: I0318 09:02:57.144862 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:02:57 crc kubenswrapper[4778]: E0318 09:02:57.151146 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:57Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 09:02:57 crc kubenswrapper[4778]: I0318 09:02:57.907968 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:02:57 crc kubenswrapper[4778]: I0318 09:02:57.908264 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:02:57 crc kubenswrapper[4778]: I0318 09:02:57.909994 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:02:57 crc kubenswrapper[4778]: I0318 09:02:57.910054 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 09:02:57 crc kubenswrapper[4778]: I0318 09:02:57.910076 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:02:57 crc kubenswrapper[4778]: I0318 09:02:57.911085 4778 scope.go:117] "RemoveContainer" containerID="bcc95446cf76ba859078bd26abff4e93cf29005e9f95ebef436cbcaa16e7aacc" Mar 18 09:02:57 crc kubenswrapper[4778]: E0318 09:02:57.911399 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 09:02:58 crc kubenswrapper[4778]: I0318 09:02:58.104586 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:58Z is after 2026-02-23T05:33:13Z Mar 18 09:02:59 crc kubenswrapper[4778]: I0318 09:02:59.107603 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:59Z is after 2026-02-23T05:33:13Z Mar 18 09:02:59 crc kubenswrapper[4778]: E0318 09:02:59.746826 4778 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:59Z is after 
2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189de4040018e9c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,LastTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:02:59 crc kubenswrapper[4778]: W0318 09:02:59.911392 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:59Z is after 2026-02-23T05:33:13Z Mar 18 09:02:59 crc kubenswrapper[4778]: E0318 09:02:59.911510 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:02:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 09:03:00 crc kubenswrapper[4778]: I0318 09:03:00.106621 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:00Z is after 2026-02-23T05:33:13Z Mar 18 09:03:00 crc kubenswrapper[4778]: W0318 
09:03:00.564018 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:00Z is after 2026-02-23T05:33:13Z Mar 18 09:03:00 crc kubenswrapper[4778]: E0318 09:03:00.564131 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 09:03:01 crc kubenswrapper[4778]: W0318 09:03:01.100852 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:01Z is after 2026-02-23T05:33:13Z Mar 18 09:03:01 crc kubenswrapper[4778]: E0318 09:03:01.100945 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 09:03:01 crc kubenswrapper[4778]: I0318 09:03:01.105983 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:01Z is after 2026-02-23T05:33:13Z Mar 18 09:03:02 crc kubenswrapper[4778]: I0318 09:03:02.107317 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:02Z is after 2026-02-23T05:33:13Z Mar 18 09:03:03 crc kubenswrapper[4778]: I0318 09:03:03.106683 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:03Z is after 2026-02-23T05:33:13Z Mar 18 09:03:04 crc kubenswrapper[4778]: I0318 09:03:04.107108 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:04Z is after 2026-02-23T05:33:13Z Mar 18 09:03:04 crc kubenswrapper[4778]: E0318 09:03:04.137460 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:04Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 09:03:04 crc kubenswrapper[4778]: I0318 09:03:04.151851 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 18 09:03:04 crc kubenswrapper[4778]: I0318 09:03:04.153985 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:04 crc kubenswrapper[4778]: I0318 09:03:04.154071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:04 crc kubenswrapper[4778]: I0318 09:03:04.154090 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:04 crc kubenswrapper[4778]: I0318 09:03:04.154131 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:03:04 crc kubenswrapper[4778]: E0318 09:03:04.158561 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:04Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 09:03:04 crc kubenswrapper[4778]: E0318 09:03:04.267843 4778 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 09:03:04 crc kubenswrapper[4778]: I0318 09:03:04.459951 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 09:03:04 crc kubenswrapper[4778]: I0318 09:03:04.460233 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:04 crc kubenswrapper[4778]: I0318 09:03:04.462081 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:04 crc kubenswrapper[4778]: I0318 09:03:04.462147 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:04 crc kubenswrapper[4778]: I0318 
09:03:04.462169 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:05 crc kubenswrapper[4778]: I0318 09:03:05.100925 4778 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 09:03:05 crc kubenswrapper[4778]: I0318 09:03:05.101023 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 09:03:05 crc kubenswrapper[4778]: I0318 09:03:05.104804 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:05Z is after 2026-02-23T05:33:13Z Mar 18 09:03:06 crc kubenswrapper[4778]: I0318 09:03:06.106892 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:06Z is after 2026-02-23T05:33:13Z Mar 18 09:03:07 crc kubenswrapper[4778]: I0318 09:03:07.105750 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:07Z is after 2026-02-23T05:33:13Z Mar 18 09:03:07 crc kubenswrapper[4778]: W0318 09:03:07.397394 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:07Z is after 2026-02-23T05:33:13Z Mar 18 09:03:07 crc kubenswrapper[4778]: E0318 09:03:07.397504 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 09:03:08 crc kubenswrapper[4778]: I0318 09:03:08.106919 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:08Z is after 2026-02-23T05:33:13Z Mar 18 09:03:09 crc kubenswrapper[4778]: I0318 09:03:09.106638 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:09Z is after 2026-02-23T05:33:13Z Mar 18 09:03:09 crc kubenswrapper[4778]: E0318 09:03:09.754585 
4778 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:09Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189de4040018e9c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,LastTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:10 crc kubenswrapper[4778]: I0318 09:03:10.107245 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:10Z is after 2026-02-23T05:33:13Z Mar 18 09:03:11 crc kubenswrapper[4778]: I0318 09:03:11.105875 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:11Z is after 2026-02-23T05:33:13Z Mar 18 09:03:11 crc kubenswrapper[4778]: E0318 09:03:11.140641 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-03-18T09:03:11Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 09:03:11 crc kubenswrapper[4778]: I0318 09:03:11.159435 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:11 crc kubenswrapper[4778]: I0318 09:03:11.160753 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:11 crc kubenswrapper[4778]: I0318 09:03:11.160810 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:11 crc kubenswrapper[4778]: I0318 09:03:11.160824 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:11 crc kubenswrapper[4778]: I0318 09:03:11.160852 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:03:11 crc kubenswrapper[4778]: E0318 09:03:11.164125 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:03:11Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 09:03:12 crc kubenswrapper[4778]: I0318 09:03:12.107695 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:12 crc kubenswrapper[4778]: I0318 09:03:12.186489 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:12 crc kubenswrapper[4778]: I0318 09:03:12.187802 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:12 crc kubenswrapper[4778]: I0318 09:03:12.187944 4778 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:12 crc kubenswrapper[4778]: I0318 09:03:12.187966 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:12 crc kubenswrapper[4778]: I0318 09:03:12.188908 4778 scope.go:117] "RemoveContainer" containerID="bcc95446cf76ba859078bd26abff4e93cf29005e9f95ebef436cbcaa16e7aacc" Mar 18 09:03:12 crc kubenswrapper[4778]: I0318 09:03:12.500544 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 09:03:12 crc kubenswrapper[4778]: I0318 09:03:12.503625 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e"} Mar 18 09:03:12 crc kubenswrapper[4778]: I0318 09:03:12.503841 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:12 crc kubenswrapper[4778]: I0318 09:03:12.505326 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:12 crc kubenswrapper[4778]: I0318 09:03:12.505405 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:12 crc kubenswrapper[4778]: I0318 09:03:12.505434 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:13 crc kubenswrapper[4778]: I0318 09:03:13.106822 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 
18 09:03:13 crc kubenswrapper[4778]: I0318 09:03:13.508340 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 09:03:13 crc kubenswrapper[4778]: I0318 09:03:13.509025 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 09:03:13 crc kubenswrapper[4778]: I0318 09:03:13.511530 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e" exitCode=255 Mar 18 09:03:13 crc kubenswrapper[4778]: I0318 09:03:13.511579 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e"} Mar 18 09:03:13 crc kubenswrapper[4778]: I0318 09:03:13.511624 4778 scope.go:117] "RemoveContainer" containerID="bcc95446cf76ba859078bd26abff4e93cf29005e9f95ebef436cbcaa16e7aacc" Mar 18 09:03:13 crc kubenswrapper[4778]: I0318 09:03:13.511818 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:13 crc kubenswrapper[4778]: I0318 09:03:13.513579 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:13 crc kubenswrapper[4778]: I0318 09:03:13.513666 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:13 crc kubenswrapper[4778]: I0318 09:03:13.513697 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:13 crc kubenswrapper[4778]: I0318 09:03:13.514828 4778 scope.go:117] 
"RemoveContainer" containerID="721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e" Mar 18 09:03:13 crc kubenswrapper[4778]: E0318 09:03:13.515306 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 09:03:14 crc kubenswrapper[4778]: I0318 09:03:14.107329 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:14 crc kubenswrapper[4778]: E0318 09:03:14.268518 4778 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 09:03:14 crc kubenswrapper[4778]: I0318 09:03:14.517562 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.101523 4778 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.101644 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" 
probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.101930 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.102169 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.103828 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.103910 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.103933 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.104915 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.105099 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5" gracePeriod=30 Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.109669 4778 csi_plugin.go:884] Failed to 
contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.526428 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.528583 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.529308 4778 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5" exitCode=255 Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.529378 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5"} Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.529428 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89"} Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.529459 4778 scope.go:117] "RemoveContainer" containerID="c06152147e770c821660f00750e87810309b3aa8dbbd6e1ee031be349b3cf44e" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.529684 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:15 crc 
kubenswrapper[4778]: I0318 09:03:15.531150 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.531224 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:15 crc kubenswrapper[4778]: I0318 09:03:15.531238 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:16 crc kubenswrapper[4778]: I0318 09:03:16.109448 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:16 crc kubenswrapper[4778]: I0318 09:03:16.533331 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 18 09:03:16 crc kubenswrapper[4778]: I0318 09:03:16.790930 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:03:16 crc kubenswrapper[4778]: I0318 09:03:16.791151 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:16 crc kubenswrapper[4778]: I0318 09:03:16.792870 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:16 crc kubenswrapper[4778]: I0318 09:03:16.792936 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:16 crc kubenswrapper[4778]: I0318 09:03:16.792962 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:17 crc kubenswrapper[4778]: I0318 09:03:17.107420 
4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:17 crc kubenswrapper[4778]: I0318 09:03:17.908315 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:03:17 crc kubenswrapper[4778]: I0318 09:03:17.908672 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:17 crc kubenswrapper[4778]: I0318 09:03:17.910485 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:17 crc kubenswrapper[4778]: I0318 09:03:17.910551 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:17 crc kubenswrapper[4778]: I0318 09:03:17.910577 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:17 crc kubenswrapper[4778]: I0318 09:03:17.911787 4778 scope.go:117] "RemoveContainer" containerID="721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e" Mar 18 09:03:17 crc kubenswrapper[4778]: E0318 09:03:17.912153 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 09:03:18 crc kubenswrapper[4778]: I0318 09:03:18.107380 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" 
in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:18 crc kubenswrapper[4778]: E0318 09:03:18.150347 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 09:03:18 crc kubenswrapper[4778]: I0318 09:03:18.164374 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:18 crc kubenswrapper[4778]: I0318 09:03:18.165727 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:18 crc kubenswrapper[4778]: I0318 09:03:18.165758 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:18 crc kubenswrapper[4778]: I0318 09:03:18.165768 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:18 crc kubenswrapper[4778]: I0318 09:03:18.165788 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:03:18 crc kubenswrapper[4778]: E0318 09:03:18.170075 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 09:03:19 crc kubenswrapper[4778]: I0318 09:03:19.109236 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.764645 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de4040018e9c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,LastTimestamp:2026-03-18 09:02:14.086478274 +0000 UTC m=+0.661223124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.771555 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048cae3c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161174076 +0000 UTC m=+0.735918916,LastTimestamp:2026-03-18 09:02:14.161174076 +0000 UTC m=+0.735918916,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.776660 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d2ded default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161206765 +0000 UTC m=+0.735951605,LastTimestamp:2026-03-18 09:02:14.161206765 +0000 UTC m=+0.735951605,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.781692 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d5de5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161219045 +0000 UTC m=+0.735963875,LastTimestamp:2026-03-18 09:02:14.161219045 +0000 UTC m=+0.735963875,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.786184 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de4040a7e11bf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across 
pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.260879807 +0000 UTC m=+0.835624657,LastTimestamp:2026-03-18 09:02:14.260879807 +0000 UTC m=+0.835624657,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.792546 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048cae3c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048cae3c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161174076 +0000 UTC m=+0.735918916,LastTimestamp:2026-03-18 09:02:14.287951336 +0000 UTC m=+0.862696176,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.797019 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d2ded\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d2ded default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161206765 +0000 UTC m=+0.735951605,LastTimestamp:2026-03-18 
09:02:14.287973846 +0000 UTC m=+0.862718686,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.801527 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d5de5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d5de5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161219045 +0000 UTC m=+0.735963875,LastTimestamp:2026-03-18 09:02:14.287983395 +0000 UTC m=+0.862728225,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.805515 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048cae3c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048cae3c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161174076 +0000 UTC m=+0.735918916,LastTimestamp:2026-03-18 09:02:14.289061094 +0000 UTC m=+0.863805934,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.812426 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d2ded\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d2ded default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161206765 +0000 UTC m=+0.735951605,LastTimestamp:2026-03-18 09:02:14.289077003 +0000 UTC m=+0.863821843,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.816529 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d5de5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d5de5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161219045 +0000 UTC m=+0.735963875,LastTimestamp:2026-03-18 09:02:14.289085523 +0000 UTC m=+0.863830363,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.821595 4778 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189de404048cae3c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048cae3c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161174076 +0000 UTC m=+0.735918916,LastTimestamp:2026-03-18 09:02:14.28983492 +0000 UTC m=+0.864579760,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.826416 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d2ded\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d2ded default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161206765 +0000 UTC m=+0.735951605,LastTimestamp:2026-03-18 09:02:14.2898511 +0000 UTC m=+0.864595940,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.833670 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d5de5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group 
\"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d5de5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161219045 +0000 UTC m=+0.735963875,LastTimestamp:2026-03-18 09:02:14.289864659 +0000 UTC m=+0.864609499,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.840580 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048cae3c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048cae3c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161174076 +0000 UTC m=+0.735918916,LastTimestamp:2026-03-18 09:02:14.290140431 +0000 UTC m=+0.864885301,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.845222 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d2ded\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d2ded default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161206765 +0000 UTC m=+0.735951605,LastTimestamp:2026-03-18 09:02:14.29016374 +0000 UTC m=+0.864908610,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.851934 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d5de5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d5de5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161219045 +0000 UTC m=+0.735963875,LastTimestamp:2026-03-18 09:02:14.29018156 +0000 UTC m=+0.864926430,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.858661 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048cae3c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048cae3c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161174076 +0000 UTC m=+0.735918916,LastTimestamp:2026-03-18 09:02:14.291121641 +0000 UTC m=+0.865866481,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.865889 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d2ded\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d2ded default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161206765 +0000 UTC m=+0.735951605,LastTimestamp:2026-03-18 09:02:14.291133341 +0000 UTC m=+0.865878181,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.871221 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d5de5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d5de5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161219045 +0000 UTC 
m=+0.735963875,LastTimestamp:2026-03-18 09:02:14.291141271 +0000 UTC m=+0.865886101,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.877270 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048cae3c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048cae3c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161174076 +0000 UTC m=+0.735918916,LastTimestamp:2026-03-18 09:02:14.293854019 +0000 UTC m=+0.868598879,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.883536 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d2ded\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d2ded default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161206765 +0000 UTC m=+0.735951605,LastTimestamp:2026-03-18 09:02:14.293887158 +0000 UTC m=+0.868632008,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.887650 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d5de5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d5de5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161219045 +0000 UTC m=+0.735963875,LastTimestamp:2026-03-18 09:02:14.293908458 +0000 UTC m=+0.868653318,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.891823 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048cae3c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048cae3c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161174076 +0000 UTC m=+0.735918916,LastTimestamp:2026-03-18 09:02:14.294343255 +0000 UTC m=+0.869088095,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.896780 4778 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189de404048d2ded\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189de404048d2ded default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.161206765 +0000 UTC m=+0.735951605,LastTimestamp:2026-03-18 09:02:14.294362894 +0000 UTC m=+0.869107734,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.902526 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189de40424802223 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.697222691 +0000 UTC m=+1.271967541,LastTimestamp:2026-03-18 09:02:14.697222691 +0000 UTC m=+1.271967541,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.907336 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189de4042492af88 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.698438536 +0000 UTC m=+1.273183386,LastTimestamp:2026-03-18 09:02:14.698438536 +0000 UTC m=+1.273183386,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.911045 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de40424d911bd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.703051197 +0000 UTC m=+1.277796047,LastTimestamp:2026-03-18 09:02:14.703051197 +0000 UTC m=+1.277796047,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.917380 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de404256e6fbc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.712840124 +0000 UTC m=+1.287584954,LastTimestamp:2026-03-18 09:02:14.712840124 +0000 UTC m=+1.287584954,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.921982 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189de4042590f1d8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:14.715101656 +0000 UTC m=+1.289846506,LastTimestamp:2026-03-18 09:02:14.715101656 +0000 UTC m=+1.289846506,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.926605 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de40447d87c6b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.290215531 +0000 UTC m=+1.864960371,LastTimestamp:2026-03-18 09:02:15.290215531 +0000 UTC m=+1.864960371,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 
09:03:19.931301 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de40447e253c1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.290860481 +0000 UTC m=+1.865605361,LastTimestamp:2026-03-18 09:02:15.290860481 +0000 UTC m=+1.865605361,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.935898 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de40447e24e5d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.290859101 +0000 UTC m=+1.865603941,LastTimestamp:2026-03-18 09:02:15.290859101 +0000 UTC m=+1.865603941,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.940609 4778 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189de40447e34ecd openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.290924749 +0000 UTC m=+1.865669589,LastTimestamp:2026-03-18 09:02:15.290924749 +0000 UTC m=+1.865669589,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.944946 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189de40447e4e5a0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.291028896 +0000 UTC m=+1.865773736,LastTimestamp:2026-03-18 09:02:15.291028896 +0000 UTC m=+1.865773736,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.951539 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189de404489d0584 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.303095684 +0000 UTC m=+1.877840534,LastTimestamp:2026-03-18 09:02:15.303095684 +0000 UTC m=+1.877840534,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.957878 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de40448d20cf9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.306571001 +0000 UTC m=+1.881315851,LastTimestamp:2026-03-18 09:02:15.306571001 +0000 UTC 
m=+1.881315851,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.962477 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de40448d30b43 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.306636099 +0000 UTC m=+1.881380939,LastTimestamp:2026-03-18 09:02:15.306636099 +0000 UTC m=+1.881380939,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.967980 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189de40448d6a152 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.306871122 +0000 UTC 
m=+1.881615972,LastTimestamp:2026-03-18 09:02:15.306871122 +0000 UTC m=+1.881615972,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.973659 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de40448e2d658 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.307671128 +0000 UTC m=+1.882415978,LastTimestamp:2026-03-18 09:02:15.307671128 +0000 UTC m=+1.882415978,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.980598 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de40448f7c693 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.309043347 +0000 UTC m=+1.883788197,LastTimestamp:2026-03-18 09:02:15.309043347 +0000 UTC m=+1.883788197,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.987448 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de4045c2c6264 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.631258212 +0000 UTC m=+2.206003122,LastTimestamp:2026-03-18 09:02:15.631258212 +0000 UTC m=+2.206003122,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:19 crc kubenswrapper[4778]: E0318 09:03:19.993562 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189de4045ce870c1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.643582657 +0000 UTC m=+2.218327537,LastTimestamp:2026-03-18 09:02:15.643582657 +0000 UTC m=+2.218327537,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.000441 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de4045cfd2478 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.644939384 +0000 UTC m=+2.219684254,LastTimestamp:2026-03-18 09:02:15.644939384 +0000 UTC m=+2.219684254,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.006406 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de4046a8ff938 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.872665912 +0000 UTC m=+2.447410762,LastTimestamp:2026-03-18 09:02:15.872665912 +0000 UTC m=+2.447410762,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.011229 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de4046b3f1f0b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container 
kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.884144395 +0000 UTC m=+2.458889245,LastTimestamp:2026-03-18 09:02:15.884144395 +0000 UTC m=+2.458889245,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.015973 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de4046b5aaf90 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.885950864 +0000 UTC m=+2.460695744,LastTimestamp:2026-03-18 09:02:15.885950864 +0000 UTC m=+2.460695744,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.020827 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de4047919a255 openshift-kube-controller-manager 
0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.116568661 +0000 UTC m=+2.691313581,LastTimestamp:2026-03-18 09:02:16.116568661 +0000 UTC m=+2.691313581,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.022595 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de40479f098b0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.130656432 +0000 UTC m=+2.705401272,LastTimestamp:2026-03-18 09:02:16.130656432 +0000 UTC m=+2.705401272,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.027801 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189de4047e87721f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.207651359 +0000 UTC m=+2.782396199,LastTimestamp:2026-03-18 09:02:16.207651359 +0000 UTC m=+2.782396199,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.033327 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de4047e907b3c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.208243516 +0000 UTC m=+2.782988396,LastTimestamp:2026-03-18 09:02:16.208243516 +0000 UTC 
m=+2.782988396,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.038769 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189de4047ea52237 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.209596983 +0000 UTC m=+2.784341863,LastTimestamp:2026-03-18 09:02:16.209596983 +0000 UTC m=+2.784341863,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.043364 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de4047ee73632 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.213927474 +0000 UTC m=+2.788672344,LastTimestamp:2026-03-18 09:02:16.213927474 +0000 UTC m=+2.788672344,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.049433 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189de4048c3c5b4e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.437611342 +0000 UTC m=+3.012356182,LastTimestamp:2026-03-18 09:02:16.437611342 +0000 UTC m=+3.012356182,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.054536 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de4048c8347ca openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.442259402 +0000 UTC m=+3.017004252,LastTimestamp:2026-03-18 09:02:16.442259402 +0000 UTC m=+3.017004252,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.068942 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189de4048ce9a67e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.448968318 +0000 UTC m=+3.023713158,LastTimestamp:2026-03-18 09:02:16.448968318 +0000 UTC m=+3.023713158,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.076853 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de4048d962f7e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.460275582 +0000 UTC m=+3.035020422,LastTimestamp:2026-03-18 09:02:16.460275582 +0000 UTC m=+3.035020422,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.083098 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189de4048dba4b6b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.462642027 +0000 UTC m=+3.037386867,LastTimestamp:2026-03-18 09:02:16.462642027 +0000 UTC m=+3.037386867,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.087871 4778 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189de4048dc42298 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.463286936 +0000 UTC m=+3.038031776,LastTimestamp:2026-03-18 09:02:16.463286936 +0000 UTC m=+3.038031776,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.091783 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189de4048de52811 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.465451025 +0000 UTC m=+3.040195865,LastTimestamp:2026-03-18 09:02:16.465451025 +0000 UTC 
m=+3.040195865,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.095598 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de4048eef2c52 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.48288469 +0000 UTC m=+3.057629530,LastTimestamp:2026-03-18 09:02:16.48288469 +0000 UTC m=+3.057629530,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.101125 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de4048efc041b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already 
present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.483726363 +0000 UTC m=+3.058471213,LastTimestamp:2026-03-18 09:02:16.483726363 +0000 UTC m=+3.058471213,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: I0318 09:03:20.105483 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.105606 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de4048f243812 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.486361106 +0000 UTC m=+3.061105946,LastTimestamp:2026-03-18 09:02:16.486361106 +0000 UTC m=+3.061105946,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.109671 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189de4049952146c 
openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.657138796 +0000 UTC m=+3.231883636,LastTimestamp:2026-03-18 09:02:16.657138796 +0000 UTC m=+3.231883636,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.113244 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de40499e522bd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.666776253 +0000 UTC m=+3.241521093,LastTimestamp:2026-03-18 09:02:16.666776253 +0000 UTC m=+3.241521093,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.117053 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189de4049a1a8e51 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.670277201 +0000 UTC m=+3.245022041,LastTimestamp:2026-03-18 09:02:16.670277201 +0000 UTC m=+3.245022041,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.120552 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189de4049a27a027 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.671133735 +0000 UTC m=+3.245878575,LastTimestamp:2026-03-18 09:02:16.671133735 +0000 UTC m=+3.245878575,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" 
Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.125160 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de4049c0b6c89 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.702839945 +0000 UTC m=+3.277584785,LastTimestamp:2026-03-18 09:02:16.702839945 +0000 UTC m=+3.277584785,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.127100 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de4049c2105ca openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.704255434 +0000 UTC 
m=+3.279000274,LastTimestamp:2026-03-18 09:02:16.704255434 +0000 UTC m=+3.279000274,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.132799 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189de404a4fab8b4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.852740276 +0000 UTC m=+3.427485116,LastTimestamp:2026-03-18 09:02:16.852740276 +0000 UTC m=+3.427485116,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.138607 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189de404a5b9118e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started 
container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.865214862 +0000 UTC m=+3.439959702,LastTimestamp:2026-03-18 09:02:16.865214862 +0000 UTC m=+3.439959702,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.144587 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de404a72f62e9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.889746153 +0000 UTC m=+3.464490993,LastTimestamp:2026-03-18 09:02:16.889746153 +0000 UTC m=+3.464490993,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.151227 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de404a8534dee openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.908877294 +0000 UTC m=+3.483622134,LastTimestamp:2026-03-18 09:02:16.908877294 +0000 UTC m=+3.483622134,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.155676 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de404a8682672 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:16.910243442 +0000 UTC m=+3.484988282,LastTimestamp:2026-03-18 09:02:16.910243442 +0000 UTC m=+3.484988282,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.159676 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de404b1a88a19 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:17.065458201 +0000 UTC m=+3.640203051,LastTimestamp:2026-03-18 09:02:17.065458201 +0000 UTC m=+3.640203051,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.164092 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de404b294f114 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:17.08095106 +0000 UTC m=+3.655695890,LastTimestamp:2026-03-18 09:02:17.08095106 +0000 UTC m=+3.655695890,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.167524 
4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de404b2a4ef42 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:17.08199917 +0000 UTC m=+3.656744010,LastTimestamp:2026-03-18 09:02:17.08199917 +0000 UTC m=+3.656744010,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.172113 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de404bc639bf8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:17.245490168 +0000 UTC 
m=+3.820235038,LastTimestamp:2026-03-18 09:02:17.245490168 +0000 UTC m=+3.820235038,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.176701 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de404be39bb3d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:17.276300093 +0000 UTC m=+3.851044923,LastTimestamp:2026-03-18 09:02:17.276300093 +0000 UTC m=+3.851044923,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.182097 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de404bec744be openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container 
kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:17.28557587 +0000 UTC m=+3.860320710,LastTimestamp:2026-03-18 09:02:17.28557587 +0000 UTC m=+3.860320710,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.187128 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de404c85827ea openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:17.446066154 +0000 UTC m=+4.020810994,LastTimestamp:2026-03-18 09:02:17.446066154 +0000 UTC m=+4.020810994,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.192142 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de404c92d86b9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container 
etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:17.460049593 +0000 UTC m=+4.034794433,LastTimestamp:2026-03-18 09:02:17.460049593 +0000 UTC m=+4.034794433,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.196869 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de404f805ad4a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:18.245967178 +0000 UTC m=+4.820712018,LastTimestamp:2026-03-18 09:02:18.245967178 +0000 UTC m=+4.820712018,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.204860 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de40504d2f3f4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:18.46074674 +0000 UTC m=+5.035491580,LastTimestamp:2026-03-18 09:02:18.46074674 +0000 UTC m=+5.035491580,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.211037 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de40505837167 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:18.472313191 +0000 UTC m=+5.047058031,LastTimestamp:2026-03-18 09:02:18.472313191 +0000 UTC m=+5.047058031,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.217488 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de4050595766c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:18.473494124 +0000 UTC m=+5.048238974,LastTimestamp:2026-03-18 09:02:18.473494124 +0000 UTC m=+5.048238974,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.222519 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de40510cdd583 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:18.661737859 +0000 UTC m=+5.236482699,LastTimestamp:2026-03-18 09:02:18.661737859 +0000 UTC m=+5.236482699,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.227609 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de405119c5cfb openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:18.675272955 +0000 UTC m=+5.250017795,LastTimestamp:2026-03-18 09:02:18.675272955 +0000 UTC m=+5.250017795,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.230400 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de40511af672e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:18.67652075 +0000 UTC m=+5.251265590,LastTimestamp:2026-03-18 09:02:18.67652075 +0000 UTC m=+5.251265590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.234508 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189de4051e23e41b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:18.885481499 +0000 UTC m=+5.460226329,LastTimestamp:2026-03-18 09:02:18.885481499 +0000 UTC m=+5.460226329,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.241562 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de4051f17d6de openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:18.901468894 +0000 UTC m=+5.476213764,LastTimestamp:2026-03-18 09:02:18.901468894 +0000 UTC m=+5.476213764,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.248520 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de4051f306edd openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:18.903080669 +0000 UTC m=+5.477825509,LastTimestamp:2026-03-18 09:02:18.903080669 +0000 UTC m=+5.477825509,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.255725 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de4052db6764e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:19.146745422 +0000 UTC m=+5.721490272,LastTimestamp:2026-03-18 09:02:19.146745422 +0000 UTC m=+5.721490272,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.262352 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189de4052eb43dbb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:19.163377083 +0000 UTC m=+5.738121933,LastTimestamp:2026-03-18 09:02:19.163377083 +0000 UTC m=+5.738121933,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.268934 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de4052ec8305c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:19.16468438 +0000 UTC m=+5.739429230,LastTimestamp:2026-03-18 09:02:19.16468438 +0000 UTC m=+5.739429230,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.276665 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de4053ba7950b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:19.380651275 +0000 UTC m=+5.955396105,LastTimestamp:2026-03-18 09:02:19.380651275 +0000 UTC m=+5.955396105,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.280646 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189de4053cd659b7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:19.400493495 +0000 UTC m=+5.975238335,LastTimestamp:2026-03-18 09:02:19.400493495 +0000 UTC m=+5.975238335,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.290718 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 09:03:20 crc 
kubenswrapper[4778]: &Event{ObjectMeta:{kube-controller-manager-crc.189de40690a67ac0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 18 09:03:20 crc kubenswrapper[4778]: body: Mar 18 09:03:20 crc kubenswrapper[4778]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:25.101609664 +0000 UTC m=+11.676354544,LastTimestamp:2026-03-18 09:02:25.101609664 +0000 UTC m=+11.676354544,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 09:03:20 crc kubenswrapper[4778]: > Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.295340 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de40690a7fba5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:25.101708197 +0000 UTC 
m=+11.676453077,LastTimestamp:2026-03-18 09:02:25.101708197 +0000 UTC m=+11.676453077,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.299158 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 09:03:20 crc kubenswrapper[4778]: &Event{ObjectMeta:{kube-apiserver-crc.189de4074509819d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:38612->192.168.126.11:17697: read: connection reset by peer Mar 18 09:03:20 crc kubenswrapper[4778]: body: Mar 18 09:03:20 crc kubenswrapper[4778]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:28.127998365 +0000 UTC m=+14.702743255,LastTimestamp:2026-03-18 09:02:28.127998365 +0000 UTC m=+14.702743255,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 09:03:20 crc kubenswrapper[4778]: > Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.305155 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de407450ab9f7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38612->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:28.128078327 +0000 UTC m=+14.702823207,LastTimestamp:2026-03-18 09:02:28.128078327 +0000 UTC m=+14.702823207,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.313020 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189de404b2a4ef42\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de404b2a4ef42 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:17.08199917 +0000 UTC m=+3.656744010,LastTimestamp:2026-03-18 09:02:28.310736445 +0000 UTC m=+14.885481325,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: 
E0318 09:03:20.320453 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189de404be39bb3d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de404be39bb3d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:17.276300093 +0000 UTC m=+3.851044923,LastTimestamp:2026-03-18 09:02:28.57501725 +0000 UTC m=+15.149762090,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.326864 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189de404bec744be\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de404bec744be openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:17.28557587 +0000 UTC m=+3.860320710,LastTimestamp:2026-03-18 09:02:28.584920166 +0000 UTC 
m=+15.159665016,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.334245 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 09:03:20 crc kubenswrapper[4778]: &Event{ObjectMeta:{kube-apiserver-crc.189de407a443dec6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 18 09:03:20 crc kubenswrapper[4778]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 09:03:20 crc kubenswrapper[4778]: Mar 18 09:03:20 crc kubenswrapper[4778]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:29.725658822 +0000 UTC m=+16.300403692,LastTimestamp:2026-03-18 09:02:29.725658822 +0000 UTC m=+16.300403692,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 09:03:20 crc kubenswrapper[4778]: > Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.338879 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de407a446efae openshift-kube-apiserver 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:29.725859758 +0000 UTC m=+16.300604628,LastTimestamp:2026-03-18 09:02:29.725859758 +0000 UTC m=+16.300604628,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.346276 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 09:03:20 crc kubenswrapper[4778]: &Event{ObjectMeta:{kube-controller-manager-crc.189de408e4b15262 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 09:03:20 crc kubenswrapper[4778]: body: Mar 18 09:03:20 crc kubenswrapper[4778]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:35.101540962 +0000 UTC m=+21.676285832,LastTimestamp:2026-03-18 09:02:35.101540962 +0000 UTC m=+21.676285832,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 09:03:20 crc kubenswrapper[4778]: > Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.353393 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de408e4b24d22 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:35.101605154 +0000 UTC m=+21.676350024,LastTimestamp:2026-03-18 09:02:35.101605154 +0000 UTC m=+21.676350024,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.363600 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189de408e4b15262\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 09:03:20 crc kubenswrapper[4778]: &Event{ObjectMeta:{kube-controller-manager-crc.189de408e4b15262 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 09:03:20 crc kubenswrapper[4778]: body: Mar 18 09:03:20 crc kubenswrapper[4778]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:35.101540962 +0000 UTC m=+21.676285832,LastTimestamp:2026-03-18 09:02:45.100912731 +0000 UTC m=+31.675657601,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 09:03:20 crc kubenswrapper[4778]: > Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.373605 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189de408e4b24d22\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de408e4b24d22 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:35.101605154 +0000 UTC m=+21.676350024,LastTimestamp:2026-03-18 09:02:45.100975453 
+0000 UTC m=+31.675720333,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.381548 4778 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de40b38e3b9a8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:45.104064936 +0000 UTC m=+31.678809836,LastTimestamp:2026-03-18 09:02:45.104064936 +0000 UTC m=+31.678809836,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.389652 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189de40448e2d658\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de40448e2d658 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.307671128 +0000 UTC m=+1.882415978,LastTimestamp:2026-03-18 09:02:45.225294967 +0000 UTC m=+31.800039837,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.397136 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189de4045c2c6264\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de4045c2c6264 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.631258212 +0000 UTC m=+2.206003122,LastTimestamp:2026-03-18 09:02:45.463167835 +0000 UTC m=+32.037912685,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.404193 4778 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189de4045ce870c1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de4045ce870c1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:15.643582657 +0000 UTC m=+2.218327537,LastTimestamp:2026-03-18 09:02:45.472340251 +0000 UTC m=+32.047085101,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.414361 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189de408e4b15262\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 09:03:20 crc kubenswrapper[4778]: &Event{ObjectMeta:{kube-controller-manager-crc.189de408e4b15262 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 
18 09:03:20 crc kubenswrapper[4778]: body: Mar 18 09:03:20 crc kubenswrapper[4778]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:35.101540962 +0000 UTC m=+21.676285832,LastTimestamp:2026-03-18 09:02:55.100684971 +0000 UTC m=+41.675429831,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 09:03:20 crc kubenswrapper[4778]: > Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.421089 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189de408e4b24d22\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de408e4b24d22 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:35.101605154 +0000 UTC m=+21.676350024,LastTimestamp:2026-03-18 09:02:55.100766593 +0000 UTC m=+41.675511473,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:03:20 crc kubenswrapper[4778]: E0318 09:03:20.430446 4778 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189de408e4b15262\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" 
in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 09:03:20 crc kubenswrapper[4778]: &Event{ObjectMeta:{kube-controller-manager-crc.189de408e4b15262 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 09:03:20 crc kubenswrapper[4778]: body: Mar 18 09:03:20 crc kubenswrapper[4778]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:02:35.101540962 +0000 UTC m=+21.676285832,LastTimestamp:2026-03-18 09:03:05.100995927 +0000 UTC m=+51.675740777,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 09:03:20 crc kubenswrapper[4778]: > Mar 18 09:03:21 crc kubenswrapper[4778]: I0318 09:03:21.108458 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:22 crc kubenswrapper[4778]: I0318 09:03:22.101083 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:03:22 crc kubenswrapper[4778]: I0318 09:03:22.101389 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:22 crc kubenswrapper[4778]: I0318 09:03:22.107219 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 18 09:03:22 crc kubenswrapper[4778]: I0318 09:03:22.107250 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:22 crc kubenswrapper[4778]: I0318 09:03:22.107267 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:22 crc kubenswrapper[4778]: I0318 09:03:22.111937 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:22 crc kubenswrapper[4778]: I0318 09:03:22.116015 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:03:22 crc kubenswrapper[4778]: I0318 09:03:22.551344 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:22 crc kubenswrapper[4778]: I0318 09:03:22.552587 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:22 crc kubenswrapper[4778]: I0318 09:03:22.552636 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:22 crc kubenswrapper[4778]: I0318 09:03:22.552652 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:23 crc kubenswrapper[4778]: I0318 09:03:23.107278 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:23 crc kubenswrapper[4778]: I0318 09:03:23.426424 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:03:23 crc kubenswrapper[4778]: I0318 09:03:23.426741 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:23 crc kubenswrapper[4778]: I0318 09:03:23.429529 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:23 crc kubenswrapper[4778]: I0318 09:03:23.429574 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:23 crc kubenswrapper[4778]: I0318 09:03:23.429591 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:23 crc kubenswrapper[4778]: I0318 09:03:23.430355 4778 scope.go:117] "RemoveContainer" containerID="721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e" Mar 18 09:03:23 crc kubenswrapper[4778]: E0318 09:03:23.430653 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 09:03:24 crc kubenswrapper[4778]: I0318 09:03:24.106341 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:24 crc kubenswrapper[4778]: E0318 09:03:24.269276 4778 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 09:03:25 crc kubenswrapper[4778]: I0318 09:03:25.107136 4778 csi_plugin.go:884] Failed to contact 
API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:25 crc kubenswrapper[4778]: E0318 09:03:25.157160 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 09:03:25 crc kubenswrapper[4778]: I0318 09:03:25.170777 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:25 crc kubenswrapper[4778]: I0318 09:03:25.172113 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:25 crc kubenswrapper[4778]: I0318 09:03:25.172157 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:25 crc kubenswrapper[4778]: I0318 09:03:25.172172 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:25 crc kubenswrapper[4778]: I0318 09:03:25.172214 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:03:25 crc kubenswrapper[4778]: E0318 09:03:25.176512 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 09:03:26 crc kubenswrapper[4778]: I0318 09:03:26.107955 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:26 crc kubenswrapper[4778]: I0318 09:03:26.186867 4778 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:26 crc kubenswrapper[4778]: I0318 09:03:26.188567 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:26 crc kubenswrapper[4778]: I0318 09:03:26.188698 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:26 crc kubenswrapper[4778]: I0318 09:03:26.188775 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:26 crc kubenswrapper[4778]: I0318 09:03:26.794344 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:03:26 crc kubenswrapper[4778]: I0318 09:03:26.794886 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:26 crc kubenswrapper[4778]: I0318 09:03:26.796259 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:26 crc kubenswrapper[4778]: I0318 09:03:26.796290 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:26 crc kubenswrapper[4778]: I0318 09:03:26.796299 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:27 crc kubenswrapper[4778]: I0318 09:03:27.092223 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 09:03:27 crc kubenswrapper[4778]: I0318 09:03:27.106398 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:27 crc 
kubenswrapper[4778]: I0318 09:03:27.109742 4778 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 09:03:27 crc kubenswrapper[4778]: W0318 09:03:27.416140 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 18 09:03:27 crc kubenswrapper[4778]: E0318 09:03:27.416524 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 18 09:03:28 crc kubenswrapper[4778]: I0318 09:03:28.107958 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 09:03:29 crc kubenswrapper[4778]: I0318 09:03:29.058928 4778 csr.go:261] certificate signing request csr-cbhwp is approved, waiting to be issued Mar 18 09:03:29 crc kubenswrapper[4778]: I0318 09:03:29.067231 4778 csr.go:257] certificate signing request csr-cbhwp is issued Mar 18 09:03:29 crc kubenswrapper[4778]: I0318 09:03:29.120784 4778 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 18 09:03:29 crc kubenswrapper[4778]: I0318 09:03:29.707686 4778 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 18 09:03:29 crc kubenswrapper[4778]: I0318 09:03:29.939011 4778 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 18 09:03:29 crc kubenswrapper[4778]: W0318 09:03:29.939282 4778 reflector.go:484] k8s.io/client-go/informers/factory.go:160: 
watch of *v1.Node ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.068851 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-18 13:20:11.739375083 +0000 UTC Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.068902 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 5884h16m41.670477987s for next certificate rotation Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.118237 4778 apiserver.go:52] "Watching apiserver" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.127624 4778 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.127907 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.128481 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.128542 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.128562 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.128665 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.128797 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.128825 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.129098 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.129424 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.129480 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.131017 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.131081 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.131615 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.131767 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.132607 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.132697 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.133765 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.134117 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.134145 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.166931 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.182096 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.196566 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.204146 4778 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.213380 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.228517 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.228747 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.228817 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.228858 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.228903 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.228940 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.228980 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229014 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229048 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229129 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229168 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229237 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229286 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229293 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229323 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229359 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229399 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229432 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229467 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229506 4778 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229545 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229584 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229619 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229656 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229690 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229724 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229758 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229791 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229841 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229897 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229910 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229938 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.229948 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230019 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230078 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230138 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" 
(UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230155 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230190 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230288 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230298 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230339 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230344 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230390 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230443 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230484 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230519 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230525 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230584 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230613 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230639 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230661 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230687 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230731 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230753 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230776 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230799 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230821 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230841 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230862 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230873 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230886 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230950 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.230989 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231029 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231067 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231090 4778 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231102 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231138 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231122 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231175 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231246 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231285 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231319 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231353 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231392 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231433 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231469 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231504 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231538 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231541 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231594 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231610 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231618 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231667 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231693 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231716 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231742 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231735 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231771 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231796 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231817 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231829 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231841 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231864 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231864 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231889 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231913 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231937 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231961 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231943 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.231985 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232013 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232034 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232038 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232080 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232105 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232129 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232149 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232171 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232218 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232244 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232268 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232292 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232315 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: 
\"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232338 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232360 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232386 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232406 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232429 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232452 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232484 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232507 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232529 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232550 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232570 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232591 4778 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232611 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232639 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232660 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232681 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232702 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232711 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232724 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232779 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232808 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232837 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232862 4778 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232890 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232918 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232945 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232972 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232983 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod 
"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232997 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.232999 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233024 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233052 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233107 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233131 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233152 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233174 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233219 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233243 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233268 4778 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233291 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233315 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233336 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233359 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233382 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233390 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233404 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233428 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233451 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233474 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233495 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233518 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233541 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233563 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233587 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233610 4778 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233632 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233654 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233677 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233701 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233723 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233743 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233768 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233792 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233817 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233841 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233863 4778 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233885 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233881 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233913 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233899 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233936 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.233980 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234028 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234098 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234235 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:03:30 crc 
kubenswrapper[4778]: I0318 09:03:30.234302 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234360 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234417 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234471 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234532 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234576 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234613 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234628 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234634 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234659 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234709 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234715 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234760 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234763 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234822 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234851 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234846 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234915 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235256 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.234878 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235391 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235467 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235527 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235710 4778 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235775 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235831 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235887 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235938 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235992 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236049 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236106 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236164 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236265 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236328 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236383 4778 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236449 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236505 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236557 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236617 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236719 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236783 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236831 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236889 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236978 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237036 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237127 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237243 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237310 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237372 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 09:03:30 crc kubenswrapper[4778]: 
I0318 09:03:30.237418 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237460 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237502 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237550 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237643 4778 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237672 4778 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237697 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237721 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237743 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237766 4778 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237790 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237812 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237833 4778 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237854 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237874 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237896 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237918 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237939 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237960 4778 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237981 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: 
I0318 09:03:30.238002 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238026 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238052 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238087 4778 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238109 4778 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238136 4778 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238169 4778 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238239 4778 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238266 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238288 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238310 4778 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238334 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238356 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238385 4778 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238407 4778 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" 
(UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238433 4778 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235375 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235466 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235586 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235954 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.235982 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236097 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.236135 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237295 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.237480 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238100 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238253 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.238635 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.239001 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.239676 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.240398 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.240440 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.240561 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.241055 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.241242 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.241271 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.241318 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.241339 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.241346 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.241522 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:03:30.741477675 +0000 UTC m=+77.316222565 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.243715 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.244165 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.244520 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.244568 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.244731 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.244970 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.245456 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.245707 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.245799 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.245819 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.245904 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.245550 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.246070 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.246096 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.246432 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.246488 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.246515 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.246566 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.246588 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.246925 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.246981 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.247042 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.247012 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.247168 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.247481 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.247506 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.247647 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.248453 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.249777 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.249792 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.250137 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.250401 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.250458 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.250517 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.250721 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.250779 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.250900 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.251082 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.251260 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.251283 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.251433 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.251592 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.251916 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.252806 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.252845 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.252991 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.253365 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.253400 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.253417 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.253842 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.254295 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.254514 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.254743 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.254855 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.255317 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.256499 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.256688 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.256793 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.257391 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.257457 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.257824 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.258305 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.258094 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.258648 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.258905 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.259009 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:30.75896998 +0000 UTC m=+77.333714830 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.259044 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.259566 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.260052 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.260065 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.259651 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.259710 4778 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.260836 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.261341 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.261436 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.262444 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.263245 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.263646 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.263917 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.264136 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.264163 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.264589 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.264750 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.265575 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.267119 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.267378 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.267653 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.267976 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.268005 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.268087 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:30.768070401 +0000 UTC m=+77.342815251 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.268128 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.268419 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.268612 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.268863 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.269711 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.269920 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.270113 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.270151 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.270803 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.271507 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.273457 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.273505 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.273532 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.273659 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:30.773627819 +0000 UTC m=+77.348372699 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.277217 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.280037 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.282399 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.282439 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.282462 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.282545 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:30.782519956 +0000 UTC m=+77.357264806 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.283056 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.283260 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.283758 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.285066 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.285536 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.286017 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.286089 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.286040 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.286477 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.286382 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.286457 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.287096 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.287145 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.287170 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.287446 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.287236 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.287872 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.288059 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.288566 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.289370 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.289846 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.290549 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.290790 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.290943 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.291412 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.291625 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.291950 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.292087 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.292130 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.292609 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.292707 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.293494 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.293496 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.293531 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.293843 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.293908 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.293969 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.294008 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.294163 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.294463 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.294850 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.294967 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.295445 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.295457 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.295637 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.297629 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.298209 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.307614 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.312572 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.322054 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.327371 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.329848 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.332049 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339319 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339383 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339475 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339493 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339507 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339523 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339536 4778 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339550 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339563 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339576 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339592 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339608 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339624 4778 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339637 4778 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339649 4778 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339662 4778 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339675 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339687 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339591 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339700 4778 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339754 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339761 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339817 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339831 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339845 4778 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339862 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339880 4778 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339897 4778 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339911 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339923 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339937 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339950 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339964 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339979 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.339992 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340005 4778 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340018 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340030 4778 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340043 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340055 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340070 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340084 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" 
DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340100 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340114 4778 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340128 4778 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340141 4778 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340154 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340166 4778 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340179 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc 
kubenswrapper[4778]: I0318 09:03:30.340192 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340227 4778 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340240 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340253 4778 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340265 4778 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340278 4778 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340290 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340302 4778 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340316 4778 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340329 4778 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340341 4778 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340355 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340368 4778 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340379 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340393 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340409 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340422 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340434 4778 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340447 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340459 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340470 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340482 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" 
DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340494 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340505 4778 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340517 4778 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340529 4778 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340541 4778 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340553 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340566 4778 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 
09:03:30.340578 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340591 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340606 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340620 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340632 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340644 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340656 4778 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340669 4778 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340681 4778 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340693 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340707 4778 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340722 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340736 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340751 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340764 4778 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340777 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340791 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340803 4778 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340815 4778 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340828 4778 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340839 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340852 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340868 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340886 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340900 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340912 4778 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340924 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340937 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340949 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: 
\"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340961 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340973 4778 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340986 4778 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.340999 4778 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341011 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341022 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341034 4778 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341045 4778 
reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341057 4778 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341069 4778 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341080 4778 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341091 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341103 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341147 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341160 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341172 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341184 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341214 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341227 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341242 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341254 4778 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341267 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on 
node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341278 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341290 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341302 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341315 4778 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341328 4778 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341340 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341354 4778 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341366 4778 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341379 4778 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341392 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341404 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341417 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341431 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341444 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341456 4778 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341469 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341481 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341494 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341506 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341518 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341531 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341543 4778 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341555 4778 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341567 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341616 4778 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341630 4778 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341644 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341658 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341671 4778 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341684 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341697 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341710 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341722 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341734 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341749 4778 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341761 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" 
DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.341772 4778 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.451103 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.469751 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.478016 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:30 crc kubenswrapper[4778]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 09:03:30 crc kubenswrapper[4778]: set -o allexport Mar 18 09:03:30 crc kubenswrapper[4778]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 09:03:30 crc kubenswrapper[4778]: source /etc/kubernetes/apiserver-url.env Mar 18 09:03:30 crc kubenswrapper[4778]: else Mar 18 09:03:30 crc kubenswrapper[4778]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 09:03:30 crc kubenswrapper[4778]: exit 1 Mar 18 09:03:30 crc kubenswrapper[4778]: fi Mar 18 09:03:30 crc kubenswrapper[4778]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 09:03:30 crc kubenswrapper[4778]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:30 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.479268 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.483456 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 09:03:30 crc kubenswrapper[4778]: W0318 09:03:30.485569 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-0a13c0f6f9114eaf03952727eeb70b44491688bc42826224aea18889d06b7473 WatchSource:0}: Error finding container 0a13c0f6f9114eaf03952727eeb70b44491688bc42826224aea18889d06b7473: Status 404 returned error can't find the container with id 0a13c0f6f9114eaf03952727eeb70b44491688bc42826224aea18889d06b7473 Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.491488 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:30 crc kubenswrapper[4778]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:03:30 crc kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then Mar 18 09:03:30 crc kubenswrapper[4778]: set -o allexport Mar 18 09:03:30 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:03:30 crc kubenswrapper[4778]: set +o allexport Mar 18 09:03:30 crc kubenswrapper[4778]: fi Mar 18 09:03:30 crc kubenswrapper[4778]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 18 09:03:30 crc kubenswrapper[4778]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 09:03:30 crc kubenswrapper[4778]: ho_enable="--enable-hybrid-overlay" Mar 18 09:03:30 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 09:03:30 crc kubenswrapper[4778]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 09:03:30 crc kubenswrapper[4778]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 09:03:30 crc kubenswrapper[4778]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 09:03:30 crc kubenswrapper[4778]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 09:03:30 crc kubenswrapper[4778]: --webhook-host=127.0.0.1 \ Mar 18 09:03:30 crc kubenswrapper[4778]: --webhook-port=9743 \ Mar 18 09:03:30 crc kubenswrapper[4778]: ${ho_enable} \ Mar 18 09:03:30 crc kubenswrapper[4778]: --enable-interconnect \ Mar 18 09:03:30 crc kubenswrapper[4778]: --disable-approver \ Mar 18 09:03:30 crc kubenswrapper[4778]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 09:03:30 crc kubenswrapper[4778]: --wait-for-kubernetes-api=200s \ Mar 18 09:03:30 crc kubenswrapper[4778]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 09:03:30 crc kubenswrapper[4778]: --loglevel="${LOGLEVEL}" Mar 18 09:03:30 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:30 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.495050 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:30 crc kubenswrapper[4778]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:03:30 crc 
kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then Mar 18 09:03:30 crc kubenswrapper[4778]: set -o allexport Mar 18 09:03:30 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:03:30 crc kubenswrapper[4778]: set +o allexport Mar 18 09:03:30 crc kubenswrapper[4778]: fi Mar 18 09:03:30 crc kubenswrapper[4778]: Mar 18 09:03:30 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 09:03:30 crc kubenswrapper[4778]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 09:03:30 crc kubenswrapper[4778]: --disable-webhook \ Mar 18 09:03:30 crc kubenswrapper[4778]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 09:03:30 crc kubenswrapper[4778]: --loglevel="${LOGLEVEL}" Mar 18 09:03:30 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:30 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:30 crc kubenswrapper[4778]: W0318 09:03:30.495661 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-b77fa638174657854f588becf949f187b6c3bc1b97dc647bce430433e94d60ff WatchSource:0}: Error finding container b77fa638174657854f588becf949f187b6c3bc1b97dc647bce430433e94d60ff: Status 404 returned error can't find the container with id 
b77fa638174657854f588becf949f187b6c3bc1b97dc647bce430433e94d60ff Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.496594 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.499392 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.500585 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.577022 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0a13c0f6f9114eaf03952727eeb70b44491688bc42826224aea18889d06b7473"} Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.579126 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"31e3fc40d1b9e93db517675fadb95d616ebb6f222bec7672ed9fd7398ad8be72"} Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.580169 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:30 crc kubenswrapper[4778]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:03:30 crc kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then Mar 18 09:03:30 crc kubenswrapper[4778]: set -o allexport Mar 18 09:03:30 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:03:30 crc kubenswrapper[4778]: set +o allexport Mar 18 09:03:30 crc kubenswrapper[4778]: fi Mar 18 09:03:30 crc kubenswrapper[4778]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 18 09:03:30 crc kubenswrapper[4778]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 09:03:30 crc kubenswrapper[4778]: ho_enable="--enable-hybrid-overlay" Mar 18 09:03:30 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 09:03:30 crc kubenswrapper[4778]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 09:03:30 crc kubenswrapper[4778]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 09:03:30 crc kubenswrapper[4778]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 09:03:30 crc kubenswrapper[4778]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 09:03:30 crc kubenswrapper[4778]: --webhook-host=127.0.0.1 \ Mar 18 09:03:30 crc kubenswrapper[4778]: --webhook-port=9743 \ Mar 18 09:03:30 crc kubenswrapper[4778]: ${ho_enable} \ Mar 18 09:03:30 crc kubenswrapper[4778]: --enable-interconnect \ Mar 18 09:03:30 crc kubenswrapper[4778]: --disable-approver \ Mar 18 09:03:30 crc kubenswrapper[4778]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 09:03:30 crc kubenswrapper[4778]: --wait-for-kubernetes-api=200s \ Mar 18 09:03:30 crc kubenswrapper[4778]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 09:03:30 crc kubenswrapper[4778]: --loglevel="${LOGLEVEL}" Mar 18 09:03:30 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:30 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.581770 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b77fa638174657854f588becf949f187b6c3bc1b97dc647bce430433e94d60ff"} Mar 18 09:03:30 crc kubenswrapper[4778]: 
E0318 09:03:30.582077 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:30 crc kubenswrapper[4778]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 09:03:30 crc kubenswrapper[4778]: set -o allexport Mar 18 09:03:30 crc kubenswrapper[4778]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 09:03:30 crc kubenswrapper[4778]: source /etc/kubernetes/apiserver-url.env Mar 18 09:03:30 crc kubenswrapper[4778]: else Mar 18 09:03:30 crc kubenswrapper[4778]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 09:03:30 crc kubenswrapper[4778]: exit 1 Mar 18 09:03:30 crc kubenswrapper[4778]: fi Mar 18 09:03:30 crc kubenswrapper[4778]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 09:03:30 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom
:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:30 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.583342 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.583794 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:30 crc kubenswrapper[4778]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:03:30 crc kubenswrapper[4778]: if [[ -f "/env/_master" 
]]; then Mar 18 09:03:30 crc kubenswrapper[4778]: set -o allexport Mar 18 09:03:30 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:03:30 crc kubenswrapper[4778]: set +o allexport Mar 18 09:03:30 crc kubenswrapper[4778]: fi Mar 18 09:03:30 crc kubenswrapper[4778]: Mar 18 09:03:30 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 09:03:30 crc kubenswrapper[4778]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 09:03:30 crc kubenswrapper[4778]: --disable-webhook \ Mar 18 09:03:30 crc kubenswrapper[4778]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 09:03:30 crc kubenswrapper[4778]: --loglevel="${LOGLEVEL}" Mar 18 09:03:30 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:30 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.584931 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.584981 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.586785 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.597599 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.614994 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.629720 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.646804 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.658461 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.668424 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.679957 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.691026 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.702642 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.714267 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.725906 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.738331 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.745524 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.745787 4778 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:03:31.745744932 +0000 UTC m=+78.320489782 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.846764 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.846815 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.846836 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:30 crc kubenswrapper[4778]: I0318 09:03:30.846855 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.847023 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.847073 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.847095 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.847119 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.847141 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:31.847114487 +0000 UTC m=+78.421859337 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.847159 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.847184 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.847130 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.847273 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:31.847253381 +0000 UTC m=+78.421998261 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.847304 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.847305 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:31.847290142 +0000 UTC m=+78.422035022 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:30 crc kubenswrapper[4778]: E0318 09:03:30.847353 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:31.847334933 +0000 UTC m=+78.422079773 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:03:31 crc kubenswrapper[4778]: I0318 09:03:31.758770 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:03:31 crc kubenswrapper[4778]: E0318 09:03:31.758969 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:03:33.758928231 +0000 UTC m=+80.333673101 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:03:31 crc kubenswrapper[4778]: I0318 09:03:31.859646 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:31 crc kubenswrapper[4778]: I0318 09:03:31.859744 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:31 crc kubenswrapper[4778]: I0318 09:03:31.859815 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:31 crc kubenswrapper[4778]: E0318 09:03:31.859863 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:03:31 crc 
kubenswrapper[4778]: E0318 09:03:31.859982 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:33.859947606 +0000 UTC m=+80.434692486 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:03:31 crc kubenswrapper[4778]: E0318 09:03:31.860000 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:03:31 crc kubenswrapper[4778]: E0318 09:03:31.860045 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:03:31 crc kubenswrapper[4778]: E0318 09:03:31.860074 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:33.860050979 +0000 UTC m=+80.434795859 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:03:31 crc kubenswrapper[4778]: E0318 09:03:31.860090 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:03:31 crc kubenswrapper[4778]: E0318 09:03:31.860115 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:31 crc kubenswrapper[4778]: E0318 09:03:31.860185 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:33.860161442 +0000 UTC m=+80.434906322 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:31 crc kubenswrapper[4778]: E0318 09:03:31.860235 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:03:31 crc kubenswrapper[4778]: E0318 09:03:31.860280 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:03:31 crc kubenswrapper[4778]: E0318 09:03:31.860314 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:31 crc kubenswrapper[4778]: I0318 09:03:31.859868 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:31 crc kubenswrapper[4778]: E0318 09:03:31.860373 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" 
failed. No retries permitted until 2026-03-18 09:03:33.860352977 +0000 UTC m=+80.435097857 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.176617 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.178812 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.178893 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.178919 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.179024 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.186904 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.186936 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:32 crc kubenswrapper[4778]: E0318 09:03:32.187070 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.187139 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:32 crc kubenswrapper[4778]: E0318 09:03:32.187383 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:32 crc kubenswrapper[4778]: E0318 09:03:32.187509 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.191990 4778 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.192183 4778 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.193971 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.194020 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.194037 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.194063 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.194084 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:32Z","lastTransitionTime":"2026-03-18T09:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.197822 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.199104 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.201468 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.203625 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.206312 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.208185 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.209747 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.212130 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.213912 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.216271 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.217655 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.220301 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.221713 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.223138 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: E0318 09:03:32.224232 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.225293 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.226559 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.229000 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.230080 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.231637 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.233937 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.234037 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.234133 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.234191 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.234278 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:32Z","lastTransitionTime":"2026-03-18T09:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.235180 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.236446 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.237988 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.239842 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.241683 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.243900 4778 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.245434 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.247930 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.248998 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.251364 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.252395 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.253839 4778 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.254122 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Mar 18 09:03:32 
crc kubenswrapper[4778]: E0318 09:03:32.256665 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.257728 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.258968 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.260099 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.263064 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.263142 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.263170 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.263239 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.263267 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:32Z","lastTransitionTime":"2026-03-18T09:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.265720 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.269284 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.270805 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.272624 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.275164 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.276442 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Mar 18 09:03:32 crc kubenswrapper[4778]: E0318 09:03:32.278144 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.278671 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.280240 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.281541 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.283918 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.283965 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.283980 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.284001 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.284016 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:32Z","lastTransitionTime":"2026-03-18T09:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.284101 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.285622 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.287748 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.288668 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.289723 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.290350 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.291330 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.291907 4778 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.292532 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.293714 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 18 09:03:32 crc kubenswrapper[4778]: E0318 09:03:32.303000 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.309295 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.309343 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.309361 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.309387 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.309405 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:32Z","lastTransitionTime":"2026-03-18T09:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:32 crc kubenswrapper[4778]: E0318 09:03:32.326266 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:32 crc kubenswrapper[4778]: E0318 09:03:32.326642 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.328533 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.328588 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.328601 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.328624 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.328638 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:32Z","lastTransitionTime":"2026-03-18T09:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.431944 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.432082 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.432102 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.432129 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.432176 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:32Z","lastTransitionTime":"2026-03-18T09:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.534665 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.534720 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.534744 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.534768 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.534787 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:32Z","lastTransitionTime":"2026-03-18T09:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.639732 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.639803 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.639822 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.639853 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.639873 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:32Z","lastTransitionTime":"2026-03-18T09:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.743127 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.743247 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.743270 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.743486 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.743515 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:32Z","lastTransitionTime":"2026-03-18T09:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.847378 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.847440 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.847450 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.847469 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.847482 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:32Z","lastTransitionTime":"2026-03-18T09:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.952026 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.952101 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.952129 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.952164 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:32 crc kubenswrapper[4778]: I0318 09:03:32.952191 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:32Z","lastTransitionTime":"2026-03-18T09:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.055401 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.055469 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.055487 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.055510 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.055529 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:33Z","lastTransitionTime":"2026-03-18T09:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.159530 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.159600 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.159620 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.159646 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.159665 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:33Z","lastTransitionTime":"2026-03-18T09:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.263865 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.263934 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.263953 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.263985 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.264004 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:33Z","lastTransitionTime":"2026-03-18T09:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.368989 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.369071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.369096 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.369133 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.369158 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:33Z","lastTransitionTime":"2026-03-18T09:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.473840 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.473931 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.473951 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.473980 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.474002 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:33Z","lastTransitionTime":"2026-03-18T09:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.577723 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.577794 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.577813 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.577843 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.577863 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:33Z","lastTransitionTime":"2026-03-18T09:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.681442 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.681524 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.681542 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.681569 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.681587 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:33Z","lastTransitionTime":"2026-03-18T09:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.779904 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.780122 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 09:03:37.780092048 +0000 UTC m=+84.354836898 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.785409 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.785486 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.785504 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.785529 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.785547 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:33Z","lastTransitionTime":"2026-03-18T09:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.881030 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.881098 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.881125 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.881150 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.881161 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.881287 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:37.881257648 +0000 UTC m=+84.456002508 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.881316 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.881362 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.881395 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.881417 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.881372 4778 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:37.8813558 +0000 UTC m=+84.456100660 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.881452 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.881507 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:37.881474683 +0000 UTC m=+84.456219533 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.881529 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.881557 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:33 crc kubenswrapper[4778]: E0318 09:03:33.881653 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:37.881625307 +0000 UTC m=+84.456370177 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.888375 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.888413 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.888426 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.888443 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.888456 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:33Z","lastTransitionTime":"2026-03-18T09:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.991169 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.991263 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.991280 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.991302 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:33 crc kubenswrapper[4778]: I0318 09:03:33.991319 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:33Z","lastTransitionTime":"2026-03-18T09:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.094075 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.094430 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.094757 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.094864 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.094961 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:34Z","lastTransitionTime":"2026-03-18T09:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.187102 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.187251 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:34 crc kubenswrapper[4778]: E0318 09:03:34.187436 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.187572 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:34 crc kubenswrapper[4778]: E0318 09:03:34.187868 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:34 crc kubenswrapper[4778]: E0318 09:03:34.188082 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.197589 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.197894 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.198016 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.198089 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.198150 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:34Z","lastTransitionTime":"2026-03-18T09:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.204908 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.221399 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.237895 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.254302 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.269397 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.294530 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.300848 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.300910 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.300928 4778 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.300951 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.300968 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:34Z","lastTransitionTime":"2026-03-18T09:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.403762 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.403812 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.403828 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.403855 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.403875 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:34Z","lastTransitionTime":"2026-03-18T09:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.506483 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.506550 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.506573 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.506600 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.506621 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:34Z","lastTransitionTime":"2026-03-18T09:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.609129 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.609193 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.609257 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.609285 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.609307 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:34Z","lastTransitionTime":"2026-03-18T09:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.711637 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.711690 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.711708 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.711730 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.711747 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:34Z","lastTransitionTime":"2026-03-18T09:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.814454 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.814509 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.814532 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.814560 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.814583 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:34Z","lastTransitionTime":"2026-03-18T09:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.918114 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.918189 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.918281 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.918353 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:34 crc kubenswrapper[4778]: I0318 09:03:34.918384 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:34Z","lastTransitionTime":"2026-03-18T09:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.021324 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.021455 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.021480 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.021510 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.021532 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:35Z","lastTransitionTime":"2026-03-18T09:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.124696 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.124750 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.124762 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.124783 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.124796 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:35Z","lastTransitionTime":"2026-03-18T09:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.208967 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.227169 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.227271 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.227290 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.227314 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.227335 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:35Z","lastTransitionTime":"2026-03-18T09:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.330917 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.330971 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.330989 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.331011 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.331033 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:35Z","lastTransitionTime":"2026-03-18T09:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.434866 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.434955 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.434980 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.435014 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.435038 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:35Z","lastTransitionTime":"2026-03-18T09:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.538299 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.538429 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.538454 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.538489 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.538516 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:35Z","lastTransitionTime":"2026-03-18T09:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.640695 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.640741 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.640753 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.640770 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.640783 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:35Z","lastTransitionTime":"2026-03-18T09:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.743129 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.743238 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.743251 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.743270 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.743282 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:35Z","lastTransitionTime":"2026-03-18T09:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.846474 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.846543 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.846570 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.846601 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.846623 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:35Z","lastTransitionTime":"2026-03-18T09:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.950251 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.950328 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.950351 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.950383 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:35 crc kubenswrapper[4778]: I0318 09:03:35.950406 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:35Z","lastTransitionTime":"2026-03-18T09:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.054511 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.055015 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.055238 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.055474 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.055713 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:36Z","lastTransitionTime":"2026-03-18T09:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.158828 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.158873 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.158882 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.158925 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.158939 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:36Z","lastTransitionTime":"2026-03-18T09:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.186507 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.186597 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:36 crc kubenswrapper[4778]: E0318 09:03:36.186684 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.186825 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:36 crc kubenswrapper[4778]: E0318 09:03:36.186984 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:36 crc kubenswrapper[4778]: E0318 09:03:36.187290 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.261655 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.261709 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.261724 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.261742 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.261755 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:36Z","lastTransitionTime":"2026-03-18T09:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.364712 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.364782 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.364796 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.364837 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.364849 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:36Z","lastTransitionTime":"2026-03-18T09:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.467344 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.467396 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.467404 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.467418 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.467428 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:36Z","lastTransitionTime":"2026-03-18T09:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.570080 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.570186 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.570237 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.570264 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.570283 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:36Z","lastTransitionTime":"2026-03-18T09:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.673269 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.673333 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.673353 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.673380 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.673397 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:36Z","lastTransitionTime":"2026-03-18T09:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.776339 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.776398 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.776416 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.776440 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.776457 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:36Z","lastTransitionTime":"2026-03-18T09:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.881414 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.881723 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.881817 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.881916 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.882003 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:36Z","lastTransitionTime":"2026-03-18T09:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.915327 4778 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.984709 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.984741 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.984753 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.984768 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:36 crc kubenswrapper[4778]: I0318 09:03:36.984779 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:36Z","lastTransitionTime":"2026-03-18T09:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.087633 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.087701 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.087728 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.087775 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.087790 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:37Z","lastTransitionTime":"2026-03-18T09:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.190775 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.190846 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.190867 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.190899 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.190918 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:37Z","lastTransitionTime":"2026-03-18T09:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.201653 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.202666 4778 scope.go:117] "RemoveContainer" containerID="721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e" Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.203017 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.294900 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.294989 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.295011 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.295042 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.295067 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:37Z","lastTransitionTime":"2026-03-18T09:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.397661 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.397732 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.397754 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.397783 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.397806 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:37Z","lastTransitionTime":"2026-03-18T09:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.500239 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.500311 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.500329 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.500355 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.500372 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:37Z","lastTransitionTime":"2026-03-18T09:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.603372 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.603450 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.603477 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.603514 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.603539 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:37Z","lastTransitionTime":"2026-03-18T09:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.604073 4778 scope.go:117] "RemoveContainer" containerID="721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e" Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.604475 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.706770 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.706850 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.706871 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.706902 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.706924 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:37Z","lastTransitionTime":"2026-03-18T09:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.809911 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.810357 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.810531 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.810687 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.810831 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:37Z","lastTransitionTime":"2026-03-18T09:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.815599 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.815900 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 09:03:45.81586917 +0000 UTC m=+92.390614040 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.914774 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.914880 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.914898 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.914922 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.914940 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:37Z","lastTransitionTime":"2026-03-18T09:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.916361 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.916632 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.916542 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.916877 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.916986 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:45.916954438 +0000 UTC m=+92.491699318 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.916859 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.917136 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.917188 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.917322 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.917379 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:45.917350868 +0000 UTC m=+92.492095738 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.917467 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:45.917453421 +0000 UTC m=+92.492198291 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:03:37 crc kubenswrapper[4778]: I0318 09:03:37.917118 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.917579 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.917601 4778 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.917618 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:37 crc kubenswrapper[4778]: E0318 09:03:37.917690 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 09:03:45.917667817 +0000 UTC m=+92.492412857 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.018574 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.018679 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.018702 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.018725 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:38 crc 
kubenswrapper[4778]: I0318 09:03:38.018742 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:38Z","lastTransitionTime":"2026-03-18T09:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.121754 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.121805 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.121817 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.121835 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.121847 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:38Z","lastTransitionTime":"2026-03-18T09:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.187041 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:38 crc kubenswrapper[4778]: E0318 09:03:38.187259 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.187502 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.187577 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:38 crc kubenswrapper[4778]: E0318 09:03:38.187711 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:38 crc kubenswrapper[4778]: E0318 09:03:38.187893 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.224897 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.224959 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.224983 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.225008 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.225027 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:38Z","lastTransitionTime":"2026-03-18T09:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.328367 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.328689 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.328878 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.329046 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.329222 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:38Z","lastTransitionTime":"2026-03-18T09:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.432160 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.432304 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.432323 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.432346 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.432366 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:38Z","lastTransitionTime":"2026-03-18T09:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.535662 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.535720 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.535740 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.535763 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.535778 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:38Z","lastTransitionTime":"2026-03-18T09:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.639174 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.639296 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.639324 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.639350 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.639371 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:38Z","lastTransitionTime":"2026-03-18T09:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.742876 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.742954 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.742976 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.743006 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.743029 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:38Z","lastTransitionTime":"2026-03-18T09:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.845448 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.845491 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.845500 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.845514 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.845527 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:38Z","lastTransitionTime":"2026-03-18T09:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.948613 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.948695 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.948723 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.948757 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:38 crc kubenswrapper[4778]: I0318 09:03:38.948780 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:38Z","lastTransitionTime":"2026-03-18T09:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.051604 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.051687 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.051705 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.051738 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.051761 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:39Z","lastTransitionTime":"2026-03-18T09:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.155621 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.155681 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.155692 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.155712 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.155725 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:39Z","lastTransitionTime":"2026-03-18T09:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.258905 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.258968 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.258978 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.258995 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.259008 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:39Z","lastTransitionTime":"2026-03-18T09:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.361643 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.361717 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.361734 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.361759 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.361779 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:39Z","lastTransitionTime":"2026-03-18T09:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.464250 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.464308 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.464326 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.464352 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.464370 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:39Z","lastTransitionTime":"2026-03-18T09:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.567597 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.567644 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.567654 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.567668 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.567678 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:39Z","lastTransitionTime":"2026-03-18T09:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.669849 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.670118 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.670249 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.670352 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.670436 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:39Z","lastTransitionTime":"2026-03-18T09:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.774026 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.774089 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.774107 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.774157 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.774174 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:39Z","lastTransitionTime":"2026-03-18T09:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.877314 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.877607 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.877683 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.877750 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.877819 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:39Z","lastTransitionTime":"2026-03-18T09:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.980698 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.981166 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.981410 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.981617 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:39 crc kubenswrapper[4778]: I0318 09:03:39.981825 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:39Z","lastTransitionTime":"2026-03-18T09:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.085467 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.086286 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.086316 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.086340 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.086357 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:40Z","lastTransitionTime":"2026-03-18T09:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.186497 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:40 crc kubenswrapper[4778]: E0318 09:03:40.186738 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.186505 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.186858 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:40 crc kubenswrapper[4778]: E0318 09:03:40.186909 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:40 crc kubenswrapper[4778]: E0318 09:03:40.188095 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.189142 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.189270 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.189296 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.189375 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.189451 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:40Z","lastTransitionTime":"2026-03-18T09:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.292291 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.292347 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.292364 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.292387 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.292404 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:40Z","lastTransitionTime":"2026-03-18T09:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.395355 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.395409 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.395429 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.395453 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.395472 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:40Z","lastTransitionTime":"2026-03-18T09:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.497976 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.498025 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.498036 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.498058 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.498071 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:40Z","lastTransitionTime":"2026-03-18T09:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.601559 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.601619 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.601635 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.601662 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.601677 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:40Z","lastTransitionTime":"2026-03-18T09:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.704826 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.704885 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.704897 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.704919 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.704934 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:40Z","lastTransitionTime":"2026-03-18T09:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.808155 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.808277 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.808296 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.808324 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.808343 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:40Z","lastTransitionTime":"2026-03-18T09:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.911409 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.911471 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.911483 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.911502 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:40 crc kubenswrapper[4778]: I0318 09:03:40.911515 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:40Z","lastTransitionTime":"2026-03-18T09:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.014640 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.014689 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.014699 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.014718 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.014728 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:41Z","lastTransitionTime":"2026-03-18T09:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.122829 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.122905 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.122923 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.122951 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.122968 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:41Z","lastTransitionTime":"2026-03-18T09:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.226244 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.226312 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.226325 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.226350 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.226364 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:41Z","lastTransitionTime":"2026-03-18T09:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.329872 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.329955 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.329973 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.330000 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.330033 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:41Z","lastTransitionTime":"2026-03-18T09:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.432988 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.433054 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.433071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.433096 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.433118 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:41Z","lastTransitionTime":"2026-03-18T09:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.535825 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.535880 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.535896 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.535920 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.535937 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:41Z","lastTransitionTime":"2026-03-18T09:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.639173 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.639283 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.639302 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.639327 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.639347 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:41Z","lastTransitionTime":"2026-03-18T09:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.742342 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.742410 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.742428 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.742454 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.742473 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:41Z","lastTransitionTime":"2026-03-18T09:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.845121 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.845183 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.845227 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.845252 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.845269 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:41Z","lastTransitionTime":"2026-03-18T09:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.948098 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.948145 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.948162 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.948185 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:41 crc kubenswrapper[4778]: I0318 09:03:41.948236 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:41Z","lastTransitionTime":"2026-03-18T09:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.051270 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.051343 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.051365 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.051390 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.051409 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.154327 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.154371 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.154379 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.154393 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.154404 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.186566 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.186645 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:42 crc kubenswrapper[4778]: E0318 09:03:42.186840 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.186860 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:42 crc kubenswrapper[4778]: E0318 09:03:42.187012 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:42 crc kubenswrapper[4778]: E0318 09:03:42.187178 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.256927 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.257020 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.257045 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.257071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.257091 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.360701 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.360796 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.360816 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.360843 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.360862 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.362678 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.362739 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.362759 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.362787 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.362805 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: E0318 09:03:42.381125 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.386560 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.386617 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.386635 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.386661 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.386679 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: E0318 09:03:42.403359 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.407866 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.407899 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.407907 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.407921 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.407931 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: E0318 09:03:42.422731 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.427796 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.427862 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.427881 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.427912 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.427931 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: E0318 09:03:42.444103 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.448937 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.449082 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.449167 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.449297 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.449403 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: E0318 09:03:42.461166 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:42 crc kubenswrapper[4778]: E0318 09:03:42.461906 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.464114 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.464179 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.464229 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.464257 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.464275 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.567044 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.567075 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.567082 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.567095 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.567106 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.674101 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.674144 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.674154 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.674169 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.674185 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.777701 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.777779 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.777804 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.777837 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.777863 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.880704 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.880760 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.880816 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.880839 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.880855 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.983534 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.983575 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.983587 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.983605 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:42 crc kubenswrapper[4778]: I0318 09:03:42.983617 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:42Z","lastTransitionTime":"2026-03-18T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.086972 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.087047 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.087066 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.087092 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.087110 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:43Z","lastTransitionTime":"2026-03-18T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:43 crc kubenswrapper[4778]: E0318 09:03:43.188679 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:43 crc kubenswrapper[4778]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 09:03:43 crc kubenswrapper[4778]: set -o allexport Mar 18 09:03:43 crc kubenswrapper[4778]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 09:03:43 crc kubenswrapper[4778]: source /etc/kubernetes/apiserver-url.env Mar 18 09:03:43 crc kubenswrapper[4778]: else Mar 18 09:03:43 crc kubenswrapper[4778]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 09:03:43 crc kubenswrapper[4778]: exit 1 Mar 18 09:03:43 crc kubenswrapper[4778]: fi Mar 18 09:03:43 crc kubenswrapper[4778]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 09:03:43 crc kubenswrapper[4778]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:43 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:43 crc kubenswrapper[4778]: E0318 09:03:43.189362 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:43 crc kubenswrapper[4778]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:03:43 crc kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then Mar 18 09:03:43 crc kubenswrapper[4778]: set -o allexport Mar 18 09:03:43 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:03:43 crc kubenswrapper[4778]: set +o allexport Mar 18 09:03:43 crc 
kubenswrapper[4778]: fi Mar 18 09:03:43 crc kubenswrapper[4778]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 18 09:03:43 crc kubenswrapper[4778]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 09:03:43 crc kubenswrapper[4778]: ho_enable="--enable-hybrid-overlay" Mar 18 09:03:43 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 09:03:43 crc kubenswrapper[4778]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 09:03:43 crc kubenswrapper[4778]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 09:03:43 crc kubenswrapper[4778]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 09:03:43 crc kubenswrapper[4778]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 09:03:43 crc kubenswrapper[4778]: --webhook-host=127.0.0.1 \ Mar 18 09:03:43 crc kubenswrapper[4778]: --webhook-port=9743 \ Mar 18 09:03:43 crc kubenswrapper[4778]: ${ho_enable} \ Mar 18 09:03:43 crc kubenswrapper[4778]: --enable-interconnect \ Mar 18 09:03:43 crc kubenswrapper[4778]: --disable-approver \ Mar 18 09:03:43 crc kubenswrapper[4778]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 09:03:43 crc kubenswrapper[4778]: --wait-for-kubernetes-api=200s \ Mar 18 09:03:43 crc kubenswrapper[4778]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 09:03:43 crc kubenswrapper[4778]: --loglevel="${LOGLEVEL}" Mar 18 09:03:43 crc kubenswrapper[4778]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:43 crc 
kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.189516 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.189536 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.189547 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.189564 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.189576 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:43Z","lastTransitionTime":"2026-03-18T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:43 crc kubenswrapper[4778]: E0318 09:03:43.190360 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 09:03:43 crc kubenswrapper[4778]: E0318 09:03:43.191112 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:43 crc kubenswrapper[4778]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:03:43 crc kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then Mar 18 09:03:43 crc kubenswrapper[4778]: set -o allexport Mar 18 09:03:43 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:03:43 crc kubenswrapper[4778]: set +o allexport Mar 18 09:03:43 crc kubenswrapper[4778]: fi Mar 18 09:03:43 crc kubenswrapper[4778]: Mar 18 09:03:43 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 09:03:43 crc kubenswrapper[4778]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 09:03:43 crc kubenswrapper[4778]: --disable-webhook \ Mar 18 09:03:43 crc kubenswrapper[4778]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 09:03:43 crc kubenswrapper[4778]: --loglevel="${LOGLEVEL}" Mar 18 09:03:43 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:43 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:43 crc kubenswrapper[4778]: E0318 09:03:43.192341 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.291676 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.291720 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.291731 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.291750 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.291764 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:43Z","lastTransitionTime":"2026-03-18T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.394788 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.394833 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.394843 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.394858 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.394870 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:43Z","lastTransitionTime":"2026-03-18T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.497554 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.497645 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.497666 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.497692 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.497711 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:43Z","lastTransitionTime":"2026-03-18T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.599765 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.599844 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.599862 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.599903 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.599922 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:43Z","lastTransitionTime":"2026-03-18T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.702692 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.702758 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.702777 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.702802 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.702821 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:43Z","lastTransitionTime":"2026-03-18T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.806165 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.806260 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.806280 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.806313 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.806332 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:43Z","lastTransitionTime":"2026-03-18T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.909424 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.909513 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.909536 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.909566 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:43 crc kubenswrapper[4778]: I0318 09:03:43.909587 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:43Z","lastTransitionTime":"2026-03-18T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.012938 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.013404 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.013429 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.013459 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.013483 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:44Z","lastTransitionTime":"2026-03-18T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.115587 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.115645 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.115669 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.115697 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.115720 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:44Z","lastTransitionTime":"2026-03-18T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.186609 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.186656 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:44 crc kubenswrapper[4778]: E0318 09:03:44.186777 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.186799 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:44 crc kubenswrapper[4778]: E0318 09:03:44.186986 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:44 crc kubenswrapper[4778]: E0318 09:03:44.187114 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:44 crc kubenswrapper[4778]: E0318 09:03:44.189303 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]V
olumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:03:44 crc kubenswrapper[4778]: E0318 09:03:44.190452 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.206821 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d8
3d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.218302 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.218642 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.218739 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.218844 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.218927 4778 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:44Z","lastTransitionTime":"2026-03-18T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.219562 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.231844 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.246944 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.262230 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.286473 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.298124 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.309992 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.322108 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.322143 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.322157 4778 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.322221 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.322235 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:44Z","lastTransitionTime":"2026-03-18T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.425232 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.425279 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.425288 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.425305 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.425323 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:44Z","lastTransitionTime":"2026-03-18T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.528024 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.528066 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.528074 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.528088 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.528097 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:44Z","lastTransitionTime":"2026-03-18T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.630391 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.630432 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.630442 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.630457 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.630467 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:44Z","lastTransitionTime":"2026-03-18T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.733564 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.733629 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.733646 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.733670 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.733687 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:44Z","lastTransitionTime":"2026-03-18T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.836022 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.836073 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.836091 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.836114 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.836131 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:44Z","lastTransitionTime":"2026-03-18T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.938757 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.939099 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.939165 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.939285 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:44 crc kubenswrapper[4778]: I0318 09:03:44.939376 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:44Z","lastTransitionTime":"2026-03-18T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.042339 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.042402 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.042420 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.042444 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.042461 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:45Z","lastTransitionTime":"2026-03-18T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.145383 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.145446 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.145463 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.145485 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.145509 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:45Z","lastTransitionTime":"2026-03-18T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.248715 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.248764 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.248774 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.248792 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.248804 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:45Z","lastTransitionTime":"2026-03-18T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.351526 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.351594 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.351613 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.351638 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.351655 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:45Z","lastTransitionTime":"2026-03-18T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.455359 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.455430 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.455455 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.455478 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.455495 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:45Z","lastTransitionTime":"2026-03-18T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.557711 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.557756 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.557767 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.557785 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.557796 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:45Z","lastTransitionTime":"2026-03-18T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.659878 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.659933 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.659950 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.659973 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.659991 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:45Z","lastTransitionTime":"2026-03-18T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.762293 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.762339 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.762347 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.762361 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.762370 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:45Z","lastTransitionTime":"2026-03-18T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.864913 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.864946 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.864954 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.864967 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.864976 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:45Z","lastTransitionTime":"2026-03-18T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.896238 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.896477 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:04:01.896439463 +0000 UTC m=+108.471184303 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.968324 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.968375 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.968388 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.968406 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.968418 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:45Z","lastTransitionTime":"2026-03-18T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.996957 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.997011 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.997044 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 09:03:45 crc kubenswrapper[4778]: I0318 09:03:45.997069 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.997213 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.997223 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.997236 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.997241 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.997314 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:01.997291474 +0000 UTC m=+108.572036394 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.997251 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.997249 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.997420 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:01.997391237 +0000 UTC m=+108.572136087 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.997314 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.997450 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.997512 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:01.99749567 +0000 UTC m=+108.572240520 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 09:03:45 crc kubenswrapper[4778]: E0318 09:03:45.997544 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:01.9975219 +0000 UTC m=+108.572266780 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.071701 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.071813 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.071843 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.071883 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.071932 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:46Z","lastTransitionTime":"2026-03-18T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.174575 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.174656 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.174668 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.174695 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.174710 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:46Z","lastTransitionTime":"2026-03-18T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.186869 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.186986 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.186924 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 09:03:46 crc kubenswrapper[4778]: E0318 09:03:46.187052 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 09:03:46 crc kubenswrapper[4778]: E0318 09:03:46.187148 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 09:03:46 crc kubenswrapper[4778]: E0318 09:03:46.187467 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.277432 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.277476 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.277494 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.277516 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.277535 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:46Z","lastTransitionTime":"2026-03-18T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.380675 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.380751 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.380774 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.380803 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.380838 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:46Z","lastTransitionTime":"2026-03-18T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.483136 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.483242 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.483267 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.483296 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.483318 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:46Z","lastTransitionTime":"2026-03-18T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.586348 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.586459 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.586478 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.586510 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.586532 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:46Z","lastTransitionTime":"2026-03-18T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.689921 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.689988 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.690007 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.690036 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.690051 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:46Z","lastTransitionTime":"2026-03-18T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.792705 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.792741 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.792752 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.792769 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.792781 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:46Z","lastTransitionTime":"2026-03-18T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.895586 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.895625 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.895637 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.895653 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.895667 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:46Z","lastTransitionTime":"2026-03-18T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.998567 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.998621 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.998639 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.998653 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:46 crc kubenswrapper[4778]: I0318 09:03:46.998662 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:46Z","lastTransitionTime":"2026-03-18T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.101480 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.101587 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.101606 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.101629 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.101643 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:47Z","lastTransitionTime":"2026-03-18T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.205337 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.205404 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.205423 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.205453 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.205473 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:47Z","lastTransitionTime":"2026-03-18T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.307633 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.307672 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.307727 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.307746 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.307760 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:47Z","lastTransitionTime":"2026-03-18T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.351577 4778 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.410274 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.410607 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.410699 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.410846 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.410942 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:47Z","lastTransitionTime":"2026-03-18T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.513634 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.513941 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.514102 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.514296 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.514443 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:47Z","lastTransitionTime":"2026-03-18T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.617440 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.617506 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.617522 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.617545 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.617563 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:47Z","lastTransitionTime":"2026-03-18T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.720918 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.721478 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.721681 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.721865 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.722030 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:47Z","lastTransitionTime":"2026-03-18T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.825558 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.825603 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.825616 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.825641 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.825653 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:47Z","lastTransitionTime":"2026-03-18T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.928622 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.928699 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.928718 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.928747 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:47 crc kubenswrapper[4778]: I0318 09:03:47.928766 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:47Z","lastTransitionTime":"2026-03-18T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.031676 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.032045 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.032274 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.032460 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.032609 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:48Z","lastTransitionTime":"2026-03-18T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.135821 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.135874 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.135884 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.135904 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.135915 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:48Z","lastTransitionTime":"2026-03-18T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.186497 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.186570 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.186700 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 09:03:48 crc kubenswrapper[4778]: E0318 09:03:48.186688 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 09:03:48 crc kubenswrapper[4778]: E0318 09:03:48.187007 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 09:03:48 crc kubenswrapper[4778]: E0318 09:03:48.187077 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.240089 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.240168 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.240187 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.240248 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.240268 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:48Z","lastTransitionTime":"2026-03-18T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.344291 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.344377 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.344397 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.344433 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.344454 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:48Z","lastTransitionTime":"2026-03-18T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.448092 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.448236 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.448257 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.448289 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.448308 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:48Z","lastTransitionTime":"2026-03-18T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.551813 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.551863 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.551879 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.551904 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.551920 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:48Z","lastTransitionTime":"2026-03-18T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.654531 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.654590 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.654607 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.654631 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.654648 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:48Z","lastTransitionTime":"2026-03-18T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.757189 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.757275 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.757306 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.757330 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.757349 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:48Z","lastTransitionTime":"2026-03-18T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.861269 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.861348 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.861365 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.861391 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.861410 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:48Z","lastTransitionTime":"2026-03-18T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.963708 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.963750 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.963761 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.963781 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:48 crc kubenswrapper[4778]: I0318 09:03:48.963792 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:48Z","lastTransitionTime":"2026-03-18T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.067328 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.067424 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.067449 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.067479 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.067497 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:49Z","lastTransitionTime":"2026-03-18T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.170064 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.170147 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.170162 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.170181 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.170209 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:49Z","lastTransitionTime":"2026-03-18T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.273146 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.273249 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.273270 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.273297 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.273318 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:49Z","lastTransitionTime":"2026-03-18T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.375972 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.376059 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.376077 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.376112 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.376132 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:49Z","lastTransitionTime":"2026-03-18T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.478904 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.478982 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.479004 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.479028 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.479046 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:49Z","lastTransitionTime":"2026-03-18T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.582814 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.582878 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.582901 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.582925 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.582943 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:49Z","lastTransitionTime":"2026-03-18T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.685744 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.685795 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.685807 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.685824 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.685837 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:49Z","lastTransitionTime":"2026-03-18T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.788870 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.788941 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.788957 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.788987 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.789012 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:49Z","lastTransitionTime":"2026-03-18T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.891327 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.891403 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.891420 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.891445 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.891463 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:49Z","lastTransitionTime":"2026-03-18T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.994145 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.994185 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.994217 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.994237 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:49 crc kubenswrapper[4778]: I0318 09:03:49.994251 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:49Z","lastTransitionTime":"2026-03-18T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.097401 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.097442 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.097454 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.097472 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.097485 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:50Z","lastTransitionTime":"2026-03-18T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.186779 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.186895 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:50 crc kubenswrapper[4778]: E0318 09:03:50.186984 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.186998 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:50 crc kubenswrapper[4778]: E0318 09:03:50.187136 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:50 crc kubenswrapper[4778]: E0318 09:03:50.187269 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.200853 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.200923 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.200943 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.200966 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.200984 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:50Z","lastTransitionTime":"2026-03-18T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.303043 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.303086 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.303105 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.303128 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.303145 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:50Z","lastTransitionTime":"2026-03-18T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.405651 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.405709 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.405723 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.405745 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.405760 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:50Z","lastTransitionTime":"2026-03-18T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.510129 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.510391 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.510723 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.511026 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.511396 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:50Z","lastTransitionTime":"2026-03-18T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.614187 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.614571 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.614715 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.614847 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.614971 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:50Z","lastTransitionTime":"2026-03-18T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.717855 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.718103 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.718353 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.718566 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.718767 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:50Z","lastTransitionTime":"2026-03-18T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.821760 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.821847 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.821870 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.821902 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.821924 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:50Z","lastTransitionTime":"2026-03-18T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.928569 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.928629 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.928645 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.928669 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:50 crc kubenswrapper[4778]: I0318 09:03:50.928711 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:50Z","lastTransitionTime":"2026-03-18T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.032220 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.032263 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.032276 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.032295 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.032308 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:51Z","lastTransitionTime":"2026-03-18T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.135689 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.135747 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.135773 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.135800 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.135820 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:51Z","lastTransitionTime":"2026-03-18T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.239255 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.239320 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.239342 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.239371 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.239392 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:51Z","lastTransitionTime":"2026-03-18T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.342536 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.342953 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.343159 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.343431 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.343604 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:51Z","lastTransitionTime":"2026-03-18T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.447012 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.447074 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.447091 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.447119 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.447136 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:51Z","lastTransitionTime":"2026-03-18T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.550743 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.551046 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.551107 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.551167 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.551266 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:51Z","lastTransitionTime":"2026-03-18T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.654581 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.654638 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.654655 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.654676 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.654694 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:51Z","lastTransitionTime":"2026-03-18T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.758189 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.758266 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.758279 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.758303 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.758317 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:51Z","lastTransitionTime":"2026-03-18T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.861681 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.862115 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.862316 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.862500 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.862680 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:51Z","lastTransitionTime":"2026-03-18T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.966329 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.966402 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.966421 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.966446 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:51 crc kubenswrapper[4778]: I0318 09:03:51.966463 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:51Z","lastTransitionTime":"2026-03-18T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.069491 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.069582 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.069595 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.069612 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.069628 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.173019 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.173064 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.173079 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.173102 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.173115 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.186619 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.186700 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.186862 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:52 crc kubenswrapper[4778]: E0318 09:03:52.187016 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:52 crc kubenswrapper[4778]: E0318 09:03:52.187171 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:52 crc kubenswrapper[4778]: E0318 09:03:52.187289 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.187474 4778 scope.go:117] "RemoveContainer" containerID="721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e" Mar 18 09:03:52 crc kubenswrapper[4778]: E0318 09:03:52.187796 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.276461 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.276518 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.276541 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.276569 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.276592 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.379755 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.379802 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.379812 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.379828 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.379837 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.483005 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.483092 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.483111 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.483668 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.483723 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.587634 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.587711 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.587734 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.587763 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.587785 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.625279 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.625320 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.625332 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.625349 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.625362 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:52 crc kubenswrapper[4778]: E0318 09:03:52.640707 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.645554 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.645594 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.645605 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.645622 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.645634 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:52 crc kubenswrapper[4778]: E0318 09:03:52.659774 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.663954 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.663983 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.663991 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.664005 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.664013 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:52 crc kubenswrapper[4778]: E0318 09:03:52.674875 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.679479 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.679511 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.679520 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.679534 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.679545 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 18 09:03:52 crc kubenswrapper[4778]: E0318 09:03:52.693077 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.697499 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.697564 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.697587 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.697617 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.697639 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 18 09:03:52 crc kubenswrapper[4778]: E0318 09:03:52.711740 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 09:03:52 crc kubenswrapper[4778]: E0318 09:03:52.711980 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.714454 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.714503 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.714521 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.714543 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.714560 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.817588 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.817658 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.817682 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.817710 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.817732 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.920662 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.920690 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.920700 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.920714 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:52 crc kubenswrapper[4778]: I0318 09:03:52.920725 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:52Z","lastTransitionTime":"2026-03-18T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.023856 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.023933 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.023956 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.023988 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.024013 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:53Z","lastTransitionTime":"2026-03-18T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.127727 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.127879 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.127905 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.127935 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.127958 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:53Z","lastTransitionTime":"2026-03-18T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.231327 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.231396 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.231419 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.231447 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.231469 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:53Z","lastTransitionTime":"2026-03-18T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.334601 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.334736 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.334767 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.334809 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.334849 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:53Z","lastTransitionTime":"2026-03-18T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.438366 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.438436 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.438462 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.438493 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.438516 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:53Z","lastTransitionTime":"2026-03-18T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.542239 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.542318 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.542339 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.542365 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.542386 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:53Z","lastTransitionTime":"2026-03-18T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.644652 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.644689 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.644697 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.644710 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.644738 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:53Z","lastTransitionTime":"2026-03-18T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.747450 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.747714 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.747927 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.748099 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.748337 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:53Z","lastTransitionTime":"2026-03-18T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.853030 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.853114 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.853137 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.853169 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.853233 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:53Z","lastTransitionTime":"2026-03-18T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.956618 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.956690 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.956714 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.956745 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:53 crc kubenswrapper[4778]: I0318 09:03:53.956768 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:53Z","lastTransitionTime":"2026-03-18T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.060410 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.060495 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.060521 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.060552 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.060570 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:54Z","lastTransitionTime":"2026-03-18T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.164060 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.164141 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.164167 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.164220 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.164306 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:54Z","lastTransitionTime":"2026-03-18T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.187141 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.187237 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:54 crc kubenswrapper[4778]: E0318 09:03:54.187339 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:54 crc kubenswrapper[4778]: E0318 09:03:54.187440 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.187034 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:54 crc kubenswrapper[4778]: E0318 09:03:54.187918 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.205227 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.218268 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.230962 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.245384 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.261492 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.267389 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.267444 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.267463 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.267484 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.267499 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:54Z","lastTransitionTime":"2026-03-18T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.282530 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.302983 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.313904 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.370940 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.371004 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.371033 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.371063 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.371086 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:54Z","lastTransitionTime":"2026-03-18T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.474373 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.474440 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.474462 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.474527 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.474550 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:54Z","lastTransitionTime":"2026-03-18T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.577267 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.577333 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.577348 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.577368 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.577383 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:54Z","lastTransitionTime":"2026-03-18T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.679843 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.679899 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.679910 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.679930 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.679943 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:54Z","lastTransitionTime":"2026-03-18T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.782936 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.783372 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.783502 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.783657 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.783767 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:54Z","lastTransitionTime":"2026-03-18T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.886142 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.886185 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.886215 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.886235 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.886247 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:54Z","lastTransitionTime":"2026-03-18T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.989350 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.989414 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.989432 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.989457 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:54 crc kubenswrapper[4778]: I0318 09:03:54.989476 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:54Z","lastTransitionTime":"2026-03-18T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.092633 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.092695 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.092714 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.092743 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.092761 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:55Z","lastTransitionTime":"2026-03-18T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:55 crc kubenswrapper[4778]: E0318 09:03:55.189607 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},Re
startPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:03:55 crc kubenswrapper[4778]: E0318 09:03:55.189712 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:55 crc kubenswrapper[4778]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 09:03:55 crc kubenswrapper[4778]: set -o allexport Mar 18 09:03:55 crc kubenswrapper[4778]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 09:03:55 crc kubenswrapper[4778]: source /etc/kubernetes/apiserver-url.env Mar 18 09:03:55 crc kubenswrapper[4778]: else Mar 18 09:03:55 crc kubenswrapper[4778]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 09:03:55 crc kubenswrapper[4778]: exit 1 Mar 18 09:03:55 crc kubenswrapper[4778]: fi Mar 18 09:03:55 crc kubenswrapper[4778]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 09:03:55 crc kubenswrapper[4778]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:55 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:55 crc kubenswrapper[4778]: E0318 09:03:55.190809 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 09:03:55 crc kubenswrapper[4778]: E0318 09:03:55.191009 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet 
been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.194713 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.194796 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.194816 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.194839 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.194857 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:55Z","lastTransitionTime":"2026-03-18T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.297720 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.297771 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.297783 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.297803 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.297818 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:55Z","lastTransitionTime":"2026-03-18T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.400821 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.400894 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.400905 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.400920 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.400930 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:55Z","lastTransitionTime":"2026-03-18T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.504119 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.504248 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.504276 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.504313 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.504337 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:55Z","lastTransitionTime":"2026-03-18T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.607351 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.607439 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.607458 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.607486 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.607507 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:55Z","lastTransitionTime":"2026-03-18T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.711501 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.711861 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.712059 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.712374 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.712590 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:55Z","lastTransitionTime":"2026-03-18T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.816246 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.816312 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.816327 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.816350 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.816365 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:55Z","lastTransitionTime":"2026-03-18T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.920023 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.920087 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.920103 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.920130 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:55 crc kubenswrapper[4778]: I0318 09:03:55.920148 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:55Z","lastTransitionTime":"2026-03-18T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.022937 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.022992 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.023010 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.023033 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.023050 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:56Z","lastTransitionTime":"2026-03-18T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.126650 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.126726 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.126746 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.126772 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.126790 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:56Z","lastTransitionTime":"2026-03-18T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.186987 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.187770 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.187830 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:56 crc kubenswrapper[4778]: E0318 09:03:56.187899 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:56 crc kubenswrapper[4778]: E0318 09:03:56.188383 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:56 crc kubenswrapper[4778]: E0318 09:03:56.188612 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:56 crc kubenswrapper[4778]: E0318 09:03:56.189764 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:56 crc kubenswrapper[4778]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:03:56 crc kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then Mar 18 09:03:56 crc kubenswrapper[4778]: set -o allexport Mar 18 09:03:56 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:03:56 crc kubenswrapper[4778]: set +o allexport Mar 18 09:03:56 crc kubenswrapper[4778]: fi Mar 18 09:03:56 crc kubenswrapper[4778]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 18 09:03:56 crc kubenswrapper[4778]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 09:03:56 crc kubenswrapper[4778]: ho_enable="--enable-hybrid-overlay" Mar 18 09:03:56 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 09:03:56 crc kubenswrapper[4778]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 09:03:56 crc kubenswrapper[4778]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 09:03:56 crc kubenswrapper[4778]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 09:03:56 crc kubenswrapper[4778]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 09:03:56 crc kubenswrapper[4778]: --webhook-host=127.0.0.1 \ Mar 18 09:03:56 crc kubenswrapper[4778]: --webhook-port=9743 \ Mar 18 09:03:56 crc kubenswrapper[4778]: ${ho_enable} \ Mar 18 09:03:56 crc kubenswrapper[4778]: --enable-interconnect \ Mar 18 09:03:56 crc 
kubenswrapper[4778]: --disable-approver \ Mar 18 09:03:56 crc kubenswrapper[4778]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 09:03:56 crc kubenswrapper[4778]: --wait-for-kubernetes-api=200s \ Mar 18 09:03:56 crc kubenswrapper[4778]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 09:03:56 crc kubenswrapper[4778]: --loglevel="${LOGLEVEL}" Mar 18 09:03:56 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions
:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:56 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:56 crc kubenswrapper[4778]: E0318 09:03:56.192505 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:56 crc kubenswrapper[4778]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:03:56 crc kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then Mar 18 09:03:56 crc kubenswrapper[4778]: set -o allexport Mar 18 09:03:56 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:03:56 crc kubenswrapper[4778]: set +o allexport Mar 18 09:03:56 crc kubenswrapper[4778]: fi Mar 18 09:03:56 crc kubenswrapper[4778]: Mar 18 09:03:56 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 09:03:56 crc kubenswrapper[4778]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 09:03:56 crc kubenswrapper[4778]: --disable-webhook \ Mar 18 09:03:56 crc kubenswrapper[4778]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 09:03:56 crc kubenswrapper[4778]: --loglevel="${LOGLEVEL}" Mar 18 09:03:56 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:56 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:56 crc kubenswrapper[4778]: E0318 09:03:56.194046 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.229640 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.229698 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.229715 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.229740 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.229757 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:56Z","lastTransitionTime":"2026-03-18T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.333633 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.333716 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.333733 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.333759 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.333778 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:56Z","lastTransitionTime":"2026-03-18T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.437264 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.437340 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.437365 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.437404 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.437426 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:56Z","lastTransitionTime":"2026-03-18T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.540482 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.540563 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.540587 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.540620 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.540645 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:56Z","lastTransitionTime":"2026-03-18T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.643855 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.643902 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.643919 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.643944 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.643966 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:56Z","lastTransitionTime":"2026-03-18T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.746855 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.746945 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.746964 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.746990 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.747009 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:56Z","lastTransitionTime":"2026-03-18T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.850833 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.850895 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.850912 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.850935 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.850956 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:56Z","lastTransitionTime":"2026-03-18T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.954289 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.954371 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.954389 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.954413 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:56 crc kubenswrapper[4778]: I0318 09:03:56.954432 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:56Z","lastTransitionTime":"2026-03-18T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.056680 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.056739 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.056756 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.056781 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.056798 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:57Z","lastTransitionTime":"2026-03-18T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.159567 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.159655 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.159904 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.159940 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.159964 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:57Z","lastTransitionTime":"2026-03-18T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.263451 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.263517 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.263534 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.263597 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.263615 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:57Z","lastTransitionTime":"2026-03-18T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.367146 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.367258 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.367277 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.367300 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.367319 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:57Z","lastTransitionTime":"2026-03-18T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.474628 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.474803 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.475537 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.475590 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.475612 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:57Z","lastTransitionTime":"2026-03-18T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.578828 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.578912 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.578946 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.578978 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.579000 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:57Z","lastTransitionTime":"2026-03-18T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.682175 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.682242 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.682251 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.682267 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.682281 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:57Z","lastTransitionTime":"2026-03-18T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.786612 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.786660 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.786669 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.786686 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.786698 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:57Z","lastTransitionTime":"2026-03-18T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.890774 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.890829 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.890837 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.890855 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.890866 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:57Z","lastTransitionTime":"2026-03-18T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.993607 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.993678 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.993704 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.993742 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:57 crc kubenswrapper[4778]: I0318 09:03:57.993771 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:57Z","lastTransitionTime":"2026-03-18T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.096800 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.096859 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.096873 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.096892 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.096905 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:58Z","lastTransitionTime":"2026-03-18T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.186768 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.186765 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.186765 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:03:58 crc kubenswrapper[4778]: E0318 09:03:58.187655 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:03:58 crc kubenswrapper[4778]: E0318 09:03:58.187801 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:03:58 crc kubenswrapper[4778]: E0318 09:03:58.188090 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.200405 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.200458 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.200471 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.200493 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.200508 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:58Z","lastTransitionTime":"2026-03-18T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.303915 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.304004 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.304023 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.304045 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.304062 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:58Z","lastTransitionTime":"2026-03-18T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.407014 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.407121 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.407148 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.407177 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.407232 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:58Z","lastTransitionTime":"2026-03-18T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.510946 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.511014 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.511034 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.511059 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.511079 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:58Z","lastTransitionTime":"2026-03-18T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.614400 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.614460 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.614478 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.614503 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.614522 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:58Z","lastTransitionTime":"2026-03-18T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.718184 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.718288 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.718306 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.718331 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.718348 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:58Z","lastTransitionTime":"2026-03-18T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.822054 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.822164 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.822187 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.822257 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.822322 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:58Z","lastTransitionTime":"2026-03-18T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.927063 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.927168 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.927232 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.927323 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:58 crc kubenswrapper[4778]: I0318 09:03:58.927386 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:58Z","lastTransitionTime":"2026-03-18T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.030671 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.030731 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.030750 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.030772 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.030791 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:59Z","lastTransitionTime":"2026-03-18T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.133983 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.134048 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.134065 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.134088 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.134108 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:59Z","lastTransitionTime":"2026-03-18T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.237124 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.237227 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.237255 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.237286 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.237311 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:59Z","lastTransitionTime":"2026-03-18T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.340823 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.340890 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.340910 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.340939 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.340959 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:59Z","lastTransitionTime":"2026-03-18T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.400473 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-dfnnp"] Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.401006 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-dfnnp" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.405753 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.406696 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.409049 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.425315 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9b2b\" (UniqueName: \"kubernetes.io/projected/8cf64307-e191-476a-902b-93001adc0b16-kube-api-access-f9b2b\") pod \"node-resolver-dfnnp\" (UID: \"8cf64307-e191-476a-902b-93001adc0b16\") " pod="openshift-dns/node-resolver-dfnnp" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.425372 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8cf64307-e191-476a-902b-93001adc0b16-hosts-file\") pod \"node-resolver-dfnnp\" (UID: \"8cf64307-e191-476a-902b-93001adc0b16\") " pod="openshift-dns/node-resolver-dfnnp" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.438359 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.443492 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.443536 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.443549 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.443579 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.443595 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:59Z","lastTransitionTime":"2026-03-18T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.463776 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.486880 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] 
check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc
073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.506312 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.517889 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.526625 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8cf64307-e191-476a-902b-93001adc0b16-hosts-file\") pod \"node-resolver-dfnnp\" (UID: \"8cf64307-e191-476a-902b-93001adc0b16\") " pod="openshift-dns/node-resolver-dfnnp" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.526679 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9b2b\" (UniqueName: \"kubernetes.io/projected/8cf64307-e191-476a-902b-93001adc0b16-kube-api-access-f9b2b\") pod \"node-resolver-dfnnp\" (UID: \"8cf64307-e191-476a-902b-93001adc0b16\") " pod="openshift-dns/node-resolver-dfnnp" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.526826 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8cf64307-e191-476a-902b-93001adc0b16-hosts-file\") pod \"node-resolver-dfnnp\" (UID: \"8cf64307-e191-476a-902b-93001adc0b16\") " pod="openshift-dns/node-resolver-dfnnp" Mar 18 09:03:59 crc kubenswrapper[4778]: 
I0318 09:03:59.540350 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866b
e30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.546629 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.546690 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.546708 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.546734 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.546753 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:59Z","lastTransitionTime":"2026-03-18T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.550785 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9b2b\" (UniqueName: \"kubernetes.io/projected/8cf64307-e191-476a-902b-93001adc0b16-kube-api-access-f9b2b\") pod \"node-resolver-dfnnp\" (UID: \"8cf64307-e191-476a-902b-93001adc0b16\") " pod="openshift-dns/node-resolver-dfnnp" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.555986 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.570652 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.583599 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.656706 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.656775 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.656795 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.656821 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 
09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.656840 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:59Z","lastTransitionTime":"2026-03-18T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.727263 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-dfnnp" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.760912 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.760966 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.760983 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.761007 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.761024 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:59Z","lastTransitionTime":"2026-03-18T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:03:59 crc kubenswrapper[4778]: W0318 09:03:59.771952 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cf64307_e191_476a_902b_93001adc0b16.slice/crio-3fe873f767a719bf5f4f5ff009d2646ec2788a92e8356376e715112f3ab6128c WatchSource:0}: Error finding container 3fe873f767a719bf5f4f5ff009d2646ec2788a92e8356376e715112f3ab6128c: Status 404 returned error can't find the container with id 3fe873f767a719bf5f4f5ff009d2646ec2788a92e8356376e715112f3ab6128c Mar 18 09:03:59 crc kubenswrapper[4778]: E0318 09:03:59.775727 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:03:59 crc kubenswrapper[4778]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 18 09:03:59 crc kubenswrapper[4778]: set -uo pipefail Mar 18 09:03:59 crc kubenswrapper[4778]: Mar 18 09:03:59 crc kubenswrapper[4778]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 18 09:03:59 crc kubenswrapper[4778]: Mar 18 09:03:59 crc kubenswrapper[4778]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 18 09:03:59 crc kubenswrapper[4778]: HOSTS_FILE="/etc/hosts" Mar 18 09:03:59 crc kubenswrapper[4778]: TEMP_FILE="/etc/hosts.tmp" Mar 18 09:03:59 crc kubenswrapper[4778]: Mar 18 09:03:59 crc kubenswrapper[4778]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 18 09:03:59 crc kubenswrapper[4778]: Mar 18 09:03:59 crc kubenswrapper[4778]: # Make a temporary file with the old hosts file's attributes. Mar 18 09:03:59 crc kubenswrapper[4778]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 18 09:03:59 crc kubenswrapper[4778]: echo "Failed to preserve hosts file. Exiting." 
Mar 18 09:03:59 crc kubenswrapper[4778]: exit 1 Mar 18 09:03:59 crc kubenswrapper[4778]: fi Mar 18 09:03:59 crc kubenswrapper[4778]: Mar 18 09:03:59 crc kubenswrapper[4778]: while true; do Mar 18 09:03:59 crc kubenswrapper[4778]: declare -A svc_ips Mar 18 09:03:59 crc kubenswrapper[4778]: for svc in "${services[@]}"; do Mar 18 09:03:59 crc kubenswrapper[4778]: # Fetch service IP from cluster dns if present. We make several tries Mar 18 09:03:59 crc kubenswrapper[4778]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 18 09:03:59 crc kubenswrapper[4778]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 18 09:03:59 crc kubenswrapper[4778]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 18 09:03:59 crc kubenswrapper[4778]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 09:03:59 crc kubenswrapper[4778]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 09:03:59 crc kubenswrapper[4778]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 09:03:59 crc kubenswrapper[4778]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 18 09:03:59 crc kubenswrapper[4778]: for i in ${!cmds[*]} Mar 18 09:03:59 crc kubenswrapper[4778]: do Mar 18 09:03:59 crc kubenswrapper[4778]: ips=($(eval "${cmds[i]}")) Mar 18 09:03:59 crc kubenswrapper[4778]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 18 09:03:59 crc kubenswrapper[4778]: svc_ips["${svc}"]="${ips[@]}" Mar 18 09:03:59 crc kubenswrapper[4778]: break Mar 18 09:03:59 crc kubenswrapper[4778]: fi Mar 18 09:03:59 crc kubenswrapper[4778]: done Mar 18 09:03:59 crc kubenswrapper[4778]: done Mar 18 09:03:59 crc kubenswrapper[4778]: Mar 18 09:03:59 crc kubenswrapper[4778]: # Update /etc/hosts only if we get valid service IPs Mar 18 09:03:59 crc kubenswrapper[4778]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 18 09:03:59 crc kubenswrapper[4778]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 18 09:03:59 crc kubenswrapper[4778]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 18 09:03:59 crc kubenswrapper[4778]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 18 09:03:59 crc kubenswrapper[4778]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 18 09:03:59 crc kubenswrapper[4778]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 18 09:03:59 crc kubenswrapper[4778]: sleep 60 & wait Mar 18 09:03:59 crc kubenswrapper[4778]: continue Mar 18 09:03:59 crc kubenswrapper[4778]: fi Mar 18 09:03:59 crc kubenswrapper[4778]: Mar 18 09:03:59 crc kubenswrapper[4778]: # Append resolver entries for services Mar 18 09:03:59 crc kubenswrapper[4778]: rc=0 Mar 18 09:03:59 crc kubenswrapper[4778]: for svc in "${!svc_ips[@]}"; do Mar 18 09:03:59 crc kubenswrapper[4778]: for ip in ${svc_ips[${svc}]}; do Mar 18 09:03:59 crc kubenswrapper[4778]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 18 09:03:59 crc kubenswrapper[4778]: done Mar 18 09:03:59 crc kubenswrapper[4778]: done Mar 18 09:03:59 crc kubenswrapper[4778]: if [[ $rc -ne 0 ]]; then Mar 18 09:03:59 crc kubenswrapper[4778]: sleep 60 & wait Mar 18 09:03:59 crc kubenswrapper[4778]: continue Mar 18 09:03:59 crc kubenswrapper[4778]: fi Mar 18 09:03:59 crc kubenswrapper[4778]: Mar 18 09:03:59 crc kubenswrapper[4778]: Mar 18 09:03:59 crc kubenswrapper[4778]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 18 09:03:59 crc kubenswrapper[4778]: # Replace /etc/hosts with our modified version if needed Mar 18 09:03:59 crc kubenswrapper[4778]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 18 09:03:59 crc kubenswrapper[4778]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 18 09:03:59 crc kubenswrapper[4778]: fi Mar 18 09:03:59 crc kubenswrapper[4778]: sleep 60 & wait Mar 18 09:03:59 crc kubenswrapper[4778]: unset svc_ips Mar 18 09:03:59 crc kubenswrapper[4778]: done Mar 18 09:03:59 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f9b2b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-dfnnp_openshift-dns(8cf64307-e191-476a-902b-93001adc0b16): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:03:59 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:03:59 crc kubenswrapper[4778]: E0318 09:03:59.776992 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-dfnnp" podUID="8cf64307-e191-476a-902b-93001adc0b16" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.795856 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-r2lvf"] Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.796602 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.797939 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-xkfx8"] Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.799077 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-56rc7"] Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.799397 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.799673 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.799785 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.800997 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.801070 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.801191 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.802008 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.802161 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.807434 4778 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.807476 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.807499 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.807596 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.807600 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.808024 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.823170 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, 
/tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.829715 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjjcr\" (UniqueName: \"kubernetes.io/projected/7243f983-24d5-48ef-858b-5f4049a82acc-kube-api-access-gjjcr\") pod \"machine-config-daemon-56rc7\" (UID: \"7243f983-24d5-48ef-858b-5f4049a82acc\") " pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.829779 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dce973f3-25e6-4536-87cc-9b46499ad7cf-cni-binary-copy\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.829816 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-run-k8s-cni-cncf-io\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.829867 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-os-release\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: 
\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.829899 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7243f983-24d5-48ef-858b-5f4049a82acc-proxy-tls\") pod \"machine-config-daemon-56rc7\" (UID: \"7243f983-24d5-48ef-858b-5f4049a82acc\") " pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.829933 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-cnibin\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.829961 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-hostroot\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.829990 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-run-multus-certs\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830049 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7243f983-24d5-48ef-858b-5f4049a82acc-rootfs\") pod \"machine-config-daemon-56rc7\" (UID: \"7243f983-24d5-48ef-858b-5f4049a82acc\") 
" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830081 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2zsl\" (UniqueName: \"kubernetes.io/projected/dce973f3-25e6-4536-87cc-9b46499ad7cf-kube-api-access-x2zsl\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830119 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-multus-conf-dir\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830167 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-var-lib-cni-multus\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830252 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830309 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-os-release\") pod \"multus-r2lvf\" (UID: 
\"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830404 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cxhf\" (UniqueName: \"kubernetes.io/projected/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-kube-api-access-4cxhf\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830466 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-run-netns\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830516 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-multus-cni-dir\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830580 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-cnibin\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830637 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830681 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-multus-socket-dir-parent\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830723 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-etc-kubernetes\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830757 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-system-cni-dir\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830787 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830818 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/7243f983-24d5-48ef-858b-5f4049a82acc-mcd-auth-proxy-config\") pod \"machine-config-daemon-56rc7\" (UID: \"7243f983-24d5-48ef-858b-5f4049a82acc\") " pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830863 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/dce973f3-25e6-4536-87cc-9b46499ad7cf-multus-daemon-config\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830914 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-system-cni-dir\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830951 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-var-lib-cni-bin\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.830983 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-var-lib-kubelet\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.841633 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.857415 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.863555 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.863608 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.863623 4778 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.863645 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.863661 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:59Z","lastTransitionTime":"2026-03-18T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.874341 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.887902 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.905288 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.924882 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\
",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.931779 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-system-cni-dir\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.931864 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-var-lib-cni-bin\") pod 
\"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.931904 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-var-lib-kubelet\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.931982 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjjcr\" (UniqueName: \"kubernetes.io/projected/7243f983-24d5-48ef-858b-5f4049a82acc-kube-api-access-gjjcr\") pod \"machine-config-daemon-56rc7\" (UID: \"7243f983-24d5-48ef-858b-5f4049a82acc\") " pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932023 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dce973f3-25e6-4536-87cc-9b46499ad7cf-cni-binary-copy\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932023 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-var-lib-cni-bin\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932067 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-run-k8s-cni-cncf-io\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " 
pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932110 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-var-lib-kubelet\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932153 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-os-release\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932155 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-run-k8s-cni-cncf-io\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932022 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-system-cni-dir\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932186 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7243f983-24d5-48ef-858b-5f4049a82acc-proxy-tls\") pod \"machine-config-daemon-56rc7\" (UID: \"7243f983-24d5-48ef-858b-5f4049a82acc\") " pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932362 
4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-cnibin\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932390 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-hostroot\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932433 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-run-multus-certs\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932454 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-cnibin\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932532 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-os-release\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932565 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-run-multus-certs\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932532 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-hostroot\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932581 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7243f983-24d5-48ef-858b-5f4049a82acc-rootfs\") pod \"machine-config-daemon-56rc7\" (UID: \"7243f983-24d5-48ef-858b-5f4049a82acc\") " pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932684 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7243f983-24d5-48ef-858b-5f4049a82acc-rootfs\") pod \"machine-config-daemon-56rc7\" (UID: \"7243f983-24d5-48ef-858b-5f4049a82acc\") " pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932685 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2zsl\" (UniqueName: \"kubernetes.io/projected/dce973f3-25e6-4536-87cc-9b46499ad7cf-kube-api-access-x2zsl\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932730 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-multus-conf-dir\") pod \"multus-r2lvf\" (UID: 
\"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932774 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-var-lib-cni-multus\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932802 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-multus-conf-dir\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932814 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932837 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-var-lib-cni-multus\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932854 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-os-release\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc 
kubenswrapper[4778]: I0318 09:03:59.932896 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cxhf\" (UniqueName: \"kubernetes.io/projected/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-kube-api-access-4cxhf\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932930 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-run-netns\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932964 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-multus-cni-dir\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.932993 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-os-release\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933002 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-cnibin\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933040 4778 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-cnibin\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933068 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-cni-binary-copy\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933081 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-host-run-netns\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933093 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-multus-socket-dir-parent\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933125 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-multus-cni-dir\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933129 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-multus-socket-dir-parent\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933168 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-etc-kubernetes\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933241 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dce973f3-25e6-4536-87cc-9b46499ad7cf-etc-kubernetes\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933248 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-system-cni-dir\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933292 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933334 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/7243f983-24d5-48ef-858b-5f4049a82acc-mcd-auth-proxy-config\") pod \"machine-config-daemon-56rc7\" (UID: \"7243f983-24d5-48ef-858b-5f4049a82acc\") " pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933355 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-system-cni-dir\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.933378 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/dce973f3-25e6-4536-87cc-9b46499ad7cf-multus-daemon-config\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.934261 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dce973f3-25e6-4536-87cc-9b46499ad7cf-cni-binary-copy\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.934414 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.934531 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/dce973f3-25e6-4536-87cc-9b46499ad7cf-multus-daemon-config\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.934577 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7243f983-24d5-48ef-858b-5f4049a82acc-mcd-auth-proxy-config\") pod \"machine-config-daemon-56rc7\" (UID: \"7243f983-24d5-48ef-858b-5f4049a82acc\") " pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.935146 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.935170 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-cni-binary-copy\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.937787 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.938370 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7243f983-24d5-48ef-858b-5f4049a82acc-proxy-tls\") pod \"machine-config-daemon-56rc7\" (UID: \"7243f983-24d5-48ef-858b-5f4049a82acc\") " pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.950146 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjjcr\" (UniqueName: \"kubernetes.io/projected/7243f983-24d5-48ef-858b-5f4049a82acc-kube-api-access-gjjcr\") pod \"machine-config-daemon-56rc7\" (UID: \"7243f983-24d5-48ef-858b-5f4049a82acc\") " pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.952123 4778 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.958458 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2zsl\" (UniqueName: \"kubernetes.io/projected/dce973f3-25e6-4536-87cc-9b46499ad7cf-kube-api-access-x2zsl\") pod \"multus-r2lvf\" (UID: \"dce973f3-25e6-4536-87cc-9b46499ad7cf\") " pod="openshift-multus/multus-r2lvf" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.962408 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cxhf\" (UniqueName: \"kubernetes.io/projected/b1698c21-24a7-4338-a0ad-dd110c1ba2f2-kube-api-access-4cxhf\") pod \"multus-additional-cni-plugins-xkfx8\" (UID: \"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\") " pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.966557 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.966597 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:03:59 crc 
kubenswrapper[4778]: I0318 09:03:59.966613 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.966638 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.966654 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:03:59Z","lastTransitionTime":"2026-03-18T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.967723 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.983049 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:03:59 crc kubenswrapper[4778]: I0318 09:03:59.998377 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.010779 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.026102 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.046766 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.062741 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.070090 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.070152 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.070167 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.070188 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.070224 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:00Z","lastTransitionTime":"2026-03-18T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.076916 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.095707 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.114463 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.127464 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d8
3d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.132044 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-r2lvf" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.144424 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.146668 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:04:00 crc kubenswrapper[4778]: W0318 09:04:00.158361 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddce973f3_25e6_4536_87cc_9b46499ad7cf.slice/crio-2b56e0b581b82099bec21992e25b801c274f15b004cee836d238158d8adaa942 WatchSource:0}: Error finding container 2b56e0b581b82099bec21992e25b801c274f15b004cee836d238158d8adaa942: Status 404 returned error can't find the container with id 2b56e0b581b82099bec21992e25b801c274f15b004cee836d238158d8adaa942 Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.159515 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.160505 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.170488 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:00 crc kubenswrapper[4778]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 18 09:04:00 crc kubenswrapper[4778]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 18 09:04:00 crc kubenswrapper[4778]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x2zsl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-r2lvf_openshift-multus(dce973f3-25e6-4536-87cc-9b46499ad7cf): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:00 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.172106 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-r2lvf" podUID="dce973f3-25e6-4536-87cc-9b46499ad7cf" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.173058 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.173160 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.173176 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.173208 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.173219 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:00Z","lastTransitionTime":"2026-03-18T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.176288 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gjjcr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.176470 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g2qth"] Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.178430 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.179747 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gjjcr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.180547 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.180974 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.181451 4778 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.181719 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.181740 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.182150 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.182393 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.182482 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 09:04:00 crc kubenswrapper[4778]: W0318 09:04:00.184663 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1698c21_24a7_4338_a0ad_dd110c1ba2f2.slice/crio-9ab054141923d09f0e8931cd7a26bdf8d6372005e8578ff1891fddf501c4d056 WatchSource:0}: Error finding container 9ab054141923d09f0e8931cd7a26bdf8d6372005e8578ff1891fddf501c4d056: Status 404 returned error can't find the container with id 9ab054141923d09f0e8931cd7a26bdf8d6372005e8578ff1891fddf501c4d056 Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.186222 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.186238 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.186339 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.186410 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.186510 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.186612 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.188070 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4cxhf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-xkfx8_openshift-multus(b1698c21-24a7-4338-a0ad-dd110c1ba2f2): 
CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.189669 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" podUID="b1698c21-24a7-4338-a0ad-dd110c1ba2f2" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.192561 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d8
3d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.203971 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.218325 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.228822 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.236675 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-etc-openvswitch\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.236743 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-ovn\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.236774 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-cni-bin\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.236800 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8g6f\" (UniqueName: \"kubernetes.io/projected/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-kube-api-access-b8g6f\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.236926 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-kubelet\") pod 
\"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237020 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-systemd-units\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237049 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-systemd\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237068 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-var-lib-openvswitch\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237101 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-slash\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237122 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237139 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovnkube-script-lib\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237237 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-run-netns\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237264 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-env-overrides\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237283 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-openvswitch\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237364 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-node-log\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237443 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-log-socket\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237463 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-cni-netd\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237728 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovn-node-metrics-cert\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237752 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-run-ovn-kubernetes\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.237784 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovnkube-config\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.239397 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.250656 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.263193 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.276074 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.276116 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.276125 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.276142 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.276154 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:00Z","lastTransitionTime":"2026-03-18T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.279799 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.298219 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.316447 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.330231 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.338879 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-kubelet\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.338985 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-systemd-units\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339024 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-systemd\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339056 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-var-lib-openvswitch\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339098 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339134 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-slash\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339248 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovnkube-script-lib\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339284 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-run-netns\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339319 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-env-overrides\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339337 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339430 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-openvswitch\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339354 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-openvswitch\") pod \"ovnkube-node-g2qth\" (UID: 
\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339506 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-node-log\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339518 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339547 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-log-socket\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339574 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-kubelet\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339583 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-cni-netd\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339607 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-systemd-units\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339621 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovn-node-metrics-cert\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339641 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-systemd\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339658 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-run-ovn-kubernetes\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339671 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-var-lib-openvswitch\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc 
kubenswrapper[4778]: I0318 09:04:00.339703 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovnkube-config\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339742 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-etc-openvswitch\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339777 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-ovn\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339812 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8g6f\" (UniqueName: \"kubernetes.io/projected/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-kube-api-access-b8g6f\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339852 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-cni-bin\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339938 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-cni-bin\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.339990 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-node-log\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.340037 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-log-socket\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.340083 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-cni-netd\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.340086 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-slash\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.340838 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-run-netns\") 
pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.340886 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-etc-openvswitch\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.340995 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-env-overrides\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.341060 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-ovn\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.341505 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovnkube-config\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.341604 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-run-ovn-kubernetes\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.341800 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovnkube-script-lib\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.346053 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovn-node-metrics-cert\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.360676 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.363002 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8g6f\" (UniqueName: \"kubernetes.io/projected/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-kube-api-access-b8g6f\") pod \"ovnkube-node-g2qth\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.378893 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.378949 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.378958 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.378982 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.378994 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:00Z","lastTransitionTime":"2026-03-18T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.482926 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.483013 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.483037 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.483064 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.483083 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:00Z","lastTransitionTime":"2026-03-18T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.501710 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:00 crc kubenswrapper[4778]: W0318 09:04:00.520167 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef97d63e_1caf_44c9_ac0c_9b03dbd05113.slice/crio-0662ffa0aa3c3fc99a955a63b995acc6a492d7cf6b911968c3e8039e26cdcb7f WatchSource:0}: Error finding container 0662ffa0aa3c3fc99a955a63b995acc6a492d7cf6b911968c3e8039e26cdcb7f: Status 404 returned error can't find the container with id 0662ffa0aa3c3fc99a955a63b995acc6a492d7cf6b911968c3e8039e26cdcb7f Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.524141 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:00 crc kubenswrapper[4778]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 18 09:04:00 crc kubenswrapper[4778]: apiVersion: v1 Mar 18 09:04:00 crc kubenswrapper[4778]: clusters: Mar 18 09:04:00 crc kubenswrapper[4778]: - cluster: Mar 18 09:04:00 crc kubenswrapper[4778]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 18 09:04:00 crc kubenswrapper[4778]: server: https://api-int.crc.testing:6443 Mar 18 09:04:00 crc kubenswrapper[4778]: name: default-cluster Mar 18 09:04:00 crc kubenswrapper[4778]: contexts: Mar 18 09:04:00 crc kubenswrapper[4778]: - context: Mar 18 09:04:00 crc kubenswrapper[4778]: cluster: default-cluster Mar 18 09:04:00 crc kubenswrapper[4778]: namespace: default Mar 18 09:04:00 crc kubenswrapper[4778]: user: default-auth Mar 18 09:04:00 crc kubenswrapper[4778]: name: default-context Mar 18 09:04:00 crc kubenswrapper[4778]: current-context: default-context Mar 18 09:04:00 crc kubenswrapper[4778]: kind: Config Mar 18 09:04:00 crc kubenswrapper[4778]: preferences: {} Mar 18 09:04:00 crc kubenswrapper[4778]: 
users: Mar 18 09:04:00 crc kubenswrapper[4778]: - name: default-auth Mar 18 09:04:00 crc kubenswrapper[4778]: user: Mar 18 09:04:00 crc kubenswrapper[4778]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 09:04:00 crc kubenswrapper[4778]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 09:04:00 crc kubenswrapper[4778]: EOF Mar 18 09:04:00 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b8g6f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:00 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.526261 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 
09:04:00.589078 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.589153 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.589175 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.589234 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.589269 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:00Z","lastTransitionTime":"2026-03-18T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.669875 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"0662ffa0aa3c3fc99a955a63b995acc6a492d7cf6b911968c3e8039e26cdcb7f"} Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.672721 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" event={"ID":"b1698c21-24a7-4338-a0ad-dd110c1ba2f2","Type":"ContainerStarted","Data":"9ab054141923d09f0e8931cd7a26bdf8d6372005e8578ff1891fddf501c4d056"} Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.672940 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:00 crc kubenswrapper[4778]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 18 09:04:00 crc kubenswrapper[4778]: apiVersion: v1 Mar 18 09:04:00 crc kubenswrapper[4778]: clusters: Mar 18 09:04:00 crc kubenswrapper[4778]: - cluster: Mar 18 09:04:00 crc kubenswrapper[4778]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 18 09:04:00 crc kubenswrapper[4778]: server: https://api-int.crc.testing:6443 Mar 18 09:04:00 crc kubenswrapper[4778]: name: default-cluster Mar 18 09:04:00 crc kubenswrapper[4778]: contexts: Mar 18 09:04:00 crc kubenswrapper[4778]: - context: Mar 18 09:04:00 crc kubenswrapper[4778]: cluster: default-cluster Mar 18 09:04:00 crc kubenswrapper[4778]: namespace: default Mar 18 09:04:00 crc kubenswrapper[4778]: user: default-auth Mar 18 09:04:00 crc kubenswrapper[4778]: name: default-context Mar 18 09:04:00 crc kubenswrapper[4778]: current-context: default-context Mar 18 09:04:00 crc kubenswrapper[4778]: kind: Config Mar 18 09:04:00 crc 
kubenswrapper[4778]: preferences: {} Mar 18 09:04:00 crc kubenswrapper[4778]: users: Mar 18 09:04:00 crc kubenswrapper[4778]: - name: default-auth Mar 18 09:04:00 crc kubenswrapper[4778]: user: Mar 18 09:04:00 crc kubenswrapper[4778]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 09:04:00 crc kubenswrapper[4778]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 09:04:00 crc kubenswrapper[4778]: EOF Mar 18 09:04:00 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b8g6f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:00 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.675531 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"8c20d90ff1890fad7b9e40ab7f878094324a004a1781e12dcf95fc402ccd00c9"} Mar 18 09:04:00 crc 
kubenswrapper[4778]: E0318 09:04:00.676229 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.677143 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4cxhf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:Fallback
ToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-xkfx8_openshift-multus(b1698c21-24a7-4338-a0ad-dd110c1ba2f2): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.678401 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" podUID="b1698c21-24a7-4338-a0ad-dd110c1ba2f2" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.679631 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r2lvf" event={"ID":"dce973f3-25e6-4536-87cc-9b46499ad7cf","Type":"ContainerStarted","Data":"2b56e0b581b82099bec21992e25b801c274f15b004cee836d238158d8adaa942"} Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.681715 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:00 crc kubenswrapper[4778]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 18 09:04:00 crc kubenswrapper[4778]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 18 09:04:00 crc kubenswrapper[4778]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x2zsl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-r2lvf_openshift-multus(dce973f3-25e6-4536-87cc-9b46499ad7cf): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:00 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.681937 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gjjcr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.683518 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-r2lvf" podUID="dce973f3-25e6-4536-87cc-9b46499ad7cf" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.684129 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dfnnp" event={"ID":"8cf64307-e191-476a-902b-93001adc0b16","Type":"ContainerStarted","Data":"3fe873f767a719bf5f4f5ff009d2646ec2788a92e8356376e715112f3ab6128c"} Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.684876 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gjjcr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.686687 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.687661 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:00 crc kubenswrapper[4778]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 18 09:04:00 crc kubenswrapper[4778]: set -uo pipefail Mar 18 09:04:00 crc kubenswrapper[4778]: Mar 18 09:04:00 crc kubenswrapper[4778]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 18 09:04:00 crc kubenswrapper[4778]: Mar 18 09:04:00 crc kubenswrapper[4778]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 18 09:04:00 crc kubenswrapper[4778]: HOSTS_FILE="/etc/hosts" Mar 18 09:04:00 crc kubenswrapper[4778]: TEMP_FILE="/etc/hosts.tmp" Mar 18 09:04:00 crc kubenswrapper[4778]: Mar 18 09:04:00 crc kubenswrapper[4778]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 18 09:04:00 crc kubenswrapper[4778]: Mar 18 09:04:00 crc kubenswrapper[4778]: # Make a temporary file with the old hosts file's attributes. Mar 18 09:04:00 crc kubenswrapper[4778]: if ! 
cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 18 09:04:00 crc kubenswrapper[4778]: echo "Failed to preserve hosts file. Exiting." Mar 18 09:04:00 crc kubenswrapper[4778]: exit 1 Mar 18 09:04:00 crc kubenswrapper[4778]: fi Mar 18 09:04:00 crc kubenswrapper[4778]: Mar 18 09:04:00 crc kubenswrapper[4778]: while true; do Mar 18 09:04:00 crc kubenswrapper[4778]: declare -A svc_ips Mar 18 09:04:00 crc kubenswrapper[4778]: for svc in "${services[@]}"; do Mar 18 09:04:00 crc kubenswrapper[4778]: # Fetch service IP from cluster dns if present. We make several tries Mar 18 09:04:00 crc kubenswrapper[4778]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 18 09:04:00 crc kubenswrapper[4778]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 18 09:04:00 crc kubenswrapper[4778]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 18 09:04:00 crc kubenswrapper[4778]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 09:04:00 crc kubenswrapper[4778]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 09:04:00 crc kubenswrapper[4778]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 09:04:00 crc kubenswrapper[4778]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 18 09:04:00 crc kubenswrapper[4778]: for i in ${!cmds[*]} Mar 18 09:04:00 crc kubenswrapper[4778]: do Mar 18 09:04:00 crc kubenswrapper[4778]: ips=($(eval "${cmds[i]}")) Mar 18 09:04:00 crc kubenswrapper[4778]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 18 09:04:00 crc kubenswrapper[4778]: svc_ips["${svc}"]="${ips[@]}" Mar 18 09:04:00 crc kubenswrapper[4778]: break Mar 18 09:04:00 crc kubenswrapper[4778]: fi Mar 18 09:04:00 crc kubenswrapper[4778]: done Mar 18 09:04:00 crc kubenswrapper[4778]: done Mar 18 09:04:00 crc kubenswrapper[4778]: Mar 18 09:04:00 crc kubenswrapper[4778]: # Update /etc/hosts only if we get valid service IPs Mar 18 09:04:00 crc kubenswrapper[4778]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 18 09:04:00 crc kubenswrapper[4778]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 18 09:04:00 crc kubenswrapper[4778]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 18 09:04:00 crc kubenswrapper[4778]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 18 09:04:00 crc kubenswrapper[4778]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 18 09:04:00 crc kubenswrapper[4778]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 18 09:04:00 crc kubenswrapper[4778]: sleep 60 & wait Mar 18 09:04:00 crc kubenswrapper[4778]: continue Mar 18 09:04:00 crc kubenswrapper[4778]: fi Mar 18 09:04:00 crc kubenswrapper[4778]: Mar 18 09:04:00 crc kubenswrapper[4778]: # Append resolver entries for services Mar 18 09:04:00 crc kubenswrapper[4778]: rc=0 Mar 18 09:04:00 crc kubenswrapper[4778]: for svc in "${!svc_ips[@]}"; do Mar 18 09:04:00 crc kubenswrapper[4778]: for ip in ${svc_ips[${svc}]}; do Mar 18 09:04:00 crc kubenswrapper[4778]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 18 09:04:00 crc kubenswrapper[4778]: done Mar 18 09:04:00 crc kubenswrapper[4778]: done Mar 18 09:04:00 crc kubenswrapper[4778]: if [[ $rc -ne 0 ]]; then Mar 18 09:04:00 crc kubenswrapper[4778]: sleep 60 & wait Mar 18 09:04:00 crc kubenswrapper[4778]: continue Mar 18 09:04:00 crc kubenswrapper[4778]: fi Mar 18 09:04:00 crc kubenswrapper[4778]: Mar 18 09:04:00 crc kubenswrapper[4778]: Mar 18 09:04:00 crc kubenswrapper[4778]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 18 09:04:00 crc kubenswrapper[4778]: # Replace /etc/hosts with our modified version if needed Mar 18 09:04:00 crc kubenswrapper[4778]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 18 09:04:00 crc kubenswrapper[4778]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 18 09:04:00 crc kubenswrapper[4778]: fi Mar 18 09:04:00 crc kubenswrapper[4778]: sleep 60 & wait Mar 18 09:04:00 crc kubenswrapper[4778]: unset svc_ips Mar 18 09:04:00 crc kubenswrapper[4778]: done Mar 18 09:04:00 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f9b2b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-dfnnp_openshift-dns(8cf64307-e191-476a-902b-93001adc0b16): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:00 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:00 crc kubenswrapper[4778]: E0318 09:04:00.688862 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-dfnnp" podUID="8cf64307-e191-476a-902b-93001adc0b16" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.691836 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.692715 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.692803 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.692831 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.692864 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.692926 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:00Z","lastTransitionTime":"2026-03-18T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.707855 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.722977 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.741861 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d8
3d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.758817 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.775119 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.788703 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.795863 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.795915 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.795929 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.795952 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.795966 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:00Z","lastTransitionTime":"2026-03-18T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.815945 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.834490 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.859660 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.877755 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.892921 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.899556 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.899631 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.899647 4778 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.899669 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.899687 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:00Z","lastTransitionTime":"2026-03-18T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.913538 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.936306 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.954060 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.968933 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4778]: I0318 09:04:00.995755 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.002916 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.002980 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.003001 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.003031 4778 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.003051 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:01Z","lastTransitionTime":"2026-03-18T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.008947 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.024757 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d8
3d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.038516 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.050313 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.060602 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.072832 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.087281 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.098952 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.107898 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.107949 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.107960 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.107979 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.107992 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:01Z","lastTransitionTime":"2026-03-18T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.116168 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.215703 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.215823 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.215883 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.215912 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.215932 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:01Z","lastTransitionTime":"2026-03-18T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.319989 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.320052 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.320071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.320096 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.320118 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:01Z","lastTransitionTime":"2026-03-18T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.423657 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.423738 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.423756 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.423781 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.423799 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:01Z","lastTransitionTime":"2026-03-18T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.526547 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.526599 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.526610 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.526628 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.526640 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:01Z","lastTransitionTime":"2026-03-18T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.630106 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.630494 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.630672 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.630874 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.631042 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:01Z","lastTransitionTime":"2026-03-18T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.734459 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.734514 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.734531 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.734555 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.734573 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:01Z","lastTransitionTime":"2026-03-18T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.837776 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.837834 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.837851 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.837873 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.837891 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:01Z","lastTransitionTime":"2026-03-18T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.940903 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.940966 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.940982 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.941007 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.941024 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:01Z","lastTransitionTime":"2026-03-18T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:01 crc kubenswrapper[4778]: I0318 09:04:01.957532 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:04:01 crc kubenswrapper[4778]: E0318 09:04:01.957782 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 09:04:33.957747667 +0000 UTC m=+140.532492537 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.044422 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.044807 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.044929 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.045071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.045226 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:02Z","lastTransitionTime":"2026-03-18T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.058460 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.058529 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.058577 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.058610 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.058770 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.058846 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:34.058823234 +0000 UTC m=+140.633568104 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.059116 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.059172 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:34.059158933 +0000 UTC m=+140.633903803 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.059383 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.059433 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.059451 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.059537 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:34.059510382 +0000 UTC m=+140.634255232 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.059657 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.059722 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.059752 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.059891 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:34.059850291 +0000 UTC m=+140.634595311 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.148346 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.148431 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.148452 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.148485 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.148510 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:02Z","lastTransitionTime":"2026-03-18T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.187238 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.187330 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.187415 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.187581 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.187909 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 09:04:02 crc kubenswrapper[4778]: E0318 09:04:02.187988 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.252521 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.252581 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.252598 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.252623 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.252644 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:02Z","lastTransitionTime":"2026-03-18T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.355912 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.355987 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.356009 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.356039 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.356062 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:02Z","lastTransitionTime":"2026-03-18T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.459310 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.459387 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.459408 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.459438 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.459457 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:02Z","lastTransitionTime":"2026-03-18T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.563888 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.564289 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.564479 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.564672 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.564817 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:02Z","lastTransitionTime":"2026-03-18T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.669475 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.670960 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.671004 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.671030 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.671047 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:02Z","lastTransitionTime":"2026-03-18T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.774311 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.774383 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.774402 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.774429 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.774447 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:02Z","lastTransitionTime":"2026-03-18T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.878435 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.878510 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.878532 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.878570 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.878595 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:02Z","lastTransitionTime":"2026-03-18T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.981580 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.981661 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.981682 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.981721 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:02 crc kubenswrapper[4778]: I0318 09:04:02.982181 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:02Z","lastTransitionTime":"2026-03-18T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.085062 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.085496 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.085646 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.085787 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.085956 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.102813 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.102910 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.102929 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.102954 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.102978 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:04:03 crc kubenswrapper[4778]: E0318 09:04:03.119561 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.126655 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.126742 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.126767 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.126798 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.126823 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:04:03 crc kubenswrapper[4778]: E0318 09:04:03.143796 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.148568 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.148731 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.148834 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.148939 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.149025 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:03 crc kubenswrapper[4778]: E0318 09:04:03.166157 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.171692 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.171856 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.171942 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.172053 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.172132 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:03 crc kubenswrapper[4778]: E0318 09:04:03.186054 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.187188 4778 scope.go:117] "RemoveContainer" containerID="721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.199748 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.199799 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.199834 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.199851 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.199865 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:03 crc kubenswrapper[4778]: E0318 09:04:03.219857 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: E0318 09:04:03.220364 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.223368 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.223462 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.223479 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.223594 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.223633 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.326927 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.326992 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.327008 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.327029 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.327042 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.429810 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.429868 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.429887 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.429916 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.429935 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.533768 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.533837 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.533853 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.533880 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.533901 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.605732 4778 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.637329 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.637384 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.637403 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.637431 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.637455 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.696896 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.699813 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8"} Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.700496 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.714503 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with 
unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.732333 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.744496 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.744535 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.744546 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.744561 4778 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.744572 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.765226 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip
\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.781549 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.797040 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.812183 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.825592 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.841711 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.847331 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.847392 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.847410 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.847436 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.847454 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.857784 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.874981 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.887846 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.901222 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.911623 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.950866 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.950902 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.950914 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.950929 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:03 crc kubenswrapper[4778]: I0318 09:04:03.950938 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:03Z","lastTransitionTime":"2026-03-18T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.052999 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.053040 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.053050 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.053066 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.053077 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:04Z","lastTransitionTime":"2026-03-18T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.156290 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.156331 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.156340 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.156353 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.156362 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:04Z","lastTransitionTime":"2026-03-18T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.186258 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.186333 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:04 crc kubenswrapper[4778]: E0318 09:04:04.186387 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.186400 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:04 crc kubenswrapper[4778]: E0318 09:04:04.186487 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:04 crc kubenswrapper[4778]: E0318 09:04:04.186641 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.202134 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.218931 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.231916 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.251058 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.258324 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.258377 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.258391 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.258410 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.258423 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:04Z","lastTransitionTime":"2026-03-18T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.280866 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.304286 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.315760 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.324466 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.340430 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.360160 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.362058 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.362147 4778 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.362170 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.362233 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.362257 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:04Z","lastTransitionTime":"2026-03-18T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.376078 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.388959 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.399659 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.465695 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.465754 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.465766 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.465782 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.465794 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:04Z","lastTransitionTime":"2026-03-18T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.569485 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.569540 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.569549 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.569567 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.569578 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:04Z","lastTransitionTime":"2026-03-18T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.672527 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.672593 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.672603 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.672619 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.672630 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:04Z","lastTransitionTime":"2026-03-18T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.775262 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.775351 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.775369 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.775406 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.775428 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:04Z","lastTransitionTime":"2026-03-18T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.878936 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.878983 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.878993 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.879014 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.879027 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:04Z","lastTransitionTime":"2026-03-18T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.982166 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.982244 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.982254 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.982272 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:04 crc kubenswrapper[4778]: I0318 09:04:04.982284 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:04Z","lastTransitionTime":"2026-03-18T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.085319 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.085378 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.085388 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.085411 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.085431 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:05Z","lastTransitionTime":"2026-03-18T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.188620 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.188690 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.188700 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.188722 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.188735 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:05Z","lastTransitionTime":"2026-03-18T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.291859 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.291916 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.291927 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.291949 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.291963 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:05Z","lastTransitionTime":"2026-03-18T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.395993 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.396052 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.396065 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.396084 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.396100 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:05Z","lastTransitionTime":"2026-03-18T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.500419 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.500475 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.500489 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.500519 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.500536 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:05Z","lastTransitionTime":"2026-03-18T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.605128 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.605258 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.605288 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.605321 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.605346 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:05Z","lastTransitionTime":"2026-03-18T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.708577 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.708687 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.708709 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.708778 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.708798 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:05Z","lastTransitionTime":"2026-03-18T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.812487 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.812578 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.812607 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.812637 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.812658 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:05Z","lastTransitionTime":"2026-03-18T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.917080 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.917159 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.917175 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.917219 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:05 crc kubenswrapper[4778]: I0318 09:04:05.917235 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:05Z","lastTransitionTime":"2026-03-18T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.021274 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.021354 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.021383 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.021414 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.021432 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:06Z","lastTransitionTime":"2026-03-18T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.098354 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-9f2bp"] Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.098834 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-9f2bp" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.103907 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.105492 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.106608 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.106834 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.126811 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.126873 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.126890 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.126919 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.126937 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:06Z","lastTransitionTime":"2026-03-18T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.129519 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.139057 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.157160 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.171567 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.186468 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.186542 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:06 crc kubenswrapper[4778]: E0318 09:04:06.186605 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.186709 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:06 crc kubenswrapper[4778]: E0318 09:04:06.186756 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:06 crc kubenswrapper[4778]: E0318 09:04:06.186925 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.193828 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.210097 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7-serviceca\") pod \"node-ca-9f2bp\" (UID: \"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\") " pod="openshift-image-registry/node-ca-9f2bp" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.210180 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7-host\") pod \"node-ca-9f2bp\" (UID: \"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\") " pod="openshift-image-registry/node-ca-9f2bp" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.210252 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grrhl\" (UniqueName: \"kubernetes.io/projected/69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7-kube-api-access-grrhl\") pod \"node-ca-9f2bp\" (UID: \"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\") " pod="openshift-image-registry/node-ca-9f2bp" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.218106 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.229228 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.229285 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.229299 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.229317 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.229330 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:06Z","lastTransitionTime":"2026-03-18T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.232001 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.242646 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.259584 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.271440 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.283103 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.298342 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.310486 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.311305 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7-serviceca\") pod \"node-ca-9f2bp\" (UID: \"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\") " pod="openshift-image-registry/node-ca-9f2bp" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.311389 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7-host\") pod \"node-ca-9f2bp\" (UID: \"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\") " pod="openshift-image-registry/node-ca-9f2bp" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.311425 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-grrhl\" (UniqueName: \"kubernetes.io/projected/69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7-kube-api-access-grrhl\") pod \"node-ca-9f2bp\" (UID: \"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\") " pod="openshift-image-registry/node-ca-9f2bp" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.311517 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7-host\") pod \"node-ca-9f2bp\" (UID: \"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\") " pod="openshift-image-registry/node-ca-9f2bp" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.312348 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7-serviceca\") pod \"node-ca-9f2bp\" (UID: \"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\") " pod="openshift-image-registry/node-ca-9f2bp" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.322841 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.332653 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grrhl\" (UniqueName: \"kubernetes.io/projected/69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7-kube-api-access-grrhl\") pod \"node-ca-9f2bp\" (UID: \"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\") " pod="openshift-image-registry/node-ca-9f2bp" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.332680 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.332762 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.332781 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.332805 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.332820 4778 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:06Z","lastTransitionTime":"2026-03-18T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.426590 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9f2bp" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.437234 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.437278 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.437294 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.437317 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.437333 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:06Z","lastTransitionTime":"2026-03-18T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 18 09:04:06 crc kubenswrapper[4778]: W0318 09:04:06.445457 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69b256e9_a9ba_4e2e_9a39_6d9ffa7fa6b7.slice/crio-d8552cce5c839edf2a4add9843ca9cf1c7697ee87571e7fc2ca14bc8665322e4 WatchSource:0}: Error finding container d8552cce5c839edf2a4add9843ca9cf1c7697ee87571e7fc2ca14bc8665322e4: Status 404 returned error can't find the container with id d8552cce5c839edf2a4add9843ca9cf1c7697ee87571e7fc2ca14bc8665322e4
Mar 18 09:04:06 crc kubenswrapper[4778]: E0318 09:04:06.449031 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 18 09:04:06 crc kubenswrapper[4778]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM
Mar 18 09:04:06 crc kubenswrapper[4778]: while [ true ];
Mar 18 09:04:06 crc kubenswrapper[4778]: do
Mar 18 09:04:06 crc kubenswrapper[4778]: for f in $(ls /tmp/serviceca); do
Mar 18 09:04:06 crc kubenswrapper[4778]: echo $f
Mar 18 09:04:06 crc kubenswrapper[4778]: ca_file_path="/tmp/serviceca/${f}"
Mar 18 09:04:06 crc kubenswrapper[4778]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/')
Mar 18 09:04:06 crc kubenswrapper[4778]: reg_dir_path="/etc/docker/certs.d/${f}"
Mar 18 09:04:06 crc kubenswrapper[4778]: if [ -e "${reg_dir_path}" ]; then
Mar 18 09:04:06 crc kubenswrapper[4778]: cp -u $ca_file_path $reg_dir_path/ca.crt
Mar 18 09:04:06 crc kubenswrapper[4778]: else
Mar 18 09:04:06 crc kubenswrapper[4778]: mkdir $reg_dir_path
Mar 18 09:04:06 crc kubenswrapper[4778]: cp $ca_file_path $reg_dir_path/ca.crt
Mar 18 09:04:06 crc kubenswrapper[4778]: fi
Mar 18 09:04:06 crc kubenswrapper[4778]: done
Mar 18 09:04:06 crc kubenswrapper[4778]: for d in $(ls /etc/docker/certs.d); do
Mar 18 09:04:06 crc kubenswrapper[4778]: echo $d
Mar 18 09:04:06 crc kubenswrapper[4778]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./')
Mar 18 09:04:06 crc kubenswrapper[4778]: reg_conf_path="/tmp/serviceca/${dp}"
Mar 18 09:04:06 crc kubenswrapper[4778]: if [ ! -e "${reg_conf_path}" ]; then
Mar 18 09:04:06 crc kubenswrapper[4778]: rm -rf /etc/docker/certs.d/$d
Mar 18 09:04:06 crc kubenswrapper[4778]: fi
Mar 18 09:04:06 crc kubenswrapper[4778]: done
Mar 18 09:04:06 crc kubenswrapper[4778]: sleep 60 & wait ${!}
Mar 18 09:04:06 crc kubenswrapper[4778]: done
Mar 18 09:04:06 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-grrhl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-9f2bp_openshift-image-registry(69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 18 09:04:06 crc kubenswrapper[4778]: > logger="UnhandledError"
Mar 18 09:04:06 crc kubenswrapper[4778]: E0318 09:04:06.451343 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-9f2bp" podUID="69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7"
Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.540948 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.541017 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.541039 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.541067 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.541090 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:06Z","lastTransitionTime":"2026-03-18T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.644082 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.644134 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.644149 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.644169 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.644185 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:06Z","lastTransitionTime":"2026-03-18T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.710684 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9f2bp" event={"ID":"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7","Type":"ContainerStarted","Data":"d8552cce5c839edf2a4add9843ca9cf1c7697ee87571e7fc2ca14bc8665322e4"}
Mar 18 09:04:06 crc kubenswrapper[4778]: E0318 09:04:06.712992 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 18 09:04:06 crc kubenswrapper[4778]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM
Mar 18 09:04:06 crc kubenswrapper[4778]: while [ true ];
Mar 18 09:04:06 crc kubenswrapper[4778]: do
Mar 18 09:04:06 crc kubenswrapper[4778]: for f in $(ls /tmp/serviceca); do
Mar 18 09:04:06 crc kubenswrapper[4778]: echo $f
Mar 18 09:04:06 crc kubenswrapper[4778]: ca_file_path="/tmp/serviceca/${f}"
Mar 18 09:04:06 crc kubenswrapper[4778]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/')
Mar 18 09:04:06 crc kubenswrapper[4778]: reg_dir_path="/etc/docker/certs.d/${f}"
Mar 18 09:04:06 crc kubenswrapper[4778]: if [ -e "${reg_dir_path}" ]; then
Mar 18 09:04:06 crc kubenswrapper[4778]: cp -u $ca_file_path $reg_dir_path/ca.crt
Mar 18 09:04:06 crc kubenswrapper[4778]: else
Mar 18 09:04:06 crc kubenswrapper[4778]: mkdir $reg_dir_path
Mar 18 09:04:06 crc kubenswrapper[4778]: cp $ca_file_path $reg_dir_path/ca.crt
Mar 18 09:04:06 crc kubenswrapper[4778]: fi
Mar 18 09:04:06 crc kubenswrapper[4778]: done
Mar 18 09:04:06 crc kubenswrapper[4778]: for d in $(ls /etc/docker/certs.d); do
Mar 18 09:04:06 crc kubenswrapper[4778]: echo $d
Mar 18 09:04:06 crc kubenswrapper[4778]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./')
Mar 18 09:04:06 crc kubenswrapper[4778]: reg_conf_path="/tmp/serviceca/${dp}"
Mar 18 09:04:06 crc kubenswrapper[4778]: if [ ! -e "${reg_conf_path}" ]; then
Mar 18 09:04:06 crc kubenswrapper[4778]: rm -rf /etc/docker/certs.d/$d
Mar 18 09:04:06 crc kubenswrapper[4778]: fi
Mar 18 09:04:06 crc kubenswrapper[4778]: done
Mar 18 09:04:06 crc kubenswrapper[4778]: sleep 60 & wait ${!}
Mar 18 09:04:06 crc kubenswrapper[4778]: done
Mar 18 09:04:06 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-grrhl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-9f2bp_openshift-image-registry(69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 18 09:04:06 crc kubenswrapper[4778]: > logger="UnhandledError"
Mar 18 09:04:06 crc kubenswrapper[4778]: E0318 09:04:06.714315 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-9f2bp" podUID="69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7"
Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.731521 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status:
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.743792 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.747590 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.747642 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.747656 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.747674 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.747686 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:06Z","lastTransitionTime":"2026-03-18T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.773056 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.791828 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status:
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.806479 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.827088 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.843257 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.850956 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.851018 4778 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.851036 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.851059 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.851076 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:06Z","lastTransitionTime":"2026-03-18T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.865264 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.884639 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.895615 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.906484 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.918736 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.930313 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.951763 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.953845 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.953922 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.953937 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.953962 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:06 crc kubenswrapper[4778]: I0318 09:04:06.953979 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:06Z","lastTransitionTime":"2026-03-18T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.056579 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.056620 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.056635 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.056652 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.056681 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:07Z","lastTransitionTime":"2026-03-18T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.159652 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.159720 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.159739 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.159766 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.159784 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:07Z","lastTransitionTime":"2026-03-18T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:07 crc kubenswrapper[4778]: E0318 09:04:07.188757 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:07 crc kubenswrapper[4778]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:04:07 crc kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then Mar 18 09:04:07 crc kubenswrapper[4778]: set -o allexport Mar 18 09:04:07 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:04:07 crc kubenswrapper[4778]: set +o allexport Mar 18 09:04:07 crc kubenswrapper[4778]: fi Mar 18 09:04:07 crc kubenswrapper[4778]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 18 09:04:07 crc kubenswrapper[4778]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 09:04:07 crc kubenswrapper[4778]: ho_enable="--enable-hybrid-overlay" Mar 18 09:04:07 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 09:04:07 crc kubenswrapper[4778]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 09:04:07 crc kubenswrapper[4778]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 09:04:07 crc kubenswrapper[4778]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 09:04:07 crc kubenswrapper[4778]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 09:04:07 crc kubenswrapper[4778]: --webhook-host=127.0.0.1 \ Mar 18 09:04:07 crc kubenswrapper[4778]: --webhook-port=9743 \ Mar 18 09:04:07 crc kubenswrapper[4778]: ${ho_enable} \ Mar 18 09:04:07 crc kubenswrapper[4778]: --enable-interconnect \ Mar 18 09:04:07 crc kubenswrapper[4778]: --disable-approver \ Mar 18 09:04:07 crc kubenswrapper[4778]: 
--extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 09:04:07 crc kubenswrapper[4778]: --wait-for-kubernetes-api=200s \ Mar 18 09:04:07 crc kubenswrapper[4778]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 09:04:07 crc kubenswrapper[4778]: --loglevel="${LOGLEVEL}" Mar 18 09:04:07 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false
,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:07 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:07 crc kubenswrapper[4778]: E0318 09:04:07.188982 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:04:07 crc kubenswrapper[4778]: E0318 09:04:07.190115 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 09:04:07 crc kubenswrapper[4778]: E0318 09:04:07.191749 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:07 crc kubenswrapper[4778]: container 
&Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:04:07 crc kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then Mar 18 09:04:07 crc kubenswrapper[4778]: set -o allexport Mar 18 09:04:07 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:04:07 crc kubenswrapper[4778]: set +o allexport Mar 18 09:04:07 crc kubenswrapper[4778]: fi Mar 18 09:04:07 crc kubenswrapper[4778]: Mar 18 09:04:07 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 09:04:07 crc kubenswrapper[4778]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 09:04:07 crc kubenswrapper[4778]: --disable-webhook \ Mar 18 09:04:07 crc kubenswrapper[4778]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 09:04:07 crc kubenswrapper[4778]: --loglevel="${LOGLEVEL}" Mar 18 09:04:07 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:07 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:07 crc kubenswrapper[4778]: E0318 09:04:07.193001 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.262158 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.262529 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.262619 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.262717 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.262810 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:07Z","lastTransitionTime":"2026-03-18T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.366686 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.366732 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.366741 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.366758 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.366770 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:07Z","lastTransitionTime":"2026-03-18T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.469280 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.469474 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.469495 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.469518 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.469536 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:07Z","lastTransitionTime":"2026-03-18T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.573088 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.573146 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.573165 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.573190 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.573251 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:07Z","lastTransitionTime":"2026-03-18T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.678065 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.678600 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.678809 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.679044 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.679278 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:07Z","lastTransitionTime":"2026-03-18T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.783685 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.783745 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.783761 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.783787 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.783805 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:07Z","lastTransitionTime":"2026-03-18T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.886267 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.886572 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.886663 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.886757 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.886837 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:07Z","lastTransitionTime":"2026-03-18T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.990251 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.990372 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.990395 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.990424 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:07 crc kubenswrapper[4778]: I0318 09:04:07.990442 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:07Z","lastTransitionTime":"2026-03-18T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.094265 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.094340 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.094364 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.094393 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.094413 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:08Z","lastTransitionTime":"2026-03-18T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.187675 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:08 crc kubenswrapper[4778]: E0318 09:04:08.188508 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.188568 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.188839 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:08 crc kubenswrapper[4778]: E0318 09:04:08.191897 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:08 crc kubenswrapper[4778]: E0318 09:04:08.191999 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.196363 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.196409 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.196423 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.196439 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.196451 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:08Z","lastTransitionTime":"2026-03-18T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.299708 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.300117 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.300188 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.300288 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.300353 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:08Z","lastTransitionTime":"2026-03-18T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.402881 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.402920 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.402928 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.402944 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.402953 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:08Z","lastTransitionTime":"2026-03-18T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.505802 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.505836 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.505843 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.505872 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.505882 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:08Z","lastTransitionTime":"2026-03-18T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.609725 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.609796 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.609818 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.609846 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.609866 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:08Z","lastTransitionTime":"2026-03-18T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.713659 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.713734 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.713758 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.713788 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.713816 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:08Z","lastTransitionTime":"2026-03-18T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.817133 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.817276 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.817296 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.817325 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.817345 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:08Z","lastTransitionTime":"2026-03-18T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.920920 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.921011 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.921045 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.921086 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:08 crc kubenswrapper[4778]: I0318 09:04:08.921114 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:08Z","lastTransitionTime":"2026-03-18T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.024660 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.024725 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.024743 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.024769 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.024786 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:09Z","lastTransitionTime":"2026-03-18T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.127778 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.128544 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.128622 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.128789 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.129014 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:09Z","lastTransitionTime":"2026-03-18T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:09 crc kubenswrapper[4778]: E0318 09:04:09.190660 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:09 crc kubenswrapper[4778]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 09:04:09 crc kubenswrapper[4778]: set -o allexport Mar 18 09:04:09 crc kubenswrapper[4778]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 09:04:09 crc kubenswrapper[4778]: source /etc/kubernetes/apiserver-url.env Mar 18 09:04:09 crc kubenswrapper[4778]: else Mar 18 09:04:09 crc kubenswrapper[4778]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 09:04:09 crc kubenswrapper[4778]: exit 1 Mar 18 09:04:09 crc kubenswrapper[4778]: fi Mar 18 09:04:09 crc kubenswrapper[4778]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 09:04:09 crc kubenswrapper[4778]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:09 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:09 crc kubenswrapper[4778]: E0318 09:04:09.191946 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.233037 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 
09:04:09.233107 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.233124 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.233152 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.233175 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:09Z","lastTransitionTime":"2026-03-18T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.335852 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.335907 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.335919 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.335938 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.335951 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:09Z","lastTransitionTime":"2026-03-18T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.439547 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.439631 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.439648 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.439672 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.439690 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:09Z","lastTransitionTime":"2026-03-18T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.543069 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.543138 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.543167 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.543232 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.543259 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:09Z","lastTransitionTime":"2026-03-18T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.647325 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.647416 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.647440 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.647468 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.647489 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:09Z","lastTransitionTime":"2026-03-18T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.750818 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.750891 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.750908 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.750931 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.750948 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:09Z","lastTransitionTime":"2026-03-18T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.854168 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.854241 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.854254 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.854274 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.854291 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:09Z","lastTransitionTime":"2026-03-18T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.957856 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.957921 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.958066 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.958127 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:09 crc kubenswrapper[4778]: I0318 09:04:09.958149 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:09Z","lastTransitionTime":"2026-03-18T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.061906 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.061948 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.061959 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.061978 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.061990 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:10Z","lastTransitionTime":"2026-03-18T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.165688 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.165753 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.165771 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.165798 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.165817 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:10Z","lastTransitionTime":"2026-03-18T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.186756 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.187027 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.187052 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:10 crc kubenswrapper[4778]: E0318 09:04:10.187140 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:10 crc kubenswrapper[4778]: E0318 09:04:10.187309 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:10 crc kubenswrapper[4778]: E0318 09:04:10.186964 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.269060 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.269135 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.269154 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.269179 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.269241 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:10Z","lastTransitionTime":"2026-03-18T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.373046 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.373232 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.373261 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.373295 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.373318 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:10Z","lastTransitionTime":"2026-03-18T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.477351 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.477403 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.477421 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.477446 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.477464 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:10Z","lastTransitionTime":"2026-03-18T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.581286 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.581344 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.581361 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.581390 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.581413 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:10Z","lastTransitionTime":"2026-03-18T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.685249 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.685311 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.685331 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.685358 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.685381 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:10Z","lastTransitionTime":"2026-03-18T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.788907 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.788972 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.788990 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.789013 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.789033 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:10Z","lastTransitionTime":"2026-03-18T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.892336 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.892409 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.892445 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.892489 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.892525 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:10Z","lastTransitionTime":"2026-03-18T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.996191 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.996286 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.996310 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.996340 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:10 crc kubenswrapper[4778]: I0318 09:04:10.996359 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:10Z","lastTransitionTime":"2026-03-18T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.099655 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.099718 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.099735 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.099759 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.099775 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:11Z","lastTransitionTime":"2026-03-18T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.197060 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.202814 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.202963 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.203030 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.203066 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.203124 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:11Z","lastTransitionTime":"2026-03-18T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.306440 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.306488 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.306502 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.306528 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.306545 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:11Z","lastTransitionTime":"2026-03-18T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.410276 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.410383 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.410452 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.410486 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.410554 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:11Z","lastTransitionTime":"2026-03-18T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.513618 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.513666 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.513685 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.513707 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.513724 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:11Z","lastTransitionTime":"2026-03-18T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.615804 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.615867 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.615881 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.615911 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.615926 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:11Z","lastTransitionTime":"2026-03-18T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.718681 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.719032 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.719148 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.719314 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.719413 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:11Z","lastTransitionTime":"2026-03-18T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.822679 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.822734 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.822751 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.822774 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.822795 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:11Z","lastTransitionTime":"2026-03-18T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.926162 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.926232 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.926244 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.926260 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.926272 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:11Z","lastTransitionTime":"2026-03-18T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.997816 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f"] Mar 18 09:04:11 crc kubenswrapper[4778]: I0318 09:04:11.998841 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.001846 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.003735 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.018697 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.029761 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.035827 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.035890 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.035922 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.035970 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:12Z","lastTransitionTime":"2026-03-18T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.038733 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.071015 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.075813 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6q8m\" (UniqueName: \"kubernetes.io/projected/19777429-4133-4e70-b2dd-c61c54abdec4-kube-api-access-b6q8m\") pod \"ovnkube-control-plane-749d76644c-7262f\" (UID: \"19777429-4133-4e70-b2dd-c61c54abdec4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.075890 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/19777429-4133-4e70-b2dd-c61c54abdec4-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7262f\" (UID: \"19777429-4133-4e70-b2dd-c61c54abdec4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.076041 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/19777429-4133-4e70-b2dd-c61c54abdec4-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7262f\" (UID: \"19777429-4133-4e70-b2dd-c61c54abdec4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.076271 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/19777429-4133-4e70-b2dd-c61c54abdec4-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7262f\" (UID: 
\"19777429-4133-4e70-b2dd-c61c54abdec4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.086383 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.113313 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.130135 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.140093 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.140343 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.140547 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.140786 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.141023 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:12Z","lastTransitionTime":"2026-03-18T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.143490 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.164337 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.176790 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/19777429-4133-4e70-b2dd-c61c54abdec4-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7262f\" (UID: \"19777429-4133-4e70-b2dd-c61c54abdec4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.177144 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6q8m\" (UniqueName: 
\"kubernetes.io/projected/19777429-4133-4e70-b2dd-c61c54abdec4-kube-api-access-b6q8m\") pod \"ovnkube-control-plane-749d76644c-7262f\" (UID: \"19777429-4133-4e70-b2dd-c61c54abdec4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.177648 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/19777429-4133-4e70-b2dd-c61c54abdec4-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7262f\" (UID: \"19777429-4133-4e70-b2dd-c61c54abdec4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.178358 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/19777429-4133-4e70-b2dd-c61c54abdec4-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7262f\" (UID: \"19777429-4133-4e70-b2dd-c61c54abdec4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.176828 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.177845 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/19777429-4133-4e70-b2dd-c61c54abdec4-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7262f\" (UID: \"19777429-4133-4e70-b2dd-c61c54abdec4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.178733 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/19777429-4133-4e70-b2dd-c61c54abdec4-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7262f\" (UID: \"19777429-4133-4e70-b2dd-c61c54abdec4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.184364 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/19777429-4133-4e70-b2dd-c61c54abdec4-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7262f\" (UID: \"19777429-4133-4e70-b2dd-c61c54abdec4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.186489 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:12 crc kubenswrapper[4778]: E0318 09:04:12.186692 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.187022 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:12 crc kubenswrapper[4778]: E0318 09:04:12.187128 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.187394 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:12 crc kubenswrapper[4778]: E0318 09:04:12.187493 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.195977 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.198733 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6q8m\" (UniqueName: \"kubernetes.io/projected/19777429-4133-4e70-b2dd-c61c54abdec4-kube-api-access-b6q8m\") pod \"ovnkube-control-plane-749d76644c-7262f\" (UID: \"19777429-4133-4e70-b2dd-c61c54abdec4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.208835 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.225817 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.241422 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.243584 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.243650 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.243674 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.243704 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.243725 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:12Z","lastTransitionTime":"2026-03-18T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.255984 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.273434 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.285702 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.320249 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" Mar 18 09:04:12 crc kubenswrapper[4778]: E0318 09:04:12.342437 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:12 crc kubenswrapper[4778]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 18 09:04:12 crc kubenswrapper[4778]: set -euo pipefail Mar 18 09:04:12 crc kubenswrapper[4778]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 18 09:04:12 crc kubenswrapper[4778]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 18 09:04:12 crc kubenswrapper[4778]: # As the secret mount is optional we must wait for the files to be present. Mar 18 09:04:12 crc kubenswrapper[4778]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 18 09:04:12 crc kubenswrapper[4778]: TS=$(date +%s) Mar 18 09:04:12 crc kubenswrapper[4778]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 18 09:04:12 crc kubenswrapper[4778]: HAS_LOGGED_INFO=0 Mar 18 09:04:12 crc kubenswrapper[4778]: Mar 18 09:04:12 crc kubenswrapper[4778]: log_missing_certs(){ Mar 18 09:04:12 crc kubenswrapper[4778]: CUR_TS=$(date +%s) Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 18 09:04:12 crc kubenswrapper[4778]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 18 09:04:12 crc kubenswrapper[4778]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 18 09:04:12 crc kubenswrapper[4778]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 18 09:04:12 crc kubenswrapper[4778]: HAS_LOGGED_INFO=1 Mar 18 09:04:12 crc kubenswrapper[4778]: fi Mar 18 09:04:12 crc kubenswrapper[4778]: } Mar 18 09:04:12 crc kubenswrapper[4778]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 18 09:04:12 crc kubenswrapper[4778]: log_missing_certs Mar 18 09:04:12 crc kubenswrapper[4778]: sleep 5 Mar 18 09:04:12 crc kubenswrapper[4778]: done Mar 18 09:04:12 crc kubenswrapper[4778]: Mar 18 09:04:12 crc kubenswrapper[4778]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 18 09:04:12 crc kubenswrapper[4778]: exec /usr/bin/kube-rbac-proxy \ Mar 18 09:04:12 crc kubenswrapper[4778]: --logtostderr \ Mar 18 09:04:12 crc kubenswrapper[4778]: --secure-listen-address=:9108 \ Mar 18 09:04:12 crc kubenswrapper[4778]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 18 09:04:12 crc kubenswrapper[4778]: --upstream=http://127.0.0.1:29108/ \ Mar 18 09:04:12 crc kubenswrapper[4778]: --tls-private-key-file=${TLS_PK} \ Mar 18 09:04:12 crc kubenswrapper[4778]: --tls-cert-file=${TLS_CERT} Mar 18 09:04:12 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b6q8m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-7262f_openshift-ovn-kubernetes(19777429-4133-4e70-b2dd-c61c54abdec4): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:12 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.351705 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.351769 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.351791 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.351820 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.351843 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:12Z","lastTransitionTime":"2026-03-18T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:12 crc kubenswrapper[4778]: E0318 09:04:12.361889 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:12 crc kubenswrapper[4778]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then Mar 18 09:04:12 crc kubenswrapper[4778]: set -o allexport Mar 18 09:04:12 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:04:12 crc kubenswrapper[4778]: set +o allexport Mar 18 09:04:12 crc kubenswrapper[4778]: fi Mar 18 09:04:12 crc kubenswrapper[4778]: Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v4_join_subnet_opt= Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "" != "" ]]; then Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 18 09:04:12 crc kubenswrapper[4778]: fi Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v6_join_subnet_opt= Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "" != "" ]]; then Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 18 09:04:12 crc kubenswrapper[4778]: fi Mar 18 09:04:12 crc kubenswrapper[4778]: Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v4_transit_switch_subnet_opt= Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "" != "" ]]; then Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 18 09:04:12 crc kubenswrapper[4778]: fi Mar 18 09:04:12 crc 
kubenswrapper[4778]: ovn_v6_transit_switch_subnet_opt= Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "" != "" ]]; then Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 18 09:04:12 crc kubenswrapper[4778]: fi Mar 18 09:04:12 crc kubenswrapper[4778]: Mar 18 09:04:12 crc kubenswrapper[4778]: dns_name_resolver_enabled_flag= Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "false" == "true" ]]; then Mar 18 09:04:12 crc kubenswrapper[4778]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 18 09:04:12 crc kubenswrapper[4778]: fi Mar 18 09:04:12 crc kubenswrapper[4778]: Mar 18 09:04:12 crc kubenswrapper[4778]: persistent_ips_enabled_flag= Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "true" == "true" ]]; then Mar 18 09:04:12 crc kubenswrapper[4778]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 18 09:04:12 crc kubenswrapper[4778]: fi Mar 18 09:04:12 crc kubenswrapper[4778]: Mar 18 09:04:12 crc kubenswrapper[4778]: # This is needed so that converting clusters from GA to TP Mar 18 09:04:12 crc kubenswrapper[4778]: # will rollout control plane pods as well Mar 18 09:04:12 crc kubenswrapper[4778]: network_segmentation_enabled_flag= Mar 18 09:04:12 crc kubenswrapper[4778]: multi_network_enabled_flag= Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "true" == "true" ]]; then Mar 18 09:04:12 crc kubenswrapper[4778]: multi_network_enabled_flag="--enable-multi-network" Mar 18 09:04:12 crc kubenswrapper[4778]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 18 09:04:12 crc kubenswrapper[4778]: fi Mar 18 09:04:12 crc kubenswrapper[4778]: Mar 18 09:04:12 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 18 09:04:12 crc kubenswrapper[4778]: exec /usr/bin/ovnkube \ Mar 18 09:04:12 crc kubenswrapper[4778]: --enable-interconnect \ Mar 18 09:04:12 crc kubenswrapper[4778]: 
--init-cluster-manager "${K8S_NODE}" \ Mar 18 09:04:12 crc kubenswrapper[4778]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 18 09:04:12 crc kubenswrapper[4778]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 18 09:04:12 crc kubenswrapper[4778]: --metrics-bind-address "127.0.0.1:29108" \ Mar 18 09:04:12 crc kubenswrapper[4778]: --metrics-enable-pprof \ Mar 18 09:04:12 crc kubenswrapper[4778]: --metrics-enable-config-duration \ Mar 18 09:04:12 crc kubenswrapper[4778]: ${ovn_v4_join_subnet_opt} \ Mar 18 09:04:12 crc kubenswrapper[4778]: ${ovn_v6_join_subnet_opt} \ Mar 18 09:04:12 crc kubenswrapper[4778]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 18 09:04:12 crc kubenswrapper[4778]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 18 09:04:12 crc kubenswrapper[4778]: ${dns_name_resolver_enabled_flag} \ Mar 18 09:04:12 crc kubenswrapper[4778]: ${persistent_ips_enabled_flag} \ Mar 18 09:04:12 crc kubenswrapper[4778]: ${multi_network_enabled_flag} \ Mar 18 09:04:12 crc kubenswrapper[4778]: ${network_segmentation_enabled_flag} Mar 18 09:04:12 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b6q8m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-7262f_openshift-ovn-kubernetes(19777429-4133-4e70-b2dd-c61c54abdec4): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:12 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:12 crc kubenswrapper[4778]: E0318 09:04:12.362964 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" podUID="19777429-4133-4e70-b2dd-c61c54abdec4" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.459041 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.459089 4778 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.459106 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.459129 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.459145 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:12Z","lastTransitionTime":"2026-03-18T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.562127 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.562185 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.562222 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.562244 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.562258 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:12Z","lastTransitionTime":"2026-03-18T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.664446 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.664538 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.664549 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.664565 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.664577 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:12Z","lastTransitionTime":"2026-03-18T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.706634 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-9bc7s"] Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.707110 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:12 crc kubenswrapper[4778]: E0318 09:04:12.707180 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.729689 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" event={"ID":"19777429-4133-4e70-b2dd-c61c54abdec4","Type":"ContainerStarted","Data":"b77ec21dbeb1ef96b33093af249fe16903ad69bddc5ec9e4bf3972b97e6e679a"} Mar 18 09:04:12 crc kubenswrapper[4778]: E0318 09:04:12.731618 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:12 crc kubenswrapper[4778]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 18 09:04:12 crc kubenswrapper[4778]: set -euo pipefail Mar 18 09:04:12 crc kubenswrapper[4778]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 18 09:04:12 crc kubenswrapper[4778]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 18 09:04:12 crc kubenswrapper[4778]: # As the secret mount is optional we must wait for the files to be present. Mar 18 09:04:12 crc kubenswrapper[4778]: # The service is created in monitor.yaml and this is created in sdn.yaml. 
Mar 18 09:04:12 crc kubenswrapper[4778]: TS=$(date +%s) Mar 18 09:04:12 crc kubenswrapper[4778]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 18 09:04:12 crc kubenswrapper[4778]: HAS_LOGGED_INFO=0 Mar 18 09:04:12 crc kubenswrapper[4778]: Mar 18 09:04:12 crc kubenswrapper[4778]: log_missing_certs(){ Mar 18 09:04:12 crc kubenswrapper[4778]: CUR_TS=$(date +%s) Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 18 09:04:12 crc kubenswrapper[4778]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 18 09:04:12 crc kubenswrapper[4778]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 18 09:04:12 crc kubenswrapper[4778]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 18 09:04:12 crc kubenswrapper[4778]: HAS_LOGGED_INFO=1 Mar 18 09:04:12 crc kubenswrapper[4778]: fi Mar 18 09:04:12 crc kubenswrapper[4778]: } Mar 18 09:04:12 crc kubenswrapper[4778]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 18 09:04:12 crc kubenswrapper[4778]: log_missing_certs Mar 18 09:04:12 crc kubenswrapper[4778]: sleep 5 Mar 18 09:04:12 crc kubenswrapper[4778]: done Mar 18 09:04:12 crc kubenswrapper[4778]: Mar 18 09:04:12 crc kubenswrapper[4778]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 18 09:04:12 crc kubenswrapper[4778]: exec /usr/bin/kube-rbac-proxy \ Mar 18 09:04:12 crc kubenswrapper[4778]: --logtostderr \ Mar 18 09:04:12 crc kubenswrapper[4778]: --secure-listen-address=:9108 \ Mar 18 09:04:12 crc kubenswrapper[4778]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 18 09:04:12 crc kubenswrapper[4778]: --upstream=http://127.0.0.1:29108/ \ Mar 18 09:04:12 crc kubenswrapper[4778]: --tls-private-key-file=${TLS_PK} \ Mar 18 09:04:12 crc kubenswrapper[4778]: --tls-cert-file=${TLS_CERT} Mar 18 09:04:12 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b6q8m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-7262f_openshift-ovn-kubernetes(19777429-4133-4e70-b2dd-c61c54abdec4): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:12 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:12 crc kubenswrapper[4778]: E0318 09:04:12.734053 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:12 crc kubenswrapper[4778]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then Mar 18 09:04:12 crc kubenswrapper[4778]: set -o allexport Mar 18 09:04:12 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:04:12 crc kubenswrapper[4778]: set +o allexport Mar 18 09:04:12 crc kubenswrapper[4778]: fi Mar 18 09:04:12 crc kubenswrapper[4778]: Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v4_join_subnet_opt= Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "" != "" ]]; then Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 18 
09:04:12 crc kubenswrapper[4778]: fi Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v6_join_subnet_opt= Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "" != "" ]]; then Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 18 09:04:12 crc kubenswrapper[4778]: fi Mar 18 09:04:12 crc kubenswrapper[4778]: Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v4_transit_switch_subnet_opt= Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "" != "" ]]; then Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 18 09:04:12 crc kubenswrapper[4778]: fi Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v6_transit_switch_subnet_opt= Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "" != "" ]]; then Mar 18 09:04:12 crc kubenswrapper[4778]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 18 09:04:12 crc kubenswrapper[4778]: fi Mar 18 09:04:12 crc kubenswrapper[4778]: Mar 18 09:04:12 crc kubenswrapper[4778]: dns_name_resolver_enabled_flag= Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "false" == "true" ]]; then Mar 18 09:04:12 crc kubenswrapper[4778]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 18 09:04:12 crc kubenswrapper[4778]: fi Mar 18 09:04:12 crc kubenswrapper[4778]: Mar 18 09:04:12 crc kubenswrapper[4778]: persistent_ips_enabled_flag= Mar 18 09:04:12 crc kubenswrapper[4778]: if [[ "true" == "true" ]]; then Mar 18 09:04:12 crc kubenswrapper[4778]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 18 09:04:12 crc kubenswrapper[4778]: fi Mar 18 09:04:12 crc kubenswrapper[4778]: Mar 18 09:04:12 crc kubenswrapper[4778]: # This is needed so that converting clusters from GA to TP Mar 18 09:04:12 crc kubenswrapper[4778]: # will rollout control plane pods as well Mar 18 09:04:12 crc kubenswrapper[4778]: network_segmentation_enabled_flag= Mar 18 09:04:12 crc kubenswrapper[4778]: multi_network_enabled_flag= Mar 18 09:04:12 crc 
kubenswrapper[4778]: if [[ "true" == "true" ]]; then Mar 18 09:04:12 crc kubenswrapper[4778]: multi_network_enabled_flag="--enable-multi-network" Mar 18 09:04:12 crc kubenswrapper[4778]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 18 09:04:12 crc kubenswrapper[4778]: fi Mar 18 09:04:12 crc kubenswrapper[4778]: Mar 18 09:04:12 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 18 09:04:12 crc kubenswrapper[4778]: exec /usr/bin/ovnkube \ Mar 18 09:04:12 crc kubenswrapper[4778]: --enable-interconnect \ Mar 18 09:04:12 crc kubenswrapper[4778]: --init-cluster-manager "${K8S_NODE}" \ Mar 18 09:04:12 crc kubenswrapper[4778]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 18 09:04:12 crc kubenswrapper[4778]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 18 09:04:12 crc kubenswrapper[4778]: --metrics-bind-address "127.0.0.1:29108" \ Mar 18 09:04:12 crc kubenswrapper[4778]: --metrics-enable-pprof \ Mar 18 09:04:12 crc kubenswrapper[4778]: --metrics-enable-config-duration \ Mar 18 09:04:12 crc kubenswrapper[4778]: ${ovn_v4_join_subnet_opt} \ Mar 18 09:04:12 crc kubenswrapper[4778]: ${ovn_v6_join_subnet_opt} \ Mar 18 09:04:12 crc kubenswrapper[4778]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 18 09:04:12 crc kubenswrapper[4778]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 18 09:04:12 crc kubenswrapper[4778]: ${dns_name_resolver_enabled_flag} \ Mar 18 09:04:12 crc kubenswrapper[4778]: ${persistent_ips_enabled_flag} \ Mar 18 09:04:12 crc kubenswrapper[4778]: ${multi_network_enabled_flag} \ Mar 18 09:04:12 crc kubenswrapper[4778]: ${network_segmentation_enabled_flag} Mar 18 09:04:12 crc kubenswrapper[4778]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b6q8m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-7262f_openshift-ovn-kubernetes(19777429-4133-4e70-b2dd-c61c54abdec4): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:12 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.734690 4778 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9
f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2
dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: E0318 09:04:12.735683 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" podUID="19777429-4133-4e70-b2dd-c61c54abdec4" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.748331 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.766034 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.767142 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.767207 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.767218 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.767234 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.767245 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:12Z","lastTransitionTime":"2026-03-18T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.783653 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.783705 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d88r\" (UniqueName: \"kubernetes.io/projected/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-kube-api-access-9d88r\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.788888 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.801266 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.812691 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.824648 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.834225 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.848399 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.859141 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.869885 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.870608 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.870663 4778 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.870687 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.870717 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.870735 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:12Z","lastTransitionTime":"2026-03-18T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.877955 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.884982 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 
09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.885075 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d88r\" (UniqueName: \"kubernetes.io/projected/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-kube-api-access-9d88r\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:12 crc kubenswrapper[4778]: E0318 09:04:12.885397 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:12 crc kubenswrapper[4778]: E0318 09:04:12.885516 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs podName:a2d5c312-2314-46d7-8ba2-64b621b0c2c7 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:13.385490789 +0000 UTC m=+119.960235619 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs") pod "network-metrics-daemon-9bc7s" (UID: "a2d5c312-2314-46d7-8ba2-64b621b0c2c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.889146 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.901270 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.907079 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d88r\" (UniqueName: \"kubernetes.io/projected/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-kube-api-access-9d88r\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.920386 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.938290 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.966743 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.974163 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.974238 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.974252 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.974272 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.974286 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:12Z","lastTransitionTime":"2026-03-18T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:12 crc kubenswrapper[4778]: I0318 09:04:12.983925 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.004545 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.023319 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.050796 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.077859 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.078123 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.078265 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.078394 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.078502 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.080723 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.092851 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.104414 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.125503 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.139897 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.155710 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.174501 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.181614 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.181683 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.181703 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.181761 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.181781 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:04:13 crc kubenswrapper[4778]: E0318 09:04:13.189008 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 18 09:04:13 crc kubenswrapper[4778]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash
Mar 18 09:04:13 crc kubenswrapper[4778]: set -uo pipefail
Mar 18 09:04:13 crc kubenswrapper[4778]:
Mar 18 09:04:13 crc kubenswrapper[4778]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM
Mar 18 09:04:13 crc kubenswrapper[4778]:
Mar 18 09:04:13 crc kubenswrapper[4778]: OPENSHIFT_MARKER="openshift-generated-node-resolver"
Mar 18 09:04:13 crc kubenswrapper[4778]: HOSTS_FILE="/etc/hosts"
Mar 18 09:04:13 crc kubenswrapper[4778]: TEMP_FILE="/etc/hosts.tmp"
Mar 18 09:04:13 crc kubenswrapper[4778]:
Mar 18 09:04:13 crc kubenswrapper[4778]: IFS=', ' read -r -a services <<< "${SERVICES}"
Mar 18 09:04:13 crc kubenswrapper[4778]:
Mar 18 09:04:13 crc kubenswrapper[4778]: # Make a temporary file with the old hosts file's attributes.
Mar 18 09:04:13 crc kubenswrapper[4778]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then
Mar 18 09:04:13 crc kubenswrapper[4778]: echo "Failed to preserve hosts file. Exiting."
Mar 18 09:04:13 crc kubenswrapper[4778]: exit 1
Mar 18 09:04:13 crc kubenswrapper[4778]: fi
Mar 18 09:04:13 crc kubenswrapper[4778]:
Mar 18 09:04:13 crc kubenswrapper[4778]: while true; do
Mar 18 09:04:13 crc kubenswrapper[4778]: declare -A svc_ips
Mar 18 09:04:13 crc kubenswrapper[4778]: for svc in "${services[@]}"; do
Mar 18 09:04:13 crc kubenswrapper[4778]: # Fetch service IP from cluster dns if present. We make several tries
Mar 18 09:04:13 crc kubenswrapper[4778]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones
Mar 18 09:04:13 crc kubenswrapper[4778]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not
Mar 18 09:04:13 crc kubenswrapper[4778]: # support UDP loadbalancers and require reaching DNS through TCP.
Mar 18 09:04:13 crc kubenswrapper[4778]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"'
Mar 18 09:04:13 crc kubenswrapper[4778]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"'
Mar 18 09:04:13 crc kubenswrapper[4778]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"'
Mar 18 09:04:13 crc kubenswrapper[4778]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"')
Mar 18 09:04:13 crc kubenswrapper[4778]: for i in ${!cmds[*]}
Mar 18 09:04:13 crc kubenswrapper[4778]: do
Mar 18 09:04:13 crc kubenswrapper[4778]: ips=($(eval "${cmds[i]}"))
Mar 18 09:04:13 crc kubenswrapper[4778]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then
Mar 18 09:04:13 crc kubenswrapper[4778]: svc_ips["${svc}"]="${ips[@]}"
Mar 18 09:04:13 crc kubenswrapper[4778]: break
Mar 18 09:04:13 crc kubenswrapper[4778]: fi
Mar 18 09:04:13 crc kubenswrapper[4778]: done
Mar 18 09:04:13 crc kubenswrapper[4778]: done
Mar 18 09:04:13 crc kubenswrapper[4778]:
Mar 18 09:04:13 crc kubenswrapper[4778]: # Update /etc/hosts only if we get valid service IPs
Mar 18 09:04:13 crc kubenswrapper[4778]: # We will not update /etc/hosts when there is coredns service outage or api unavailability
Mar 18 09:04:13 crc kubenswrapper[4778]: # Stale entries could exist in /etc/hosts if the service is deleted
Mar 18 09:04:13 crc kubenswrapper[4778]: if [[ -n "${svc_ips[*]-}" ]]; then
Mar 18 09:04:13 crc kubenswrapper[4778]: # Build a new hosts file from /etc/hosts with our custom entries filtered out
Mar 18 09:04:13 crc kubenswrapper[4778]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then
Mar 18 09:04:13 crc kubenswrapper[4778]: # Only continue rebuilding the hosts entries if its original content is preserved
Mar 18 09:04:13 crc kubenswrapper[4778]: sleep 60 & wait
Mar 18 09:04:13 crc kubenswrapper[4778]: continue
Mar 18 09:04:13 crc kubenswrapper[4778]: fi
Mar 18 09:04:13 crc kubenswrapper[4778]:
Mar 18 09:04:13 crc kubenswrapper[4778]: # Append resolver entries for services
Mar 18 09:04:13 crc kubenswrapper[4778]: rc=0
Mar 18 09:04:13 crc kubenswrapper[4778]: for svc in "${!svc_ips[@]}"; do
Mar 18 09:04:13 crc kubenswrapper[4778]: for ip in ${svc_ips[${svc}]}; do
Mar 18 09:04:13 crc kubenswrapper[4778]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$?
Mar 18 09:04:13 crc kubenswrapper[4778]: done
Mar 18 09:04:13 crc kubenswrapper[4778]: done
Mar 18 09:04:13 crc kubenswrapper[4778]: if [[ $rc -ne 0 ]]; then
Mar 18 09:04:13 crc kubenswrapper[4778]: sleep 60 & wait
Mar 18 09:04:13 crc kubenswrapper[4778]: continue
Mar 18 09:04:13 crc kubenswrapper[4778]: fi
Mar 18 09:04:13 crc kubenswrapper[4778]:
Mar 18 09:04:13 crc kubenswrapper[4778]:
Mar 18 09:04:13 crc kubenswrapper[4778]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior
Mar 18 09:04:13 crc kubenswrapper[4778]: # Replace /etc/hosts with our modified version if needed
Mar 18 09:04:13 crc kubenswrapper[4778]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}"
Mar 18 09:04:13 crc kubenswrapper[4778]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn
Mar 18 09:04:13 crc kubenswrapper[4778]: fi
Mar 18 09:04:13 crc kubenswrapper[4778]: sleep 60 & wait
Mar 18 09:04:13 crc kubenswrapper[4778]: unset svc_ips
Mar 18 09:04:13 crc kubenswrapper[4778]: done
Mar 18 09:04:13 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f9b2b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-dfnnp_openshift-dns(8cf64307-e191-476a-902b-93001adc0b16): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:13 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:13 crc kubenswrapper[4778]: E0318 09:04:13.189135 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:13 crc kubenswrapper[4778]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 18 09:04:13 crc kubenswrapper[4778]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 18 09:04:13 crc kubenswrapper[4778]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x2zsl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-r2lvf_openshift-multus(dce973f3-25e6-4536-87cc-9b46499ad7cf): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:13 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:13 crc kubenswrapper[4778]: E0318 09:04:13.190699 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-dfnnp" podUID="8cf64307-e191-476a-902b-93001adc0b16" Mar 18 09:04:13 crc kubenswrapper[4778]: E0318 09:04:13.190779 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-r2lvf" podUID="dce973f3-25e6-4536-87cc-9b46499ad7cf" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.193014 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.208155 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.227472 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.241418 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.255977 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.269248 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.284934 4778 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.284987 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.285001 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.285021 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.285033 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.389288 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.389364 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.389381 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.389408 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.389431 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.392034 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:13 crc kubenswrapper[4778]: E0318 09:04:13.392436 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:13 crc kubenswrapper[4778]: E0318 09:04:13.392823 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs podName:a2d5c312-2314-46d7-8ba2-64b621b0c2c7 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:14.392793518 +0000 UTC m=+120.967538398 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs") pod "network-metrics-daemon-9bc7s" (UID: "a2d5c312-2314-46d7-8ba2-64b621b0c2c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.493084 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.493135 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.493153 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.493177 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.493237 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.536570 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.536636 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.536656 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.536686 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.536705 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:13 crc kubenswrapper[4778]: E0318 09:04:13.554015 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.559552 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.559626 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.559646 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.559673 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.559692 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:13 crc kubenswrapper[4778]: E0318 09:04:13.578075 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.583667 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.583889 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.584024 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.584286 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.584533 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:13 crc kubenswrapper[4778]: E0318 09:04:13.601615 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.607914 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.607997 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.608017 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.608440 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.608473 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:13 crc kubenswrapper[4778]: E0318 09:04:13.624006 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.629018 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.629069 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.629087 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.629110 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.629127 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:13 crc kubenswrapper[4778]: E0318 09:04:13.646082 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 09:04:13 crc kubenswrapper[4778]: E0318 09:04:13.646364 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.648820 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.648900 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.648924 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.648956 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.648982 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.752613 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.752678 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.752690 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.752715 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.752735 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.855056 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.855115 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.855133 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.855155 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.855171 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.958563 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.958604 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.958614 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.958628 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:13 crc kubenswrapper[4778]: I0318 09:04:13.958637 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:13Z","lastTransitionTime":"2026-03-18T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.061963 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.062029 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.062045 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.062104 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.062122 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:14Z","lastTransitionTime":"2026-03-18T09:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 09:04:14 crc kubenswrapper[4778]: E0318 09:04:14.162708 4778 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.186701 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s"
Mar 18 09:04:14 crc kubenswrapper[4778]: E0318 09:04:14.186888 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.187079 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:14 crc kubenswrapper[4778]: E0318 09:04:14.187281 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.187393 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:14 crc kubenswrapper[4778]: E0318 09:04:14.187632 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.187758 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:14 crc kubenswrapper[4778]: E0318 09:04:14.188227 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:14 crc kubenswrapper[4778]: E0318 09:04:14.190634 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:14 crc kubenswrapper[4778]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 18 09:04:14 crc kubenswrapper[4778]: apiVersion: v1 Mar 18 09:04:14 crc kubenswrapper[4778]: clusters: Mar 18 09:04:14 crc kubenswrapper[4778]: - cluster: Mar 18 09:04:14 crc kubenswrapper[4778]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 18 09:04:14 crc kubenswrapper[4778]: server: https://api-int.crc.testing:6443 Mar 18 09:04:14 crc kubenswrapper[4778]: name: default-cluster Mar 18 09:04:14 crc kubenswrapper[4778]: contexts: Mar 18 09:04:14 crc kubenswrapper[4778]: - context: Mar 18 09:04:14 crc kubenswrapper[4778]: cluster: default-cluster Mar 18 09:04:14 crc kubenswrapper[4778]: namespace: default Mar 18 09:04:14 crc kubenswrapper[4778]: user: default-auth Mar 18 09:04:14 crc kubenswrapper[4778]: name: default-context Mar 18 09:04:14 crc kubenswrapper[4778]: current-context: default-context Mar 18 09:04:14 crc kubenswrapper[4778]: kind: Config Mar 18 09:04:14 crc kubenswrapper[4778]: preferences: {} Mar 18 09:04:14 crc kubenswrapper[4778]: users: Mar 18 09:04:14 crc kubenswrapper[4778]: - 
name: default-auth Mar 18 09:04:14 crc kubenswrapper[4778]: user: Mar 18 09:04:14 crc kubenswrapper[4778]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 09:04:14 crc kubenswrapper[4778]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 09:04:14 crc kubenswrapper[4778]: EOF Mar 18 09:04:14 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b8g6f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:14 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:14 crc kubenswrapper[4778]: E0318 09:04:14.191738 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.204473 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.219728 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.233396 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.259040 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.277389 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: E0318 09:04:14.283969 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.285885 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.294693 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.313752 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.327789 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.338903 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.352866 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.365257 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.380822 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.393782 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.402818 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.404165 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:14 crc kubenswrapper[4778]: E0318 09:04:14.404307 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:14 crc kubenswrapper[4778]: E0318 09:04:14.404370 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs podName:a2d5c312-2314-46d7-8ba2-64b621b0c2c7 nodeName:}" failed. 
No retries permitted until 2026-03-18 09:04:16.404352053 +0000 UTC m=+122.979096893 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs") pod "network-metrics-daemon-9bc7s" (UID: "a2d5c312-2314-46d7-8ba2-64b621b0c2c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.413382 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:14 crc kubenswrapper[4778]: I0318 09:04:14.423569 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:15 crc kubenswrapper[4778]: E0318 09:04:15.189220 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init 
container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4cxhf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-xkfx8_openshift-multus(b1698c21-24a7-4338-a0ad-dd110c1ba2f2): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:04:15 crc kubenswrapper[4778]: E0318 09:04:15.189795 4778 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gjjcr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:04:15 crc kubenswrapper[4778]: E0318 09:04:15.190806 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" podUID="b1698c21-24a7-4338-a0ad-dd110c1ba2f2" Mar 18 09:04:15 crc kubenswrapper[4778]: E0318 09:04:15.192731 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml 
--tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gjjcr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:04:15 crc kubenswrapper[4778]: E0318 09:04:15.194017 4778 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:04:16 crc kubenswrapper[4778]: I0318 09:04:16.186321 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:16 crc kubenswrapper[4778]: I0318 09:04:16.186362 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:16 crc kubenswrapper[4778]: E0318 09:04:16.186509 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:16 crc kubenswrapper[4778]: E0318 09:04:16.186891 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:16 crc kubenswrapper[4778]: I0318 09:04:16.187349 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:16 crc kubenswrapper[4778]: I0318 09:04:16.187412 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:16 crc kubenswrapper[4778]: E0318 09:04:16.187773 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:16 crc kubenswrapper[4778]: E0318 09:04:16.187614 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:16 crc kubenswrapper[4778]: I0318 09:04:16.430737 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:16 crc kubenswrapper[4778]: E0318 09:04:16.430931 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:16 crc kubenswrapper[4778]: E0318 09:04:16.431094 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs podName:a2d5c312-2314-46d7-8ba2-64b621b0c2c7 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:20.431052308 +0000 UTC m=+127.005797188 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs") pod "network-metrics-daemon-9bc7s" (UID: "a2d5c312-2314-46d7-8ba2-64b621b0c2c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:17 crc kubenswrapper[4778]: I0318 09:04:17.913570 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:04:17 crc kubenswrapper[4778]: I0318 09:04:17.932959 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:17 crc kubenswrapper[4778]: I0318 09:04:17.948300 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:17 crc kubenswrapper[4778]: I0318 09:04:17.968581 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:17 crc kubenswrapper[4778]: I0318 09:04:17.989275 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.005426 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.026119 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.039497 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.053426 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.066100 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.078854 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.101059 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.121150 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c
57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.134984 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.147123 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.162518 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.173899 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.184929 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.186235 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.186260 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:18 crc kubenswrapper[4778]: E0318 09:04:18.186478 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.186508 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:18 crc kubenswrapper[4778]: E0318 09:04:18.186543 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:18 crc kubenswrapper[4778]: I0318 09:04:18.186278 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:18 crc kubenswrapper[4778]: E0318 09:04:18.186673 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:18 crc kubenswrapper[4778]: E0318 09:04:18.186839 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:19 crc kubenswrapper[4778]: E0318 09:04:19.285815 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:04:20 crc kubenswrapper[4778]: I0318 09:04:20.187000 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:20 crc kubenswrapper[4778]: I0318 09:04:20.187130 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:20 crc kubenswrapper[4778]: I0318 09:04:20.187292 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:20 crc kubenswrapper[4778]: E0318 09:04:20.187350 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:20 crc kubenswrapper[4778]: I0318 09:04:20.187476 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:20 crc kubenswrapper[4778]: E0318 09:04:20.187699 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:20 crc kubenswrapper[4778]: E0318 09:04:20.188056 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:20 crc kubenswrapper[4778]: E0318 09:04:20.188112 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:20 crc kubenswrapper[4778]: E0318 09:04:20.189032 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:20 crc kubenswrapper[4778]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 18 09:04:20 crc kubenswrapper[4778]: while [ true ]; Mar 18 09:04:20 crc kubenswrapper[4778]: do Mar 18 09:04:20 crc kubenswrapper[4778]: for f in $(ls /tmp/serviceca); do Mar 18 09:04:20 crc kubenswrapper[4778]: echo $f Mar 18 09:04:20 crc kubenswrapper[4778]: ca_file_path="/tmp/serviceca/${f}" Mar 18 09:04:20 crc kubenswrapper[4778]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 18 09:04:20 crc kubenswrapper[4778]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 18 09:04:20 crc kubenswrapper[4778]: if [ -e "${reg_dir_path}" ]; then Mar 18 09:04:20 crc kubenswrapper[4778]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 18 09:04:20 crc kubenswrapper[4778]: else Mar 18 09:04:20 crc kubenswrapper[4778]: mkdir $reg_dir_path Mar 18 09:04:20 crc kubenswrapper[4778]: cp $ca_file_path $reg_dir_path/ca.crt Mar 18 09:04:20 crc kubenswrapper[4778]: fi Mar 18 09:04:20 crc kubenswrapper[4778]: done Mar 18 09:04:20 crc kubenswrapper[4778]: for d in $(ls /etc/docker/certs.d); do Mar 18 09:04:20 crc kubenswrapper[4778]: echo $d Mar 18 09:04:20 crc kubenswrapper[4778]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 18 09:04:20 crc kubenswrapper[4778]: reg_conf_path="/tmp/serviceca/${dp}" Mar 18 09:04:20 crc kubenswrapper[4778]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 18 09:04:20 crc kubenswrapper[4778]: rm -rf /etc/docker/certs.d/$d Mar 18 09:04:20 crc kubenswrapper[4778]: fi Mar 18 09:04:20 crc kubenswrapper[4778]: done Mar 18 09:04:20 crc kubenswrapper[4778]: sleep 60 & wait ${!} Mar 18 09:04:20 crc kubenswrapper[4778]: done Mar 18 09:04:20 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-grrhl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-9f2bp_openshift-image-registry(69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:20 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:20 crc kubenswrapper[4778]: E0318 09:04:20.190239 4778 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-9f2bp" podUID="69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7" Mar 18 09:04:20 crc kubenswrapper[4778]: E0318 09:04:20.190524 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:20 crc kubenswrapper[4778]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 09:04:20 crc kubenswrapper[4778]: set -o allexport Mar 18 09:04:20 crc kubenswrapper[4778]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 09:04:20 crc kubenswrapper[4778]: source /etc/kubernetes/apiserver-url.env Mar 18 09:04:20 crc kubenswrapper[4778]: else Mar 18 09:04:20 crc kubenswrapper[4778]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 09:04:20 crc kubenswrapper[4778]: exit 1 Mar 18 09:04:20 crc kubenswrapper[4778]: fi Mar 18 09:04:20 crc kubenswrapper[4778]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 09:04:20 crc kubenswrapper[4778]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:20 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:20 crc kubenswrapper[4778]: E0318 09:04:20.190908 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 09:04:20 crc kubenswrapper[4778]: E0318 09:04:20.191701 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 09:04:20 crc kubenswrapper[4778]: E0318 09:04:20.192306 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 09:04:20 crc kubenswrapper[4778]: I0318 09:04:20.475861 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:20 crc kubenswrapper[4778]: E0318 09:04:20.475901 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:20 crc kubenswrapper[4778]: E0318 09:04:20.476137 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs podName:a2d5c312-2314-46d7-8ba2-64b621b0c2c7 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:28.476115577 +0000 UTC m=+135.050860657 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs") pod "network-metrics-daemon-9bc7s" (UID: "a2d5c312-2314-46d7-8ba2-64b621b0c2c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:22 crc kubenswrapper[4778]: I0318 09:04:22.186474 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:22 crc kubenswrapper[4778]: I0318 09:04:22.186581 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:22 crc kubenswrapper[4778]: I0318 09:04:22.186590 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:22 crc kubenswrapper[4778]: I0318 09:04:22.186729 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:22 crc kubenswrapper[4778]: E0318 09:04:22.186739 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:22 crc kubenswrapper[4778]: E0318 09:04:22.186861 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:22 crc kubenswrapper[4778]: E0318 09:04:22.187341 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:22 crc kubenswrapper[4778]: E0318 09:04:22.187490 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:22 crc kubenswrapper[4778]: E0318 09:04:22.189727 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:22 crc kubenswrapper[4778]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:04:22 crc kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then Mar 18 09:04:22 crc kubenswrapper[4778]: set -o allexport Mar 18 09:04:22 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:04:22 crc kubenswrapper[4778]: set +o allexport Mar 18 09:04:22 crc kubenswrapper[4778]: fi Mar 18 09:04:22 crc kubenswrapper[4778]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 18 09:04:22 crc kubenswrapper[4778]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 09:04:22 crc kubenswrapper[4778]: ho_enable="--enable-hybrid-overlay" Mar 18 09:04:22 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 09:04:22 crc kubenswrapper[4778]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 09:04:22 crc kubenswrapper[4778]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 09:04:22 crc kubenswrapper[4778]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 09:04:22 crc kubenswrapper[4778]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 09:04:22 crc kubenswrapper[4778]: --webhook-host=127.0.0.1 \ Mar 18 09:04:22 crc kubenswrapper[4778]: --webhook-port=9743 \ Mar 18 09:04:22 crc kubenswrapper[4778]: ${ho_enable} \ Mar 18 09:04:22 crc kubenswrapper[4778]: --enable-interconnect \ Mar 18 09:04:22 crc kubenswrapper[4778]: --disable-approver \ Mar 18 09:04:22 crc kubenswrapper[4778]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 09:04:22 crc kubenswrapper[4778]: --wait-for-kubernetes-api=200s \ Mar 18 09:04:22 crc kubenswrapper[4778]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 09:04:22 crc kubenswrapper[4778]: --loglevel="${LOGLEVEL}" Mar 18 09:04:22 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:22 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:22 crc kubenswrapper[4778]: E0318 09:04:22.192989 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 09:04:22 crc kubenswrapper[4778]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 09:04:22 crc 
kubenswrapper[4778]: if [[ -f "/env/_master" ]]; then Mar 18 09:04:22 crc kubenswrapper[4778]: set -o allexport Mar 18 09:04:22 crc kubenswrapper[4778]: source "/env/_master" Mar 18 09:04:22 crc kubenswrapper[4778]: set +o allexport Mar 18 09:04:22 crc kubenswrapper[4778]: fi Mar 18 09:04:22 crc kubenswrapper[4778]: Mar 18 09:04:22 crc kubenswrapper[4778]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 09:04:22 crc kubenswrapper[4778]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 09:04:22 crc kubenswrapper[4778]: --disable-webhook \ Mar 18 09:04:22 crc kubenswrapper[4778]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 09:04:22 crc kubenswrapper[4778]: --loglevel="${LOGLEVEL}" Mar 18 09:04:22 crc kubenswrapper[4778]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 09:04:22 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 09:04:22 crc kubenswrapper[4778]: E0318 09:04:22.194376 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.865149 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.865241 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.865265 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.865295 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.865317 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:23Z","lastTransitionTime":"2026-03-18T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:23 crc kubenswrapper[4778]: E0318 09:04:23.881779 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.887837 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.887895 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.887912 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.887936 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.887956 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:23Z","lastTransitionTime":"2026-03-18T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:23 crc kubenswrapper[4778]: E0318 09:04:23.907840 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.908156 4778 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.912982 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.913039 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.913059 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.913085 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.913105 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:23Z","lastTransitionTime":"2026-03-18T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:23 crc kubenswrapper[4778]: E0318 09:04:23.930679 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.935507 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.935562 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.935580 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.935604 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.935625 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:23Z","lastTransitionTime":"2026-03-18T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:23 crc kubenswrapper[4778]: E0318 09:04:23.951038 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.956271 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.956334 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.956348 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.956366 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:23 crc kubenswrapper[4778]: I0318 09:04:23.956378 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:23Z","lastTransitionTime":"2026-03-18T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:23 crc kubenswrapper[4778]: E0318 09:04:23.970951 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:23 crc kubenswrapper[4778]: E0318 09:04:23.971343 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.187090 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.187148 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.187303 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:24 crc kubenswrapper[4778]: E0318 09:04:24.187505 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:24 crc kubenswrapper[4778]: E0318 09:04:24.187745 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.187798 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:24 crc kubenswrapper[4778]: E0318 09:04:24.187876 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:24 crc kubenswrapper[4778]: E0318 09:04:24.188146 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.202754 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.219173 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.238382 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.266101 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: E0318 09:04:24.286675 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.289228 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.320814 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.337603 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.347150 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.362234 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.372956 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.411111 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.442442 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.459886 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.474001 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.488020 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.500796 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:24 crc kubenswrapper[4778]: I0318 09:04:24.508189 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.186496 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.186566 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.186495 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:26 crc kubenswrapper[4778]: E0318 09:04:26.186742 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.186812 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:26 crc kubenswrapper[4778]: E0318 09:04:26.187481 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:26 crc kubenswrapper[4778]: E0318 09:04:26.187628 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:26 crc kubenswrapper[4778]: E0318 09:04:26.187752 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.781749 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dfnnp" event={"ID":"8cf64307-e191-476a-902b-93001adc0b16","Type":"ContainerStarted","Data":"f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627"} Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.783958 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r2lvf" event={"ID":"dce973f3-25e6-4536-87cc-9b46499ad7cf","Type":"ContainerStarted","Data":"27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5"} Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.797937 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.809791 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.819111 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.829461 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.848750 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.871498 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.897762 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c
57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.914138 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.922327 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.937022 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.947285 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.956871 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.971603 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:26 crc kubenswrapper[4778]: I0318 09:04:26.984251 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.000088 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.011620 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.024307 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.036331 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.051398 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.064576 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.075857 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.091263 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.102264 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.111987 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.127261 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.143442 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.164088 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.176748 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.212683 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.230516 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.240488 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.253409 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.262249 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.271545 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.789267 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88"} Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.789331 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"cba57e9ecb9eb5e6f738793a924b86a332ecf3e9aa5748de2ff7d1c6195662bc"} Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.793174 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" event={"ID":"19777429-4133-4e70-b2dd-c61c54abdec4","Type":"ContainerStarted","Data":"33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee"} Mar 18 09:04:27 crc 
kubenswrapper[4778]: I0318 09:04:27.793229 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" event={"ID":"19777429-4133-4e70-b2dd-c61c54abdec4","Type":"ContainerStarted","Data":"b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888"} Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.795349 4778 generic.go:334] "Generic (PLEG): container finished" podID="b1698c21-24a7-4338-a0ad-dd110c1ba2f2" containerID="b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8" exitCode=0 Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.795381 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" event={"ID":"b1698c21-24a7-4338-a0ad-dd110c1ba2f2","Type":"ContainerDied","Data":"b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8"} Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.799785 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.806973 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.816951 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.826670 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.844680 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.856643 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.885412 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.894851 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.904917 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.916803 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.924174 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.932429 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.939895 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.951355 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.963412 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.976175 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:27 crc kubenswrapper[4778]: I0318 09:04:27.989596 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.002959 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.012924 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.030514 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.042128 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.053348 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.062138 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.069622 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.085036 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.096089 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.114435 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.133516 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.153724 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c
57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.165343 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.172671 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.185561 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.186946 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.186970 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.187014 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.187091 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:28 crc kubenswrapper[4778]: E0318 09:04:28.187092 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:28 crc kubenswrapper[4778]: E0318 09:04:28.187217 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:28 crc kubenswrapper[4778]: E0318 09:04:28.187289 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:28 crc kubenswrapper[4778]: E0318 09:04:28.187669 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.200705 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.212681 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.566342 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:28 crc kubenswrapper[4778]: E0318 09:04:28.566609 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:28 crc kubenswrapper[4778]: E0318 09:04:28.566894 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs podName:a2d5c312-2314-46d7-8ba2-64b621b0c2c7 nodeName:}" failed. No retries permitted until 2026-03-18 09:04:44.566870431 +0000 UTC m=+151.141615271 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs") pod "network-metrics-daemon-9bc7s" (UID: "a2d5c312-2314-46d7-8ba2-64b621b0c2c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.802813 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerID="5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107" exitCode=0 Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.802963 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107"} Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.810422 4778 generic.go:334] "Generic (PLEG): container finished" podID="b1698c21-24a7-4338-a0ad-dd110c1ba2f2" containerID="3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d" exitCode=0 Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.810485 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" event={"ID":"b1698c21-24a7-4338-a0ad-dd110c1ba2f2","Type":"ContainerDied","Data":"3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d"} Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.824365 4778 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.840992 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.857919 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.872889 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.884699 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.896416 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.906137 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.923642 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.935743 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.947938 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.960429 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.970260 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.984892 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:28 crc kubenswrapper[4778]: I0318 09:04:28.995901 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.006749 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.027958 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.041753 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.060791 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.078001 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.089434 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.103220 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.115252 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.126517 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.137852 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.151771 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.165704 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.177649 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.201679 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.214006 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.226235 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.248034 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.264820 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: E0318 09:04:29.289889 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.291699 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.308055 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.820155 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5"} Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.820541 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0"} Mar 18 
09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.820560 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f"} Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.824831 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" event={"ID":"b1698c21-24a7-4338-a0ad-dd110c1ba2f2","Type":"ContainerDied","Data":"7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917"} Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.824834 4778 generic.go:334] "Generic (PLEG): container finished" podID="b1698c21-24a7-4338-a0ad-dd110c1ba2f2" containerID="7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917" exitCode=0 Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.843697 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.860677 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.874903 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.888108 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.901733 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.914085 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.925381 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.936508 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.949621 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.965556 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.976383 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:29 crc kubenswrapper[4778]: I0318 09:04:29.993834 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.005698 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.015143 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.027962 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.042042 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.053751 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.186829 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.186887 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:30 crc kubenswrapper[4778]: E0318 09:04:30.187009 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.187478 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.187604 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:30 crc kubenswrapper[4778]: E0318 09:04:30.187729 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:30 crc kubenswrapper[4778]: E0318 09:04:30.187824 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:30 crc kubenswrapper[4778]: E0318 09:04:30.187908 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.833123 4778 generic.go:334] "Generic (PLEG): container finished" podID="b1698c21-24a7-4338-a0ad-dd110c1ba2f2" containerID="763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10" exitCode=0 Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.833235 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" event={"ID":"b1698c21-24a7-4338-a0ad-dd110c1ba2f2","Type":"ContainerDied","Data":"763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10"} Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.839356 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147"} Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.839537 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3"} Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.839605 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" 
event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b"} Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.854339 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-
dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723
269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exit
Code\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.867761 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.877516 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.891097 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.906158 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.916892 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.931045 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.942641 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.958696 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.971601 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.982623 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:30 crc kubenswrapper[4778]: I0318 09:04:30.992387 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.002283 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.013163 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.024653 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.035719 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.055685 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.852999 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5"} Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.858614 4778 generic.go:334] "Generic (PLEG): container finished" podID="b1698c21-24a7-4338-a0ad-dd110c1ba2f2" containerID="76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152" exitCode=0 Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.858686 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" event={"ID":"b1698c21-24a7-4338-a0ad-dd110c1ba2f2","Type":"ContainerDied","Data":"76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152"} Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.868295 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.878736 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.896802 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.905057 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.914499 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.931797 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.942986 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126b
d791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.957603 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.967120 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.973786 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.984702 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:31 crc kubenswrapper[4778]: I0318 09:04:31.995465 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.004932 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.019185 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-r2lvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.034379 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.049541 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.059053 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.069450 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.095678 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.108320 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.115428 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.127338 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.134386 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.144966 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.157160 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.167701 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.181070 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.186113 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.186273 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.186324 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:32 crc kubenswrapper[4778]: E0318 09:04:32.186429 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.186835 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:32 crc kubenswrapper[4778]: E0318 09:04:32.186902 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:32 crc kubenswrapper[4778]: E0318 09:04:32.186983 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:32 crc kubenswrapper[4778]: E0318 09:04:32.187067 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.191947 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea8
3e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.200863 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.208810 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.218799 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.228053 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.237455 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.261602 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.868058 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f"} Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.874640 4778 generic.go:334] "Generic (PLEG): container finished" podID="b1698c21-24a7-4338-a0ad-dd110c1ba2f2" containerID="b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792" exitCode=0 Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.874673 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" event={"ID":"b1698c21-24a7-4338-a0ad-dd110c1ba2f2","Type":"ContainerDied","Data":"b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792"} Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.886155 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.898350 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.911338 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.936184 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.960094 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP
\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.971697 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.979078 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4778]: I0318 09:04:32.989535 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.000429 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.009279 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.033663 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.060594 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.095608 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.114854 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.131644 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.144697 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.162701 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.883850 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-xkfx8" event={"ID":"b1698c21-24a7-4338-a0ad-dd110c1ba2f2","Type":"ContainerStarted","Data":"bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb"} Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.888008 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc"} Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.888040 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18"} Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.907376 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.919155 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.937908 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.958947 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugin
s\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2dae
d8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedA
t\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\
",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.971490 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.985386 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:33 crc kubenswrapper[4778]: I0318 09:04:33.998130 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.022597 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.036882 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.037217 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:05:38.03715355 +0000 UTC m=+204.611898390 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.046383 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.065635 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.092998 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.110587 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc 
kubenswrapper[4778]: I0318 09:04:34.129301 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.138853 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.138911 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.138958 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.138985 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.139076 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:05:38.139050201 +0000 UTC m=+204.713795071 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.138996 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.139105 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.139303 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.139358 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.139382 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.139317 4778 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.139460 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.139482 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.139331 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:05:38.139302888 +0000 UTC m=+204.714047768 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.139553 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 09:05:38.139538304 +0000 UTC m=+204.714283184 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.139593 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 09:05:38.139577365 +0000 UTC m=+204.714322355 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.146119 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.168328 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.187276 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.187384 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.187307 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.187508 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.187625 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.187779 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.187910 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.188104 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.199921 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2a
f0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.214906 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.229566 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.245851 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.270455 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.291493 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.305625 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.309479 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.309524 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.309543 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.309563 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.309577 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:34Z","lastTransitionTime":"2026-03-18T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.329100 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.334726 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.334762 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.334774 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.334790 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.334802 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:34Z","lastTransitionTime":"2026-03-18T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.341717 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.352514 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.357327 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.357362 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.357371 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.357385 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.357394 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:34Z","lastTransitionTime":"2026-03-18T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.363332 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.374877 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.378333 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.379963 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.380027 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.380045 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.380071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.380089 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:34Z","lastTransitionTime":"2026-03-18T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.395380 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.399705 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.399732 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.399740 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.399755 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.399806 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:34Z","lastTransitionTime":"2026-03-18T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.403872 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.414742 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: E0318 09:04:34.414911 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.416033 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.430660 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.452416 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.472091 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.491956 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.513547 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.528011 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.542743 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.556506 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc 
kubenswrapper[4778]: I0318 09:04:34.570945 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.582078 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc 
kubenswrapper[4778]: I0318 09:04:34.598116 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.614628 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.629993 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.659798 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.690018 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.703498 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.715923 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.732800 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.743897 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.755844 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.770187 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.782726 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.797177 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.810952 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.824480 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.898332 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b"} Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.898636 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 
09:04:34.917628 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.931792 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.950176 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.956471 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 
09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.979322 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:34 crc kubenswrapper[4778]: I0318 09:04:34.996753 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.018495 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.033472 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.048539 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.067277 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.081920 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.098921 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":
\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.113890 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.129907 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.143580 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.156057 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.169056 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.181353 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc 
kubenswrapper[4778]: I0318 09:04:35.197526 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e
87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.222966 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.239113 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.254820 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.268376 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.284597 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.298275 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc 
kubenswrapper[4778]: I0318 09:04:35.314498 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.355905 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.395473 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.439252 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.483282 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.513982 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.555786 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.622873 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.661810 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.675960 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.904145 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64"} Mar 18 09:04:35 crc 
kubenswrapper[4778]: I0318 09:04:35.906663 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9f2bp" event={"ID":"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7","Type":"ContainerStarted","Data":"cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7"} Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.907514 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.907568 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.928016 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.938756 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.953294 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.964422 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.978975 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:35 crc kubenswrapper[4778]: I0318 09:04:35.991644 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:35Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.005016 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.022773 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.043773 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.068769 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.086912 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.119175 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.156043 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.186624 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.186675 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.186711 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:36 crc kubenswrapper[4778]: E0318 09:04:36.186766 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.186922 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:36 crc kubenswrapper[4778]: E0318 09:04:36.186915 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:36 crc kubenswrapper[4778]: E0318 09:04:36.187082 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:36 crc kubenswrapper[4778]: E0318 09:04:36.187188 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.194534 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc 
kubenswrapper[4778]: I0318 09:04:36.238050 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.275469 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.320116 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.400425 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.411287 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.443026 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.479681 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.514093 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.564055 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.592960 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.635598 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.672797 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.716495 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.755287 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.793335 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.838547 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.875308 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc 
kubenswrapper[4778]: I0318 09:04:36.914869 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.952282 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:36 crc kubenswrapper[4778]: I0318 09:04:36.997063 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:37 crc kubenswrapper[4778]: I0318 09:04:37.042046 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:37Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:37 crc kubenswrapper[4778]: I0318 09:04:37.921921 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/0.log" Mar 18 09:04:37 crc kubenswrapper[4778]: I0318 09:04:37.926640 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerID="49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b" exitCode=1 Mar 18 09:04:37 crc kubenswrapper[4778]: I0318 09:04:37.926709 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b"} Mar 18 09:04:37 crc kubenswrapper[4778]: I0318 09:04:37.927360 4778 scope.go:117] "RemoveContainer" containerID="49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b" Mar 18 09:04:37 crc kubenswrapper[4778]: I0318 09:04:37.951526 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:37Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:37 crc kubenswrapper[4778]: I0318 09:04:37.968826 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:37Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:37 crc kubenswrapper[4778]: I0318 09:04:37.981708 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:37Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.001356 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:37Z\\\",\\\"message\\\":\\\"78 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 09:04:37.028846 6778 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 09:04:37.029891 6778 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 
09:04:37.029908 6778 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 09:04:37.028856 6778 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:37.020307 6778 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 09:04:37.030813 6778 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 09:04:37.030837 6778 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 09:04:37.030884 6778 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:37.030942 6778 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 09:04:37.030971 6778 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 09:04:37.030942 6778 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:37.031009 6778 factory.go:656] Stopping watch factory\\\\nI0318 09:04:37.031025 6778 ovnkube.go:599] Stopped ovnkube\\\\nI0318 09:04:37.031023 6778 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 
09:04:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3
a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:37Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.020859 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.039090 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.056675 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.074761 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.088539 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.104876 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.122344 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.136585 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.158639 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.177422 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.187101 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.187114 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.187158 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:38 crc kubenswrapper[4778]: E0318 09:04:38.187277 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.187437 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:38 crc kubenswrapper[4778]: E0318 09:04:38.187423 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:38 crc kubenswrapper[4778]: E0318 09:04:38.187542 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:38 crc kubenswrapper[4778]: E0318 09:04:38.187674 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.191418 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea8
3e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.207098 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.225746 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc 
kubenswrapper[4778]: I0318 09:04:38.934630 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/0.log" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.939555 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b"} Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.940379 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.962894 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:38 crc kubenswrapper[4778]: I0318 09:04:38.986261 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76327
8175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.001265 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:38Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.015697 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.036766 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.051988 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.068311 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.082185 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.094938 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.107047 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.119380 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.132340 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.143904 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc 
kubenswrapper[4778]: I0318 09:04:39.166921 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:37Z\\\",\\\"message\\\":\\\"78 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 09:04:37.028846 6778 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 09:04:37.029891 6778 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 09:04:37.029908 6778 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 09:04:37.028856 6778 handler.go:208] 
Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:37.020307 6778 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 09:04:37.030813 6778 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 09:04:37.030837 6778 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 09:04:37.030884 6778 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:37.030942 6778 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 09:04:37.030971 6778 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 09:04:37.030942 6778 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:37.031009 6778 factory.go:656] Stopping watch factory\\\\nI0318 09:04:37.031025 6778 ovnkube.go:599] Stopped ovnkube\\\\nI0318 09:04:37.031023 6778 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 
09:04:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.179841 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.191479 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.204539 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: E0318 09:04:39.293607 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.946803 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/1.log" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.948089 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/0.log" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.953704 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerID="7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b" exitCode=1 Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.953785 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b"} Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.953848 4778 scope.go:117] "RemoveContainer" containerID="49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.955572 4778 scope.go:117] "RemoveContainer" containerID="7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b" Mar 18 09:04:39 crc kubenswrapper[4778]: E0318 09:04:39.955865 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" Mar 18 09:04:39 crc 
kubenswrapper[4778]: I0318 09:04:39.972357 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:39 crc kubenswrapper[4778]: I0318 09:04:39.988382 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:39Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.011429 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.027380 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.046091 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.081810 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7
c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.098694 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.121566 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.141290 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.159813 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.176792 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.186121 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.186185 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:40 crc kubenswrapper[4778]: E0318 09:04:40.186280 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.186121 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:40 crc kubenswrapper[4778]: E0318 09:04:40.186380 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:40 crc kubenswrapper[4778]: E0318 09:04:40.186463 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.186598 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:40 crc kubenswrapper[4778]: E0318 09:04:40.186668 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.191304 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.204743 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc 
kubenswrapper[4778]: I0318 09:04:40.219767 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc 
kubenswrapper[4778]: I0318 09:04:40.261031 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49f0c864280147b0bcb60d53489d702f3ec1038f779e7dfedff59e83a799651b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:37Z\\\",\\\"message\\\":\\\"78 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 09:04:37.028846 6778 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 09:04:37.029891 6778 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 09:04:37.029908 6778 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 09:04:37.028856 6778 handler.go:208] 
Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:37.020307 6778 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 09:04:37.030813 6778 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 09:04:37.030837 6778 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 09:04:37.030884 6778 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:37.030942 6778 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 09:04:37.030971 6778 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 09:04:37.030942 6778 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:37.031009 6778 factory.go:656] Stopping watch factory\\\\nI0318 09:04:37.031025 6778 ovnkube.go:599] Stopped ovnkube\\\\nI0318 09:04:37.031023 6778 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 09:04:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:38Z\\\",\\\"message\\\":\\\"0] Sending *v1.Pod event handler 3 for removal\\\\nI0318 09:04:38.953123 6929 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 09:04:38.953789 6929 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 09:04:38.953820 6929 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 09:04:38.953837 6929 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 09:04:38.955533 6929 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 09:04:38.955636 6929 handler.go:190] Sending *v1.Node event handler 7 
for removal\\\\nI0318 09:04:38.955712 6929 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:38.955811 6929 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:38.955825 6929 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:38.955855 6929 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 09:04:38.955888 6929 factory.go:656] Stopping watch factory\\\\nI0318 09:04:38.955909 6929 ovnkube.go:599] Stopped ovnkube\\\\nI0318 09:04:38.955921 6929 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:38.956003 6929 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:38.956027 6929 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 09:04:38.956134 6929 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath
\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.279925 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.299982 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.960135 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/1.log" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.963397 4778 scope.go:117] "RemoveContainer" containerID="7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b" Mar 18 09:04:40 
crc kubenswrapper[4778]: E0318 09:04:40.963544 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.978501 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:40 crc kubenswrapper[4778]: I0318 09:04:40.993748 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:40Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.008696 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.023527 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.040251 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.052670 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc 
kubenswrapper[4778]: I0318 09:04:41.064754 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.077916 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.096731 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.117752 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:38Z\\\",\\\"message\\\":\\\"0] Sending *v1.Pod event handler 3 for removal\\\\nI0318 09:04:38.953123 6929 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 09:04:38.953789 6929 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 09:04:38.953820 6929 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 09:04:38.953837 6929 handler.go:208] 
Removed *v1.Pod event handler 3\\\\nI0318 09:04:38.955533 6929 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 09:04:38.955636 6929 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 09:04:38.955712 6929 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:38.955811 6929 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:38.955825 6929 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:38.955855 6929 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 09:04:38.955888 6929 factory.go:656] Stopping watch factory\\\\nI0318 09:04:38.955909 6929 ovnkube.go:599] Stopped ovnkube\\\\nI0318 09:04:38.955921 6929 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:38.956003 6929 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:38.956027 6929 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 09:04:38.956134 6929 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293
d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.130574 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.158643 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.175598 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.188103 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.207685 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.220120 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:41 crc kubenswrapper[4778]: I0318 09:04:41.233232 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:42 crc kubenswrapper[4778]: I0318 09:04:42.186270 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:42 crc kubenswrapper[4778]: I0318 09:04:42.186406 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:42 crc kubenswrapper[4778]: E0318 09:04:42.187008 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:42 crc kubenswrapper[4778]: I0318 09:04:42.186441 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:42 crc kubenswrapper[4778]: I0318 09:04:42.186412 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:42 crc kubenswrapper[4778]: E0318 09:04:42.187131 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:42 crc kubenswrapper[4778]: E0318 09:04:42.187276 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:42 crc kubenswrapper[4778]: E0318 09:04:42.187465 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:43 crc kubenswrapper[4778]: I0318 09:04:43.204007 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.186781 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.186852 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.186918 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.186944 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.186985 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.187141 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.187559 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.187897 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.207771 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.226807 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.245737 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.276425 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:38Z\\\",\\\"message\\\":\\\"0] Sending *v1.Pod event handler 3 for removal\\\\nI0318 09:04:38.953123 6929 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 09:04:38.953789 6929 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 09:04:38.953820 6929 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 09:04:38.953837 6929 handler.go:208] 
Removed *v1.Pod event handler 3\\\\nI0318 09:04:38.955533 6929 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 09:04:38.955636 6929 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 09:04:38.955712 6929 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:38.955811 6929 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:38.955825 6929 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:38.955855 6929 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 09:04:38.955888 6929 factory.go:656] Stopping watch factory\\\\nI0318 09:04:38.955909 6929 ovnkube.go:599] Stopped ovnkube\\\\nI0318 09:04:38.955921 6929 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:38.956003 6929 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:38.956027 6929 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 09:04:38.956134 6929 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293
d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.291593 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.294184 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.326395 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.343613 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.355243 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.368014 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.381602 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.399038 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.418715 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.435027 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.452706 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.469719 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.485442 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.498888 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.510545 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc 
kubenswrapper[4778]: I0318 09:04:44.579820 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.580023 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.580142 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs podName:a2d5c312-2314-46d7-8ba2-64b621b0c2c7 nodeName:}" failed. No retries permitted until 2026-03-18 09:05:16.580108806 +0000 UTC m=+183.154853686 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs") pod "network-metrics-daemon-9bc7s" (UID: "a2d5c312-2314-46d7-8ba2-64b621b0c2c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.812702 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.812815 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.812833 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.813394 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.813565 4778 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:44Z","lastTransitionTime":"2026-03-18T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.834590 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.839643 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.839698 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.839720 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.839751 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.839773 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:44Z","lastTransitionTime":"2026-03-18T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.861341 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.866510 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.866567 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.866589 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.866621 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.866644 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:44Z","lastTransitionTime":"2026-03-18T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.893802 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.900241 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.900295 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.900314 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.900340 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.900357 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:44Z","lastTransitionTime":"2026-03-18T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.915289 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.920384 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.920453 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.920482 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.920515 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:44 crc kubenswrapper[4778]: I0318 09:04:44.920537 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:44Z","lastTransitionTime":"2026-03-18T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.937595 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:44 crc kubenswrapper[4778]: E0318 09:04:44.937743 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:04:46 crc kubenswrapper[4778]: I0318 09:04:46.186387 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:46 crc kubenswrapper[4778]: I0318 09:04:46.186488 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:46 crc kubenswrapper[4778]: I0318 09:04:46.186491 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:46 crc kubenswrapper[4778]: E0318 09:04:46.186575 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:46 crc kubenswrapper[4778]: E0318 09:04:46.186738 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:46 crc kubenswrapper[4778]: I0318 09:04:46.186828 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:46 crc kubenswrapper[4778]: E0318 09:04:46.186929 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:46 crc kubenswrapper[4778]: E0318 09:04:46.187014 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:48 crc kubenswrapper[4778]: I0318 09:04:48.186170 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:48 crc kubenswrapper[4778]: I0318 09:04:48.186251 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:48 crc kubenswrapper[4778]: I0318 09:04:48.186311 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:48 crc kubenswrapper[4778]: I0318 09:04:48.186335 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:48 crc kubenswrapper[4778]: E0318 09:04:48.187546 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:48 crc kubenswrapper[4778]: E0318 09:04:48.187675 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:48 crc kubenswrapper[4778]: E0318 09:04:48.187843 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:48 crc kubenswrapper[4778]: E0318 09:04:48.187995 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:49 crc kubenswrapper[4778]: I0318 09:04:49.201522 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 18 09:04:49 crc kubenswrapper[4778]: E0318 09:04:49.295769 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:04:50 crc kubenswrapper[4778]: I0318 09:04:50.186929 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:50 crc kubenswrapper[4778]: I0318 09:04:50.187067 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:50 crc kubenswrapper[4778]: I0318 09:04:50.187088 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:50 crc kubenswrapper[4778]: E0318 09:04:50.187297 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:50 crc kubenswrapper[4778]: I0318 09:04:50.187334 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:50 crc kubenswrapper[4778]: E0318 09:04:50.187463 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:50 crc kubenswrapper[4778]: E0318 09:04:50.187603 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:50 crc kubenswrapper[4778]: E0318 09:04:50.187714 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:52 crc kubenswrapper[4778]: I0318 09:04:52.186934 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:52 crc kubenswrapper[4778]: I0318 09:04:52.187005 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:52 crc kubenswrapper[4778]: E0318 09:04:52.187170 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:52 crc kubenswrapper[4778]: I0318 09:04:52.187005 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:52 crc kubenswrapper[4778]: E0318 09:04:52.187364 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:52 crc kubenswrapper[4778]: E0318 09:04:52.187245 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:52 crc kubenswrapper[4778]: I0318 09:04:52.186998 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:52 crc kubenswrapper[4778]: E0318 09:04:52.187479 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.186106 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.186229 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.186313 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.186421 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:54 crc kubenswrapper[4778]: E0318 09:04:54.186419 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:54 crc kubenswrapper[4778]: E0318 09:04:54.186735 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:54 crc kubenswrapper[4778]: E0318 09:04:54.186908 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:54 crc kubenswrapper[4778]: E0318 09:04:54.187036 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.208361 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:38Z\\\",\\\"message\\\":\\\"0] Sending *v1.Pod event handler 3 for removal\\\\nI0318 09:04:38.953123 6929 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 09:04:38.953789 6929 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 09:04:38.953820 6929 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 09:04:38.953837 6929 handler.go:208] 
Removed *v1.Pod event handler 3\\\\nI0318 09:04:38.955533 6929 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 09:04:38.955636 6929 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 09:04:38.955712 6929 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:38.955811 6929 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:38.955825 6929 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:38.955855 6929 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 09:04:38.955888 6929 factory.go:656] Stopping watch factory\\\\nI0318 09:04:38.955909 6929 ovnkube.go:599] Stopped ovnkube\\\\nI0318 09:04:38.955921 6929 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:38.956003 6929 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:38.956027 6929 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 09:04:38.956134 6929 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293
d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.224588 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.239275 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.252566 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.262760 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.284388 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: E0318 09:04:54.296597 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.300972 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.316257 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453
473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.330314 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 
09:04:54.355142 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.367457 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.381330 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.393535 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.405309 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.418107 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.433010 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.446049 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.458681 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:54 crc kubenswrapper[4778]: I0318 09:04:54.472592 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:55 crc 
kubenswrapper[4778]: I0318 09:04:55.073295 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.073714 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.073948 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.074437 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.074855 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:55Z","lastTransitionTime":"2026-03-18T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:55 crc kubenswrapper[4778]: E0318 09:04:55.094775 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:55Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.099934 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.099984 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.099998 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.100022 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.100039 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:55Z","lastTransitionTime":"2026-03-18T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:55 crc kubenswrapper[4778]: E0318 09:04:55.112909 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:55Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.117937 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.117972 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.117983 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.117997 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.118031 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:55Z","lastTransitionTime":"2026-03-18T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:55 crc kubenswrapper[4778]: E0318 09:04:55.130643 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:55Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.135829 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.136074 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.136281 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.136443 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.137027 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:55Z","lastTransitionTime":"2026-03-18T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:55 crc kubenswrapper[4778]: E0318 09:04:55.156937 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:55Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.162658 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.162921 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.163064 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.163240 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:04:55 crc kubenswrapper[4778]: I0318 09:04:55.163400 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:04:55Z","lastTransitionTime":"2026-03-18T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:04:55 crc kubenswrapper[4778]: E0318 09:04:55.180259 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:55Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:55 crc kubenswrapper[4778]: E0318 09:04:55.180905 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:04:56 crc kubenswrapper[4778]: I0318 09:04:56.186756 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:56 crc kubenswrapper[4778]: I0318 09:04:56.186810 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:56 crc kubenswrapper[4778]: I0318 09:04:56.187180 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:56 crc kubenswrapper[4778]: I0318 09:04:56.187228 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:56 crc kubenswrapper[4778]: E0318 09:04:56.187403 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:56 crc kubenswrapper[4778]: I0318 09:04:56.187515 4778 scope.go:117] "RemoveContainer" containerID="7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b" Mar 18 09:04:56 crc kubenswrapper[4778]: E0318 09:04:56.187618 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:56 crc kubenswrapper[4778]: E0318 09:04:56.187764 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:56 crc kubenswrapper[4778]: E0318 09:04:56.187504 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.025938 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/1.log" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.030230 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80"} Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.031043 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.056500 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.075779 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.091552 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.120005 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.136740 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.154784 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.171992 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.189129 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.204358 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.221651 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.238179 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.252981 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.269566 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.282178 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc 
kubenswrapper[4778]: I0318 09:04:57.295900 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.309188 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.321739 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.342387 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:38Z\\\",\\\"message\\\":\\\"0] Sending *v1.Pod event handler 3 for removal\\\\nI0318 09:04:38.953123 6929 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 09:04:38.953789 6929 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 09:04:38.953820 6929 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 09:04:38.953837 6929 handler.go:208] 
Removed *v1.Pod event handler 3\\\\nI0318 09:04:38.955533 6929 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 09:04:38.955636 6929 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 09:04:38.955712 6929 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:38.955811 6929 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:38.955825 6929 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:38.955855 6929 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 09:04:38.955888 6929 factory.go:656] Stopping watch factory\\\\nI0318 09:04:38.955909 6929 ovnkube.go:599] Stopped ovnkube\\\\nI0318 09:04:38.955921 6929 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:38.956003 6929 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:38.956027 6929 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 09:04:38.956134 6929 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:57 crc kubenswrapper[4778]: I0318 09:04:57.355714 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.037830 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/2.log" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.039045 4778 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/1.log" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.043530 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerID="355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80" exitCode=1 Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.043576 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80"} Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.043616 4778 scope.go:117] "RemoveContainer" containerID="7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.044861 4778 scope.go:117] "RemoveContainer" containerID="355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80" Mar 18 09:04:58 crc kubenswrapper[4778]: E0318 09:04:58.045160 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.066828 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.088028 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.110267 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.145129 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f24f9a7e70cb4d25897d331c077d20c49833bcee65bc78287e41983469e326b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:38Z\\\",\\\"message\\\":\\\"0] Sending *v1.Pod event handler 3 for removal\\\\nI0318 09:04:38.953123 6929 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 09:04:38.953789 6929 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 09:04:38.953820 6929 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 09:04:38.953837 6929 handler.go:208] 
Removed *v1.Pod event handler 3\\\\nI0318 09:04:38.955533 6929 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 09:04:38.955636 6929 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 09:04:38.955712 6929 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:38.955811 6929 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:38.955825 6929 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:38.955855 6929 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 09:04:38.955888 6929 factory.go:656] Stopping watch factory\\\\nI0318 09:04:38.955909 6929 ovnkube.go:599] Stopped ovnkube\\\\nI0318 09:04:38.955921 6929 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:38.956003 6929 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:38.956027 6929 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 09:04:38.956134 6929 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:57Z\\\",\\\"message\\\":\\\"o:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 09:04:57.221650 7125 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:57.221665 7125 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:57.221669 7125 factory.go:656] Stopping watch factory\\\\nI0318 09:04:57.221665 7125 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221679 7125 
handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 09:04:57.221698 7125 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221698 7125 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:57.221707 7125 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:57.221729 7125 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:57.221785 7125 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.221869 7125 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.222116 7125 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\
":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8
g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.171821 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.186730 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.186847 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:04:58 crc kubenswrapper[4778]: E0318 09:04:58.186921 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.186847 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:04:58 crc kubenswrapper[4778]: E0318 09:04:58.187076 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.187160 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:04:58 crc kubenswrapper[4778]: E0318 09:04:58.187302 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:04:58 crc kubenswrapper[4778]: E0318 09:04:58.187374 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.189556 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs
.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.209164 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.233007 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.268400 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.284810 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.299731 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.320036 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.337170 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.356354 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.374323 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.392410 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.408413 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.425796 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:58 crc kubenswrapper[4778]: I0318 09:04:58.438680 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:58Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc 
kubenswrapper[4778]: I0318 09:04:59.050828 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/2.log" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.057595 4778 scope.go:117] "RemoveContainer" containerID="355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80" Mar 18 09:04:59 crc kubenswrapper[4778]: E0318 09:04:59.058005 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.075448 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.090416 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc 
kubenswrapper[4778]: I0318 09:04:59.106616 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.124997 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.142996 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.164147 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:57Z\\\",\\\"message\\\":\\\"o:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 09:04:57.221650 7125 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:57.221665 7125 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:57.221669 7125 factory.go:656] Stopping watch factory\\\\nI0318 09:04:57.221665 7125 
reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221679 7125 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 09:04:57.221698 7125 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221698 7125 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:57.221707 7125 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:57.221729 7125 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:57.221785 7125 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.221869 7125 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.222116 7125 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293
d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.179706 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.195807 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.206915 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.233917 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.253345 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.266608 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.286498 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: E0318 09:04:59.298136 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.301362 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.314558 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.326059 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.337734 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.350672 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:04:59 crc kubenswrapper[4778]: I0318 09:04:59.363457 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:04:59Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:00 crc kubenswrapper[4778]: I0318 09:05:00.187245 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:00 crc kubenswrapper[4778]: I0318 09:05:00.187289 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:00 crc kubenswrapper[4778]: I0318 09:05:00.187285 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:00 crc kubenswrapper[4778]: E0318 09:05:00.187478 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:00 crc kubenswrapper[4778]: E0318 09:05:00.187593 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:00 crc kubenswrapper[4778]: E0318 09:05:00.187697 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:00 crc kubenswrapper[4778]: I0318 09:05:00.187857 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:00 crc kubenswrapper[4778]: E0318 09:05:00.187974 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:02 crc kubenswrapper[4778]: I0318 09:05:02.186632 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:02 crc kubenswrapper[4778]: I0318 09:05:02.186701 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:02 crc kubenswrapper[4778]: E0318 09:05:02.186861 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:02 crc kubenswrapper[4778]: I0318 09:05:02.186887 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:02 crc kubenswrapper[4778]: I0318 09:05:02.186886 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:02 crc kubenswrapper[4778]: E0318 09:05:02.187074 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:02 crc kubenswrapper[4778]: E0318 09:05:02.187166 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:02 crc kubenswrapper[4778]: E0318 09:05:02.187422 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.187121 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:04 crc kubenswrapper[4778]: E0318 09:05:04.187339 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.187441 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.187441 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:04 crc kubenswrapper[4778]: E0318 09:05:04.187617 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:04 crc kubenswrapper[4778]: E0318 09:05:04.187839 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.188689 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:04 crc kubenswrapper[4778]: E0318 09:05:04.189007 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.208542 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.231994 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.250876 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.267799 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.288642 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: E0318 09:05:04.298910 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.308310 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.327082 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.339053 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc 
kubenswrapper[4778]: I0318 09:05:04.353283 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc 
kubenswrapper[4778]: I0318 09:05:04.381517 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:57Z\\\",\\\"message\\\":\\\"o:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 09:04:57.221650 7125 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:57.221665 7125 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:57.221669 7125 factory.go:656] Stopping watch factory\\\\nI0318 09:04:57.221665 7125 
reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221679 7125 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 09:04:57.221698 7125 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221698 7125 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:57.221707 7125 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:57.221729 7125 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:57.221785 7125 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.221869 7125 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.222116 7125 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293
d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.398372 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.412478 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.427271 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.443172 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.459928 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.471457 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.484834 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.501045 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:04 crc kubenswrapper[4778]: I0318 09:05:04.533020 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:04Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.283010 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.283068 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.283080 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.283101 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.283113 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:05Z","lastTransitionTime":"2026-03-18T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:05 crc kubenswrapper[4778]: E0318 09:05:05.306015 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:05Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.312688 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.312750 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.312770 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.312796 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.312814 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:05Z","lastTransitionTime":"2026-03-18T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:05 crc kubenswrapper[4778]: E0318 09:05:05.327646 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:05Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.332330 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.332376 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.332388 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.332399 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.332407 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:05Z","lastTransitionTime":"2026-03-18T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:05 crc kubenswrapper[4778]: E0318 09:05:05.345632 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:05Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.349243 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.349298 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.349317 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.349341 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.349361 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:05Z","lastTransitionTime":"2026-03-18T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:05 crc kubenswrapper[4778]: E0318 09:05:05.363241 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:05Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.368925 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.368973 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.368984 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.369003 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:05 crc kubenswrapper[4778]: I0318 09:05:05.369016 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:05Z","lastTransitionTime":"2026-03-18T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:05 crc kubenswrapper[4778]: E0318 09:05:05.390048 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:05Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:05 crc kubenswrapper[4778]: E0318 09:05:05.390439 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:05:06 crc kubenswrapper[4778]: I0318 09:05:06.186513 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:06 crc kubenswrapper[4778]: E0318 09:05:06.186663 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:06 crc kubenswrapper[4778]: I0318 09:05:06.186764 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:06 crc kubenswrapper[4778]: E0318 09:05:06.186898 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:06 crc kubenswrapper[4778]: I0318 09:05:06.187108 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:06 crc kubenswrapper[4778]: I0318 09:05:06.187136 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:06 crc kubenswrapper[4778]: E0318 09:05:06.187234 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:06 crc kubenswrapper[4778]: E0318 09:05:06.187388 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:08 crc kubenswrapper[4778]: I0318 09:05:08.186610 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:08 crc kubenswrapper[4778]: I0318 09:05:08.186735 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:08 crc kubenswrapper[4778]: E0318 09:05:08.186782 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:08 crc kubenswrapper[4778]: I0318 09:05:08.186836 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:08 crc kubenswrapper[4778]: E0318 09:05:08.186943 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:08 crc kubenswrapper[4778]: I0318 09:05:08.186856 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:08 crc kubenswrapper[4778]: E0318 09:05:08.187080 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:08 crc kubenswrapper[4778]: E0318 09:05:08.187288 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:09 crc kubenswrapper[4778]: E0318 09:05:09.300067 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:05:10 crc kubenswrapper[4778]: I0318 09:05:10.186736 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:10 crc kubenswrapper[4778]: I0318 09:05:10.186762 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:10 crc kubenswrapper[4778]: I0318 09:05:10.186835 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:10 crc kubenswrapper[4778]: I0318 09:05:10.186890 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:10 crc kubenswrapper[4778]: E0318 09:05:10.187149 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:10 crc kubenswrapper[4778]: E0318 09:05:10.187342 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:10 crc kubenswrapper[4778]: E0318 09:05:10.187534 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:10 crc kubenswrapper[4778]: E0318 09:05:10.187741 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:12 crc kubenswrapper[4778]: I0318 09:05:12.186879 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:12 crc kubenswrapper[4778]: I0318 09:05:12.186949 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:12 crc kubenswrapper[4778]: I0318 09:05:12.187048 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:12 crc kubenswrapper[4778]: E0318 09:05:12.187103 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:12 crc kubenswrapper[4778]: E0318 09:05:12.187309 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:12 crc kubenswrapper[4778]: I0318 09:05:12.187328 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:12 crc kubenswrapper[4778]: E0318 09:05:12.187438 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:12 crc kubenswrapper[4778]: E0318 09:05:12.187753 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.114322 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r2lvf_dce973f3-25e6-4536-87cc-9b46499ad7cf/kube-multus/0.log" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.114424 4778 generic.go:334] "Generic (PLEG): container finished" podID="dce973f3-25e6-4536-87cc-9b46499ad7cf" containerID="27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5" exitCode=1 Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.114480 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r2lvf" event={"ID":"dce973f3-25e6-4536-87cc-9b46499ad7cf","Type":"ContainerDied","Data":"27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5"} Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.115244 4778 scope.go:117] "RemoveContainer" containerID="27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5" Mar 18 
09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.145862 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.159818 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc 
kubenswrapper[4778]: I0318 09:05:14.175346 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.186369 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.186432 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:14 crc kubenswrapper[4778]: E0318 09:05:14.186526 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.186554 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:14 crc kubenswrapper[4778]: E0318 09:05:14.186758 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:14 crc kubenswrapper[4778]: E0318 09:05:14.186823 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.187817 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.188011 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:14 crc kubenswrapper[4778]: E0318 09:05:14.188111 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.188469 4778 scope.go:117] "RemoveContainer" containerID="355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80" Mar 18 09:05:14 crc kubenswrapper[4778]: E0318 09:05:14.188778 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.202857 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:13Z\\\",\\\"message\\\":\\\"2026-03-18T09:04:28+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd\\\\n2026-03-18T09:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd to /host/opt/cni/bin/\\\\n2026-03-18T09:04:28Z [verbose] multus-daemon started\\\\n2026-03-18T09:04:28Z [verbose] Readiness Indicator file check\\\\n2026-03-18T09:05:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.221210 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:57Z\\\",\\\"message\\\":\\\"o:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 09:04:57.221650 7125 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:57.221665 7125 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:57.221669 7125 factory.go:656] Stopping watch factory\\\\nI0318 09:04:57.221665 7125 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221679 7125 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 09:04:57.221698 7125 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221698 7125 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:57.221707 7125 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:57.221729 7125 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:57.221785 7125 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.221869 7125 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.222116 7125 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293
d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.237568 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76327
8175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.250717 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.265475 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.280690 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: E0318 09:05:14.300664 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.303268 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.317985 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.329801 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.347655 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.361306 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.377289 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.391129 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.436735 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.478382 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.491376 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.514420 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.528255 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.539725 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.559628 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.570320 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.582411 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.595625 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.610795 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.625052 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.646554 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.673744 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.687762 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.704381 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.717967 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc 
kubenswrapper[4778]: I0318 09:05:14.731734 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.749132 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.766925 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:13Z\\\",\\\"message\\\":\\\"2026-03-18T09:04:28+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd\\\\n2026-03-18T09:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd to /host/opt/cni/bin/\\\\n2026-03-18T09:04:28Z [verbose] multus-daemon started\\\\n2026-03-18T09:04:28Z [verbose] Readiness Indicator file check\\\\n2026-03-18T09:05:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:14 crc kubenswrapper[4778]: I0318 09:05:14.790962 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:57Z\\\",\\\"message\\\":\\\"o:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 09:04:57.221650 7125 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:57.221665 7125 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:57.221669 7125 factory.go:656] Stopping watch factory\\\\nI0318 09:04:57.221665 7125 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221679 7125 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 09:04:57.221698 7125 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221698 7125 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:57.221707 7125 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:57.221729 7125 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:57.221785 7125 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.221869 7125 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.222116 7125 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293
d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.122898 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r2lvf_dce973f3-25e6-4536-87cc-9b46499ad7cf/kube-multus/0.log" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.123398 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r2lvf" event={"ID":"dce973f3-25e6-4536-87cc-9b46499ad7cf","Type":"ContainerStarted","Data":"3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562"} Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.151380 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.173418 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.197892 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:13Z\\\",\\\"message\\\":\\\"2026-03-18T09:04:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd\\\\n2026-03-18T09:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd to /host/opt/cni/bin/\\\\n2026-03-18T09:04:28Z [verbose] multus-daemon started\\\\n2026-03-18T09:04:28Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T09:05:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.231043 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:57Z\\\",\\\"message\\\":\\\"o:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 09:04:57.221650 7125 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:57.221665 7125 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:57.221669 7125 factory.go:656] Stopping watch factory\\\\nI0318 09:04:57.221665 7125 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221679 7125 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 09:04:57.221698 7125 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221698 7125 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:57.221707 7125 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:57.221729 7125 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:57.221785 7125 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.221869 7125 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.222116 7125 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293
d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.248820 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.269708 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.287429 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.322994 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.344799 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.364470 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.390571 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.412630 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.438076 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.461007 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.481789 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.505628 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.529742 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.551324 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.571492 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc 
kubenswrapper[4778]: I0318 09:05:15.730718 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.730968 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.731094 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.731128 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.731147 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:15Z","lastTransitionTime":"2026-03-18T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:15 crc kubenswrapper[4778]: E0318 09:05:15.755644 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z"
Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.760638 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.760721 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.760741 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.760781 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.760852 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:15Z","lastTransitionTime":"2026-03-18T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:15 crc kubenswrapper[4778]: E0318 09:05:15.784117 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z"
Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.790443 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.790496 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.790509 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.790527 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.790540 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:15Z","lastTransitionTime":"2026-03-18T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:15 crc kubenswrapper[4778]: E0318 09:05:15.810765 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z"
Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.816486 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.816541 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.816553 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.816573 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.816588 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:15Z","lastTransitionTime":"2026-03-18T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:15 crc kubenswrapper[4778]: E0318 09:05:15.835671 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.840329 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.840389 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.840409 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.840432 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:15 crc kubenswrapper[4778]: I0318 09:05:15.840450 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:15Z","lastTransitionTime":"2026-03-18T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:15 crc kubenswrapper[4778]: E0318 09:05:15.855702 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:15Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:15 crc kubenswrapper[4778]: E0318 09:05:15.855948 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:05:16 crc kubenswrapper[4778]: I0318 09:05:16.187041 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:16 crc kubenswrapper[4778]: I0318 09:05:16.187131 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:16 crc kubenswrapper[4778]: I0318 09:05:16.187247 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:16 crc kubenswrapper[4778]: E0318 09:05:16.187297 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:16 crc kubenswrapper[4778]: I0318 09:05:16.187329 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:16 crc kubenswrapper[4778]: E0318 09:05:16.187645 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:16 crc kubenswrapper[4778]: E0318 09:05:16.187566 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:16 crc kubenswrapper[4778]: E0318 09:05:16.188347 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:16 crc kubenswrapper[4778]: I0318 09:05:16.654472 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:16 crc kubenswrapper[4778]: E0318 09:05:16.654689 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:05:16 crc kubenswrapper[4778]: E0318 09:05:16.654776 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs podName:a2d5c312-2314-46d7-8ba2-64b621b0c2c7 nodeName:}" failed. No retries permitted until 2026-03-18 09:06:20.654750949 +0000 UTC m=+247.229495829 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs") pod "network-metrics-daemon-9bc7s" (UID: "a2d5c312-2314-46d7-8ba2-64b621b0c2c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 09:05:18 crc kubenswrapper[4778]: I0318 09:05:18.187479 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:18 crc kubenswrapper[4778]: I0318 09:05:18.187680 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:18 crc kubenswrapper[4778]: I0318 09:05:18.187746 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:18 crc kubenswrapper[4778]: E0318 09:05:18.187905 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:18 crc kubenswrapper[4778]: I0318 09:05:18.188257 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:18 crc kubenswrapper[4778]: E0318 09:05:18.188593 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:18 crc kubenswrapper[4778]: E0318 09:05:18.188720 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:18 crc kubenswrapper[4778]: E0318 09:05:18.188830 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:19 crc kubenswrapper[4778]: E0318 09:05:19.301803 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:05:20 crc kubenswrapper[4778]: I0318 09:05:20.187515 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:20 crc kubenswrapper[4778]: I0318 09:05:20.187683 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:20 crc kubenswrapper[4778]: I0318 09:05:20.187822 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:20 crc kubenswrapper[4778]: E0318 09:05:20.187829 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:20 crc kubenswrapper[4778]: I0318 09:05:20.187863 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:20 crc kubenswrapper[4778]: E0318 09:05:20.187989 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:20 crc kubenswrapper[4778]: E0318 09:05:20.188107 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:20 crc kubenswrapper[4778]: E0318 09:05:20.188228 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:22 crc kubenswrapper[4778]: I0318 09:05:22.187165 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:22 crc kubenswrapper[4778]: I0318 09:05:22.187248 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:22 crc kubenswrapper[4778]: E0318 09:05:22.187503 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:22 crc kubenswrapper[4778]: I0318 09:05:22.187569 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:22 crc kubenswrapper[4778]: I0318 09:05:22.187595 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:22 crc kubenswrapper[4778]: E0318 09:05:22.187794 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:22 crc kubenswrapper[4778]: E0318 09:05:22.187876 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:22 crc kubenswrapper[4778]: E0318 09:05:22.188179 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.186359 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.186515 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.186584 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:24 crc kubenswrapper[4778]: E0318 09:05:24.186749 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.186777 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:24 crc kubenswrapper[4778]: E0318 09:05:24.186951 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:24 crc kubenswrapper[4778]: E0318 09:05:24.187104 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:24 crc kubenswrapper[4778]: E0318 09:05:24.187270 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.207843 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metri
cs-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.224681 4778 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.252298 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.267502 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.288530 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: E0318 09:05:24.302595 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.311279 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04
f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd
367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.328163 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.348045 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\",\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.368084 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.386700 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.408650 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.422649 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.435977 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.450483 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.463061 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc 
kubenswrapper[4778]: I0318 09:05:24.481050 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.494876 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.514515 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:13Z\\\",\\\"message\\\":\\\"2026-03-18T09:04:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd\\\\n2026-03-18T09:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd to /host/opt/cni/bin/\\\\n2026-03-18T09:04:28Z [verbose] multus-daemon started\\\\n2026-03-18T09:04:28Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T09:05:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:24 crc kubenswrapper[4778]: I0318 09:05:24.534748 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:57Z\\\",\\\"message\\\":\\\"o:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 09:04:57.221650 7125 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:57.221665 7125 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:57.221669 7125 factory.go:656] Stopping watch factory\\\\nI0318 09:04:57.221665 7125 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221679 7125 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 09:04:57.221698 7125 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221698 7125 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:57.221707 7125 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:57.221729 7125 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:57.221785 7125 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.221869 7125 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.222116 7125 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293
d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.194992 4778 scope.go:117] "RemoveContainer" containerID="355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.916428 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.916875 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.916885 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.916905 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.916918 4778 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:25Z","lastTransitionTime":"2026-03-18T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:05:25 crc kubenswrapper[4778]: E0318 09:05:25.930111 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:25Z is after 2025-08-24T17:21:41Z"
Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.934092 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.934167 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.934181 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.934240 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.934269 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:25Z","lastTransitionTime":"2026-03-18T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 18 09:05:25 crc kubenswrapper[4778]: E0318 09:05:25.947654 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:25Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.951398 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.951453 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.951472 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.951490 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.951502 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:25Z","lastTransitionTime":"2026-03-18T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:25 crc kubenswrapper[4778]: E0318 09:05:25.965184 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:25Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.970070 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.970120 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.970136 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.970158 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.970172 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:25Z","lastTransitionTime":"2026-03-18T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:25 crc kubenswrapper[4778]: E0318 09:05:25.983044 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:25Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.986552 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.986584 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.986594 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.986610 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:25 crc kubenswrapper[4778]: I0318 09:05:25.986622 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:25Z","lastTransitionTime":"2026-03-18T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:25 crc kubenswrapper[4778]: E0318 09:05:25.997923 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:25Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:25 crc kubenswrapper[4778]: E0318 09:05:25.998060 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.168458 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/2.log" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.172300 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d"} Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.172899 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.187168 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.187294 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.187204 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.187293 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:26 crc kubenswrapper[4778]: E0318 09:05:26.187393 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:26 crc kubenswrapper[4778]: E0318 09:05:26.187514 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:26 crc kubenswrapper[4778]: E0318 09:05:26.187570 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:26 crc kubenswrapper[4778]: E0318 09:05:26.187674 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.193647 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.210852 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.226686 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:13Z\\\",\\\"message\\\":\\\"2026-03-18T09:04:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd\\\\n2026-03-18T09:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd to /host/opt/cni/bin/\\\\n2026-03-18T09:04:28Z [verbose] multus-daemon started\\\\n2026-03-18T09:04:28Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T09:05:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.245779 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:57Z\\\",\\\"message\\\":\\\"o:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 09:04:57.221650 7125 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:57.221665 7125 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:57.221669 7125 factory.go:656] Stopping watch factory\\\\nI0318 09:04:57.221665 7125 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221679 7125 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 09:04:57.221698 7125 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221698 7125 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:57.221707 7125 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:57.221729 7125 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:57.221785 7125 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.221869 7125 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.222116 7125 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.257762 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.276189 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.290024 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.312915 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.333901 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.349785 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.374800 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.392844 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.415507 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.430910 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.446107 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.460646 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.477072 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.488681 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:26 crc kubenswrapper[4778]: I0318 09:05:26.503409 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc 
kubenswrapper[4778]: I0318 09:05:27.180478 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/3.log" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.181718 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/2.log" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.185289 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerID="5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d" exitCode=1 Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.185340 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d"} Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.185378 4778 scope.go:117] "RemoveContainer" containerID="355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.186331 4778 scope.go:117] "RemoveContainer" containerID="5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d" Mar 18 09:05:27 crc kubenswrapper[4778]: E0318 09:05:27.186573 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.205401 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.225499 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.246971 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.262008 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.281549 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.297954 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.314572 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.331043 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc 
kubenswrapper[4778]: I0318 09:05:27.350348 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.372136 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.393615 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:13Z\\\",\\\"message\\\":\\\"2026-03-18T09:04:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd\\\\n2026-03-18T09:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd to /host/opt/cni/bin/\\\\n2026-03-18T09:04:28Z [verbose] multus-daemon started\\\\n2026-03-18T09:04:28Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T09:05:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.427134 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355bceadf14f2aea516a90a408587ffd2363b6059ff9456233b321d63b305a80\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:04:57Z\\\",\\\"message\\\":\\\"o:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 09:04:57.221650 7125 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 09:04:57.221665 7125 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 09:04:57.221669 7125 factory.go:656] Stopping watch factory\\\\nI0318 09:04:57.221665 7125 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221679 7125 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 09:04:57.221698 7125 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 09:04:57.221698 7125 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 09:04:57.221707 7125 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 09:04:57.221729 7125 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 09:04:57.221785 7125 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.221869 7125 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 09:04:57.222116 7125 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:26Z\\\",\\\"message\\\":\\\"Distribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0318 09:05:26.271548 7451 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-56rc7\\\\nF0318 
09:05:26.271567 7451 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z]\\\\nI0318 09:05:26.271448 7451 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-9f2bp\\\\nI0318 09:05:26.271585 7451 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-9f2bp in node crc\\\\nI0318 
09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.442292 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.455906 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.467410 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.491721 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.509599 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.523993 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:27 crc kubenswrapper[4778]: I0318 09:05:27.542409 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.186656 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.186711 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.186751 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.186668 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:28 crc kubenswrapper[4778]: E0318 09:05:28.186893 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:28 crc kubenswrapper[4778]: E0318 09:05:28.187024 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:28 crc kubenswrapper[4778]: E0318 09:05:28.187177 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:28 crc kubenswrapper[4778]: E0318 09:05:28.187514 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.191780 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/3.log" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.196982 4778 scope.go:117] "RemoveContainer" containerID="5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d" Mar 18 09:05:28 crc kubenswrapper[4778]: E0318 09:05:28.197366 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.218794 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.234137 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.253632 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.273791 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.292133 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.310009 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.325642 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc 
kubenswrapper[4778]: I0318 09:05:28.343653 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.363432 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.380321 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:13Z\\\",\\\"message\\\":\\\"2026-03-18T09:04:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd\\\\n2026-03-18T09:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd to /host/opt/cni/bin/\\\\n2026-03-18T09:04:28Z [verbose] multus-daemon started\\\\n2026-03-18T09:04:28Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T09:05:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.408861 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:26Z\\\",\\\"message\\\":\\\"Distribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBala
ncerIngress{},},Conditions:[]Condition{},},}\\\\nI0318 09:05:26.271548 7451 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-56rc7\\\\nF0318 09:05:26.271567 7451 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z]\\\\nI0318 09:05:26.271448 7451 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-9f2bp\\\\nI0318 09:05:26.271585 7451 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-9f2bp in node crc\\\\nI0318 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:05:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293
d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.427474 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.451779 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.467879 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.483393 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.512826 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b
492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.525803 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.539043 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:28 crc kubenswrapper[4778]: I0318 09:05:28.553329 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:29 crc kubenswrapper[4778]: E0318 09:05:29.304795 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file 
in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:05:30 crc kubenswrapper[4778]: I0318 09:05:30.186276 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:30 crc kubenswrapper[4778]: E0318 09:05:30.186472 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:30 crc kubenswrapper[4778]: I0318 09:05:30.186792 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:30 crc kubenswrapper[4778]: E0318 09:05:30.186884 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:30 crc kubenswrapper[4778]: I0318 09:05:30.187115 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:30 crc kubenswrapper[4778]: E0318 09:05:30.187268 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:30 crc kubenswrapper[4778]: I0318 09:05:30.187527 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:30 crc kubenswrapper[4778]: E0318 09:05:30.187647 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:32 crc kubenswrapper[4778]: I0318 09:05:32.187283 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:32 crc kubenswrapper[4778]: I0318 09:05:32.187315 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:32 crc kubenswrapper[4778]: E0318 09:05:32.187795 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:32 crc kubenswrapper[4778]: I0318 09:05:32.187372 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:32 crc kubenswrapper[4778]: I0318 09:05:32.187356 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:32 crc kubenswrapper[4778]: E0318 09:05:32.187986 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:32 crc kubenswrapper[4778]: E0318 09:05:32.188136 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:32 crc kubenswrapper[4778]: E0318 09:05:32.188338 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.186753 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.186825 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:34 crc kubenswrapper[4778]: E0318 09:05:34.187002 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.187037 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.187082 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:34 crc kubenswrapper[4778]: E0318 09:05:34.187236 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:34 crc kubenswrapper[4778]: E0318 09:05:34.187383 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:34 crc kubenswrapper[4778]: E0318 09:05:34.187594 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.208109 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13
fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\
\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cx
hf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e
94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.227288 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.244832 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9ebbb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.258961 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.284166 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: E0318 09:05:34.305460 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.309698 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.325417 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.343900 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.358829 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.377653 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.394718 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.410905 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.433300 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.447951 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.461234 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc 
kubenswrapper[4778]: I0318 09:05:34.479538 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.499327 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.513142 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:13Z\\\",\\\"message\\\":\\\"2026-03-18T09:04:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd\\\\n2026-03-18T09:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd to /host/opt/cni/bin/\\\\n2026-03-18T09:04:28Z [verbose] multus-daemon started\\\\n2026-03-18T09:04:28Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T09:05:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:34 crc kubenswrapper[4778]: I0318 09:05:34.542238 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:26Z\\\",\\\"message\\\":\\\"Distribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBala
ncerIngress{},},Conditions:[]Condition{},},}\\\\nI0318 09:05:26.271548 7451 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-56rc7\\\\nF0318 09:05:26.271567 7451 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z]\\\\nI0318 09:05:26.271448 7451 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-9f2bp\\\\nI0318 09:05:26.271585 7451 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-9f2bp in node crc\\\\nI0318 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:05:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293
d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.186789 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.186862 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:36 crc kubenswrapper[4778]: E0318 09:05:36.186991 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:36 crc kubenswrapper[4778]: E0318 09:05:36.187164 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.187437 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:36 crc kubenswrapper[4778]: E0318 09:05:36.187534 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.187729 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:36 crc kubenswrapper[4778]: E0318 09:05:36.187823 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.348542 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.348611 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.348635 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.348665 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.348689 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:36Z","lastTransitionTime":"2026-03-18T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:36 crc kubenswrapper[4778]: E0318 09:05:36.371705 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.375892 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.375932 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.375942 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.375959 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.375972 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:36Z","lastTransitionTime":"2026-03-18T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:36 crc kubenswrapper[4778]: E0318 09:05:36.392019 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.396722 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.396773 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.396791 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.396814 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.396829 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:36Z","lastTransitionTime":"2026-03-18T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:36 crc kubenswrapper[4778]: E0318 09:05:36.413896 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.418921 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.419068 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.419145 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.419266 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.419363 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:36Z","lastTransitionTime":"2026-03-18T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:36 crc kubenswrapper[4778]: E0318 09:05:36.434701 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.439518 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.439587 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.439599 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.439616 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:36 crc kubenswrapper[4778]: I0318 09:05:36.439630 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:36Z","lastTransitionTime":"2026-03-18T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:36 crc kubenswrapper[4778]: E0318 09:05:36.456721 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:36Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:36 crc kubenswrapper[4778]: E0318 09:05:36.456891 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:05:38 crc kubenswrapper[4778]: I0318 09:05:38.108593 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.108822 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:07:40.108782209 +0000 UTC m=+326.683527059 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:05:38 crc kubenswrapper[4778]: I0318 09:05:38.186480 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:38 crc kubenswrapper[4778]: I0318 09:05:38.186550 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:38 crc kubenswrapper[4778]: I0318 09:05:38.186618 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.186729 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:38 crc kubenswrapper[4778]: I0318 09:05:38.186780 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.186942 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.187098 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.187322 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:38 crc kubenswrapper[4778]: I0318 09:05:38.210555 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:38 crc kubenswrapper[4778]: I0318 09:05:38.210616 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:38 crc kubenswrapper[4778]: I0318 09:05:38.210652 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:38 crc kubenswrapper[4778]: I0318 09:05:38.210675 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.210745 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.210833 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.210869 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.210881 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.210889 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.210924 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" 
failed. No retries permitted until 2026-03-18 09:07:40.210886645 +0000 UTC m=+326.785631515 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.210964 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 09:07:40.210941386 +0000 UTC m=+326.785686236 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.210989 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.211050 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.211001 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-18 09:07:40.210987367 +0000 UTC m=+326.785732217 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.211076 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:05:38 crc kubenswrapper[4778]: E0318 09:05:38.211251 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 09:07:40.211161452 +0000 UTC m=+326.785906412 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 09:05:39 crc kubenswrapper[4778]: E0318 09:05:39.307142 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 18 09:05:40 crc kubenswrapper[4778]: I0318 09:05:40.186585 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:40 crc kubenswrapper[4778]: I0318 09:05:40.186662 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:40 crc kubenswrapper[4778]: I0318 09:05:40.186612 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:40 crc kubenswrapper[4778]: I0318 09:05:40.186840 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:40 crc kubenswrapper[4778]: E0318 09:05:40.186826 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:40 crc kubenswrapper[4778]: E0318 09:05:40.187044 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:40 crc kubenswrapper[4778]: E0318 09:05:40.187182 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:40 crc kubenswrapper[4778]: E0318 09:05:40.187278 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:42 crc kubenswrapper[4778]: I0318 09:05:42.186903 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:42 crc kubenswrapper[4778]: I0318 09:05:42.186924 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:42 crc kubenswrapper[4778]: I0318 09:05:42.187924 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:42 crc kubenswrapper[4778]: E0318 09:05:42.188004 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:42 crc kubenswrapper[4778]: I0318 09:05:42.188558 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:42 crc kubenswrapper[4778]: E0318 09:05:42.189264 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:42 crc kubenswrapper[4778]: E0318 09:05:42.189106 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:42 crc kubenswrapper[4778]: I0318 09:05:42.189111 4778 scope.go:117] "RemoveContainer" containerID="5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d" Mar 18 09:05:42 crc kubenswrapper[4778]: E0318 09:05:42.189617 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:42 crc kubenswrapper[4778]: E0318 09:05:42.189659 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.186859 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.186921 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.186921 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:44 crc kubenswrapper[4778]: E0318 09:05:44.187133 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.187164 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:44 crc kubenswrapper[4778]: E0318 09:05:44.187341 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:44 crc kubenswrapper[4778]: E0318 09:05:44.187541 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:44 crc kubenswrapper[4778]: E0318 09:05:44.187685 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.204934 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dca8196e6d03b68a89660f715395d3bca14ef08967c4085125f4fa9b3452ed64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.219873 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9d88r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9bc7s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.239968 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.257629 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.278740 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r2lvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dce973f3-25e6-4536-87cc-9b46499ad7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:13Z\\\",\\\"message\\\":\\\"2026-03-18T09:04:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd\\\\n2026-03-18T09:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0635a6a1-baf5-4f2d-8765-c0a7556b93cd to /host/opt/cni/bin/\\\\n2026-03-18T09:04:28Z [verbose] multus-daemon started\\\\n2026-03-18T09:04:28Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T09:05:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2zsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r2lvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.300460 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T09:05:26Z\\\",\\\"message\\\":\\\"Distribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBala
ncerIngress{},},Conditions:[]Condition{},},}\\\\nI0318 09:05:26.271548 7451 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-56rc7\\\\nF0318 09:05:26.271567 7451 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:26Z is after 2025-08-24T17:21:41Z]\\\\nI0318 09:05:26.271448 7451 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-9f2bp\\\\nI0318 09:05:26.271585 7451 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-9f2bp in node crc\\\\nI0318 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:05:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a075c534652ee8293
d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g6f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-g2qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: E0318 09:05:44.307752 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.325217 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1698c21-24a7-4338-a0ad-dd110c1ba2f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf594d28f04f518fe72e8349eebd762fd2f72da4ce3771d2a6143fb0c6be9bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55bf87b791198431bf7efcd5dc49993f7d88f55ff5371febf3f612b3b6ebcb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b099cf22393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b099cf22
393e1da8267b58f5dbbf13f670f401d2ba21c261d220b89b24a006d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ee71ed66ff286139b2d5c941c2299ed61d62dd2b3b0eba639672ff0febd5917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://763278175ac8b13fa6df6485db7dde405e99456baf049986cf0d07b50bd42f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76465f671277b48a9361bec05017dc9b492ec648e2f529d191e88438818a4152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17649f3e54c3f8e50178420512b164353c4e36b5de96e94ece75a968e0cd792\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxhf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkfx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.343389 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9f2bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b256e9-a9ba-4e2e-9a39-6d9ffa7fa6b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbf0824b5e927aedb2c575e36d7b9ba573b9543964a39768aeecc3ef64abfb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e9
6f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grrhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9f2bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.358841 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19777429-4133-4e70-b2dd-c61c54abdec4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b800c57dd101ad1a48491796a903d39be130fe0117d453473db1153f29ac0888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33add3a5cad7fe243413d97b70b07088bc9eb
bb113863e3b826b62f497a3caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b6q8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:04:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7262f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.372375 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae8c97a-b72b-4ce1-9fab-e31daef5bf55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6289bd0ed9e644b85cc4634d2d5d3bfbc2362a7e1912d6fdf22463ba837e747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb09e5d9edd89d4afb34a690a6188193747d301df5b36471ff012ac0f70008b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.405156 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fbb63fd-7a74-4d17-9f7a-fa5ebce10b30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba23fc7ad35eaacd9b4647ef971095fa61d34011a5bb3fee5efe676f5ff644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068cd3d6f498da8ee0c90d9e1f9fbd2f179046a39b95e2d1eab298cc2ad9f9f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b30b8fb877664f44bbd966d172e8838ba75589ecbde3ae12da7dc00e41bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a49fb6e09c22bdeabc5e206c57c1830d4ec8c15800cd0a4bae200d6f487781b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb8ccf887169b08ebaf330447cdee7fbc42140f05e0b6e71d5e36d8232f15b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e52d5fef3658968828fa249e8bd44b02f2dfac15e2d95c5614d8d230acc2d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfe7cfcd0c5637aad68327cfcd8a83ea96d9b95efedd61e2510831c77e8fc988\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae2e9766b65cc00ac946bcc19f90c152417cd02b11ef17c4efd20b01e974e6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.420359 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.433146 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dfnnp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8cf64307-e191-476a-902b-93001adc0b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f295d9f5894fb7eb385c41016b4fd65c0f56330cbb680fa28ca153df07a3f627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9b2b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dfnnp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.449524 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dbde12e15f43ab732f5e3cc32ba3a5793317b090ca26e96313c779e39f842cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2acfbe4abb85fd86947c90c18e0fc40c30fdec92ba7e7d2b81f36740f30b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.461938 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7243f983-24d5-48ef-858b-5f4049a82acc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://401a0548dd984f7a637ff163528956bfefd3fa6597c1cb29ddc18fb111d17e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9
aa5748de2ff7d1c6195662bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjjcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56rc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.480671 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38043922-fba8-4439-b469-508c00992f80\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:13Z\\\"
,\\\"message\\\":\\\"W0318 09:03:12.531589 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0318 09:03:12.532804 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773824592 cert, and key in /tmp/serving-cert-2042145757/serving-signer.crt, /tmp/serving-cert-2042145757/serving-signer.key\\\\nI0318 09:03:12.860274 1 observer_polling.go:159] Starting file observer\\\\nW0318 09:03:12.873768 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0318 09:03:12.874014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 09:03:12.875230 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2042145757/tls.crt::/tmp/serving-cert-2042145757/tls.key\\\\\\\"\\\\nF0318 09:03:13.091864 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:03:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.497388 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7985d1b-ec3f-4c66-98a0-cfbd4b6d675a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e49b35275fa0711d052226da8cec83f8f0a90ddc7be0e71fde299258dac37d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0737
2b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b968b172821adbcd255509a1ce1bf7e166134c1a8ed01c9e39043d4e6d5bd5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T09:03:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 09:02:45.579135 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 09:02:45.580904 1 observer_polling.go:159] Starting file observer\\\\nI0318 09:02:45.582937 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 09:02:45.585276 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 09:03:14.022603 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0318 09:03:15.111024 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 09:03:15.111188 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:03:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c90fb5fa40033e844bf87f3fa5c84d7ca69bce287ff5790864d56b0729a796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae666ff0d0900021900f129b26be232d87404e9997d575438dd7ba79b113526d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.512316 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72b83842-c3ea-43de-80d9-d5fe34b9a45b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T09:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cee1163b8aa6fe47c41ea19bf47801948a0fa94344fb8d1073725b45b770ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255db0ff08f6e32a4629dda271eb5ed3381a7a4ceda6e1844632b09038bad88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af62365a1d585a3ba8c28ea1d7cecd1536dff8afd05e42cd4dd4b18507e348ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b1f7f73b354588d6ec1af81ca08cafc8747befeef89d9d7d732b6784ab7cce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T09:02:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T09:02:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T09:02:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:44 crc kubenswrapper[4778]: I0318 09:05:44.528581 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:03:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T09:04:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023a41f8522d05e838f06481ecc039a20721553c1baf22c36c0d600977d579a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.186572 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.186592 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.186689 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.186775 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:46 crc kubenswrapper[4778]: E0318 09:05:46.186993 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:46 crc kubenswrapper[4778]: E0318 09:05:46.187182 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:46 crc kubenswrapper[4778]: E0318 09:05:46.187421 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:46 crc kubenswrapper[4778]: E0318 09:05:46.187622 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.583234 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.583298 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.583322 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.583353 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.583376 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:46Z","lastTransitionTime":"2026-03-18T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:05:46 crc kubenswrapper[4778]: E0318 09:05:46.603989 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:46Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.609760 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.609805 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.609814 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.609828 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.609839 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:46Z","lastTransitionTime":"2026-03-18T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:46 crc kubenswrapper[4778]: E0318 09:05:46.627609 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:46Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.632825 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.632895 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.632909 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.632934 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.632951 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:46Z","lastTransitionTime":"2026-03-18T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:46 crc kubenswrapper[4778]: E0318 09:05:46.648474 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:46Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.652828 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.652861 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.652872 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.652890 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.652900 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:46Z","lastTransitionTime":"2026-03-18T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:46 crc kubenswrapper[4778]: E0318 09:05:46.669244 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:46Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.673914 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.673963 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.673974 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.673991 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:46 crc kubenswrapper[4778]: I0318 09:05:46.674006 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:46Z","lastTransitionTime":"2026-03-18T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 09:05:46 crc kubenswrapper[4778]: E0318 09:05:46.686656 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T09:05:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09c4ac70-7aed-4b4e-97f0-04cc523320b9\\\",\\\"systemUUID\\\":\\\"4e5f6a1b-325c-4eb3-9961-e93f55b97b93\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T09:05:46Z is after 2025-08-24T17:21:41Z" Mar 18 09:05:46 crc kubenswrapper[4778]: E0318 09:05:46.686821 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:05:48 crc kubenswrapper[4778]: I0318 09:05:48.187048 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:48 crc kubenswrapper[4778]: I0318 09:05:48.187048 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:48 crc kubenswrapper[4778]: E0318 09:05:48.187829 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:48 crc kubenswrapper[4778]: I0318 09:05:48.187493 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:48 crc kubenswrapper[4778]: I0318 09:05:48.187287 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:48 crc kubenswrapper[4778]: E0318 09:05:48.187927 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:48 crc kubenswrapper[4778]: E0318 09:05:48.188067 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:48 crc kubenswrapper[4778]: E0318 09:05:48.188157 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:49 crc kubenswrapper[4778]: E0318 09:05:49.309109 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:05:50 crc kubenswrapper[4778]: I0318 09:05:50.186541 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:50 crc kubenswrapper[4778]: I0318 09:05:50.186608 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:50 crc kubenswrapper[4778]: I0318 09:05:50.186515 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:50 crc kubenswrapper[4778]: I0318 09:05:50.186545 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:50 crc kubenswrapper[4778]: E0318 09:05:50.186743 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:50 crc kubenswrapper[4778]: E0318 09:05:50.186825 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:50 crc kubenswrapper[4778]: E0318 09:05:50.186903 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:50 crc kubenswrapper[4778]: E0318 09:05:50.187176 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:52 crc kubenswrapper[4778]: I0318 09:05:52.186421 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:52 crc kubenswrapper[4778]: I0318 09:05:52.186508 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:52 crc kubenswrapper[4778]: I0318 09:05:52.186570 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:52 crc kubenswrapper[4778]: I0318 09:05:52.186421 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:52 crc kubenswrapper[4778]: E0318 09:05:52.186630 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:52 crc kubenswrapper[4778]: E0318 09:05:52.186791 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:52 crc kubenswrapper[4778]: E0318 09:05:52.186894 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:52 crc kubenswrapper[4778]: E0318 09:05:52.186997 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.186530 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.186676 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:54 crc kubenswrapper[4778]: E0318 09:05:54.186845 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.186890 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.186902 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:54 crc kubenswrapper[4778]: E0318 09:05:54.187115 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:54 crc kubenswrapper[4778]: E0318 09:05:54.187371 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:54 crc kubenswrapper[4778]: E0318 09:05:54.187543 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.256283 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=137.256250315 podStartE2EDuration="2m17.256250315s" podCreationTimestamp="2026-03-18 09:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:05:54.238495157 +0000 UTC m=+220.813240057" watchObservedRunningTime="2026-03-18 09:05:54.256250315 +0000 UTC m=+220.830995155" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.256660 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=71.256653647 podStartE2EDuration="1m11.256653647s" podCreationTimestamp="2026-03-18 09:04:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:05:54.255789263 +0000 UTC m=+220.830534123" watchObservedRunningTime="2026-03-18 09:05:54.256653647 +0000 UTC m=+220.831398487" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.278025 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=103.277991102 
podStartE2EDuration="1m43.277991102s" podCreationTimestamp="2026-03-18 09:04:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:05:54.276566134 +0000 UTC m=+220.851311034" watchObservedRunningTime="2026-03-18 09:05:54.277991102 +0000 UTC m=+220.852735982" Mar 18 09:05:54 crc kubenswrapper[4778]: E0318 09:05:54.310233 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.340329 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podStartSLOduration=150.340303774 podStartE2EDuration="2m30.340303774s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:05:54.33978112 +0000 UTC m=+220.914525970" watchObservedRunningTime="2026-03-18 09:05:54.340303774 +0000 UTC m=+220.915048614" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.448641 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-r2lvf" podStartSLOduration=150.448613287 podStartE2EDuration="2m30.448613287s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:05:54.44833349 +0000 UTC m=+221.023078350" watchObservedRunningTime="2026-03-18 09:05:54.448613287 +0000 UTC m=+221.023358117" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.487901 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=65.487874287 podStartE2EDuration="1m5.487874287s" podCreationTimestamp="2026-03-18 09:04:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:05:54.487591609 +0000 UTC m=+221.062336469" watchObservedRunningTime="2026-03-18 09:05:54.487874287 +0000 UTC m=+221.062619127" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.510850 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=139.510823606 podStartE2EDuration="2m19.510823606s" podCreationTimestamp="2026-03-18 09:03:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:05:54.510575519 +0000 UTC m=+221.085320369" watchObservedRunningTime="2026-03-18 09:05:54.510823606 +0000 UTC m=+221.085568446" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.558340 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-dfnnp" podStartSLOduration=150.558311607 podStartE2EDuration="2m30.558311607s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:05:54.536443347 +0000 UTC m=+221.111188207" watchObservedRunningTime="2026-03-18 09:05:54.558311607 +0000 UTC m=+221.133056497" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.570032 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xkfx8" podStartSLOduration=150.570014863 podStartE2EDuration="2m30.570014863s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-18 09:05:54.556943851 +0000 UTC m=+221.131688701" watchObservedRunningTime="2026-03-18 09:05:54.570014863 +0000 UTC m=+221.144759703" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.583763 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9f2bp" podStartSLOduration=150.583736403 podStartE2EDuration="2m30.583736403s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:05:54.569650763 +0000 UTC m=+221.144395613" watchObservedRunningTime="2026-03-18 09:05:54.583736403 +0000 UTC m=+221.158481233" Mar 18 09:05:54 crc kubenswrapper[4778]: I0318 09:05:54.584292 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7262f" podStartSLOduration=150.584285969 podStartE2EDuration="2m30.584285969s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:05:54.583667472 +0000 UTC m=+221.158412322" watchObservedRunningTime="2026-03-18 09:05:54.584285969 +0000 UTC m=+221.159030809" Mar 18 09:05:56 crc kubenswrapper[4778]: I0318 09:05:56.186790 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:56 crc kubenswrapper[4778]: I0318 09:05:56.186781 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:56 crc kubenswrapper[4778]: I0318 09:05:56.186898 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:56 crc kubenswrapper[4778]: I0318 09:05:56.186954 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:56 crc kubenswrapper[4778]: E0318 09:05:56.187062 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:56 crc kubenswrapper[4778]: E0318 09:05:56.187233 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:56 crc kubenswrapper[4778]: E0318 09:05:56.187712 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:56 crc kubenswrapper[4778]: E0318 09:05:56.187945 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:56 crc kubenswrapper[4778]: I0318 09:05:56.188134 4778 scope.go:117] "RemoveContainer" containerID="5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d" Mar 18 09:05:56 crc kubenswrapper[4778]: E0318 09:05:56.188370 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-g2qth_openshift-ovn-kubernetes(ef97d63e-1caf-44c9-ac0c-9b03dbd05113)\"" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" Mar 18 09:05:56 crc kubenswrapper[4778]: I0318 09:05:56.996470 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 09:05:56 crc kubenswrapper[4778]: I0318 09:05:56.996539 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 09:05:56 crc kubenswrapper[4778]: I0318 09:05:56.996559 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 09:05:56 crc kubenswrapper[4778]: I0318 09:05:56.996585 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 09:05:56 crc kubenswrapper[4778]: I0318 09:05:56.996605 4778 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T09:05:56Z","lastTransitionTime":"2026-03-18T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.055997 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6"] Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.056751 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.059751 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.059989 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.060076 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.061105 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.191478 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/19c1bc08-d789-4555-b6b7-6c162b9d8158-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.191594 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19c1bc08-d789-4555-b6b7-6c162b9d8158-service-ca\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.191883 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19c1bc08-d789-4555-b6b7-6c162b9d8158-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.191957 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19c1bc08-d789-4555-b6b7-6c162b9d8158-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.192056 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/19c1bc08-d789-4555-b6b7-6c162b9d8158-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.230390 4778 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Rotating certificates Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.241419 4778 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.293725 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/19c1bc08-d789-4555-b6b7-6c162b9d8158-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.293824 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19c1bc08-d789-4555-b6b7-6c162b9d8158-service-ca\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.293855 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19c1bc08-d789-4555-b6b7-6c162b9d8158-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.293876 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19c1bc08-d789-4555-b6b7-6c162b9d8158-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: 
I0318 09:05:57.293910 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/19c1bc08-d789-4555-b6b7-6c162b9d8158-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.293928 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/19c1bc08-d789-4555-b6b7-6c162b9d8158-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.293977 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/19c1bc08-d789-4555-b6b7-6c162b9d8158-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.295361 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19c1bc08-d789-4555-b6b7-6c162b9d8158-service-ca\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.303773 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19c1bc08-d789-4555-b6b7-6c162b9d8158-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: 
\"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.324799 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19c1bc08-d789-4555-b6b7-6c162b9d8158-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-5vxv6\" (UID: \"19c1bc08-d789-4555-b6b7-6c162b9d8158\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: I0318 09:05:57.371442 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" Mar 18 09:05:57 crc kubenswrapper[4778]: W0318 09:05:57.390165 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19c1bc08_d789_4555_b6b7_6c162b9d8158.slice/crio-b0efafae09bcbd40d186b756b841a4bd6bae22bea62d8dcd3bf1bde537833265 WatchSource:0}: Error finding container b0efafae09bcbd40d186b756b841a4bd6bae22bea62d8dcd3bf1bde537833265: Status 404 returned error can't find the container with id b0efafae09bcbd40d186b756b841a4bd6bae22bea62d8dcd3bf1bde537833265 Mar 18 09:05:58 crc kubenswrapper[4778]: I0318 09:05:58.187190 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:05:58 crc kubenswrapper[4778]: I0318 09:05:58.187359 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:05:58 crc kubenswrapper[4778]: E0318 09:05:58.187403 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:05:58 crc kubenswrapper[4778]: I0318 09:05:58.187462 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:05:58 crc kubenswrapper[4778]: I0318 09:05:58.187190 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:05:58 crc kubenswrapper[4778]: E0318 09:05:58.187604 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:05:58 crc kubenswrapper[4778]: E0318 09:05:58.187713 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:05:58 crc kubenswrapper[4778]: E0318 09:05:58.187787 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:05:58 crc kubenswrapper[4778]: I0318 09:05:58.306760 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" event={"ID":"19c1bc08-d789-4555-b6b7-6c162b9d8158","Type":"ContainerStarted","Data":"038fea5fb3390926dc22b9d4f252283c51a9262d5aa1d283947fe584b0a13ffc"} Mar 18 09:05:58 crc kubenswrapper[4778]: I0318 09:05:58.306821 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" event={"ID":"19c1bc08-d789-4555-b6b7-6c162b9d8158","Type":"ContainerStarted","Data":"b0efafae09bcbd40d186b756b841a4bd6bae22bea62d8dcd3bf1bde537833265"} Mar 18 09:05:58 crc kubenswrapper[4778]: I0318 09:05:58.330545 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5vxv6" podStartSLOduration=154.330514799 podStartE2EDuration="2m34.330514799s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:05:58.330482118 +0000 UTC m=+224.905226968" watchObservedRunningTime="2026-03-18 09:05:58.330514799 +0000 UTC m=+224.905259639" Mar 18 09:05:59 crc kubenswrapper[4778]: E0318 09:05:59.311188 4778 kubelet.go:2916] "Container runtime network not ready" 
networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:06:00 crc kubenswrapper[4778]: I0318 09:06:00.187081 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s" Mar 18 09:06:00 crc kubenswrapper[4778]: I0318 09:06:00.187089 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:06:00 crc kubenswrapper[4778]: I0318 09:06:00.187153 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 09:06:00 crc kubenswrapper[4778]: I0318 09:06:00.187285 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 09:06:00 crc kubenswrapper[4778]: E0318 09:06:00.187505 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7" Mar 18 09:06:00 crc kubenswrapper[4778]: E0318 09:06:00.187629 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 09:06:00 crc kubenswrapper[4778]: E0318 09:06:00.187814 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 09:06:00 crc kubenswrapper[4778]: E0318 09:06:00.187952 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 09:06:00 crc kubenswrapper[4778]: I0318 09:06:00.317094 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r2lvf_dce973f3-25e6-4536-87cc-9b46499ad7cf/kube-multus/1.log" Mar 18 09:06:00 crc kubenswrapper[4778]: I0318 09:06:00.318425 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r2lvf_dce973f3-25e6-4536-87cc-9b46499ad7cf/kube-multus/0.log" Mar 18 09:06:00 crc kubenswrapper[4778]: I0318 09:06:00.318507 4778 generic.go:334] "Generic (PLEG): container finished" podID="dce973f3-25e6-4536-87cc-9b46499ad7cf" containerID="3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562" exitCode=1 Mar 18 09:06:00 crc kubenswrapper[4778]: I0318 09:06:00.318552 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r2lvf" 
event={"ID":"dce973f3-25e6-4536-87cc-9b46499ad7cf","Type":"ContainerDied","Data":"3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562"}
Mar 18 09:06:00 crc kubenswrapper[4778]: I0318 09:06:00.318612 4778 scope.go:117] "RemoveContainer" containerID="27b0e16cef958ce0b8dc3b6b1591ad066afdba4c2d05701eda4060f110724ca5"
Mar 18 09:06:00 crc kubenswrapper[4778]: I0318 09:06:00.319340 4778 scope.go:117] "RemoveContainer" containerID="3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562"
Mar 18 09:06:00 crc kubenswrapper[4778]: E0318 09:06:00.319669 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-r2lvf_openshift-multus(dce973f3-25e6-4536-87cc-9b46499ad7cf)\"" pod="openshift-multus/multus-r2lvf" podUID="dce973f3-25e6-4536-87cc-9b46499ad7cf"
Mar 18 09:06:01 crc kubenswrapper[4778]: I0318 09:06:01.323994 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r2lvf_dce973f3-25e6-4536-87cc-9b46499ad7cf/kube-multus/1.log"
Mar 18 09:06:02 crc kubenswrapper[4778]: I0318 09:06:02.186694 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 09:06:02 crc kubenswrapper[4778]: I0318 09:06:02.186848 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 09:06:02 crc kubenswrapper[4778]: I0318 09:06:02.186949 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s"
Mar 18 09:06:02 crc kubenswrapper[4778]: E0318 09:06:02.186943 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 09:06:02 crc kubenswrapper[4778]: I0318 09:06:02.187008 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 09:06:02 crc kubenswrapper[4778]: E0318 09:06:02.187079 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 09:06:02 crc kubenswrapper[4778]: E0318 09:06:02.187297 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 09:06:02 crc kubenswrapper[4778]: E0318 09:06:02.187408 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7"
Mar 18 09:06:04 crc kubenswrapper[4778]: I0318 09:06:04.187286 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s"
Mar 18 09:06:04 crc kubenswrapper[4778]: I0318 09:06:04.187304 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 09:06:04 crc kubenswrapper[4778]: I0318 09:06:04.187439 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 09:06:04 crc kubenswrapper[4778]: E0318 09:06:04.188514 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7"
Mar 18 09:06:04 crc kubenswrapper[4778]: I0318 09:06:04.188535 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 09:06:04 crc kubenswrapper[4778]: E0318 09:06:04.188677 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 09:06:04 crc kubenswrapper[4778]: E0318 09:06:04.188678 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 09:06:04 crc kubenswrapper[4778]: E0318 09:06:04.188747 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 09:06:04 crc kubenswrapper[4778]: E0318 09:06:04.312107 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 18 09:06:06 crc kubenswrapper[4778]: I0318 09:06:06.186607 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s"
Mar 18 09:06:06 crc kubenswrapper[4778]: E0318 09:06:06.186769 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7"
Mar 18 09:06:06 crc kubenswrapper[4778]: I0318 09:06:06.186894 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 09:06:06 crc kubenswrapper[4778]: I0318 09:06:06.187041 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 09:06:06 crc kubenswrapper[4778]: E0318 09:06:06.187124 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 09:06:06 crc kubenswrapper[4778]: I0318 09:06:06.187047 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 09:06:06 crc kubenswrapper[4778]: E0318 09:06:06.187231 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 09:06:06 crc kubenswrapper[4778]: E0318 09:06:06.187399 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 09:06:08 crc kubenswrapper[4778]: I0318 09:06:08.187044 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 09:06:08 crc kubenswrapper[4778]: I0318 09:06:08.187215 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 09:06:08 crc kubenswrapper[4778]: E0318 09:06:08.187328 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 09:06:08 crc kubenswrapper[4778]: I0318 09:06:08.187592 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s"
Mar 18 09:06:08 crc kubenswrapper[4778]: I0318 09:06:08.187620 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 09:06:08 crc kubenswrapper[4778]: I0318 09:06:08.188225 4778 scope.go:117] "RemoveContainer" containerID="5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d"
Mar 18 09:06:08 crc kubenswrapper[4778]: E0318 09:06:08.188586 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 09:06:08 crc kubenswrapper[4778]: E0318 09:06:08.188641 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 09:06:08 crc kubenswrapper[4778]: E0318 09:06:08.188812 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7"
Mar 18 09:06:08 crc kubenswrapper[4778]: I0318 09:06:08.351889 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/3.log"
Mar 18 09:06:08 crc kubenswrapper[4778]: I0318 09:06:08.357077 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerStarted","Data":"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb"}
Mar 18 09:06:08 crc kubenswrapper[4778]: I0318 09:06:08.357611 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth"
Mar 18 09:06:08 crc kubenswrapper[4778]: I0318 09:06:08.392962 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podStartSLOduration=164.392934417 podStartE2EDuration="2m44.392934417s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:08.391641641 +0000 UTC m=+234.966386531" watchObservedRunningTime="2026-03-18 09:06:08.392934417 +0000 UTC m=+234.967679297"
Mar 18 09:06:09 crc kubenswrapper[4778]: I0318 09:06:09.304992 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9bc7s"]
Mar 18 09:06:09 crc kubenswrapper[4778]: I0318 09:06:09.305096 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s"
Mar 18 09:06:09 crc kubenswrapper[4778]: E0318 09:06:09.305215 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7"
Mar 18 09:06:09 crc kubenswrapper[4778]: E0318 09:06:09.313779 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 18 09:06:10 crc kubenswrapper[4778]: I0318 09:06:10.186780 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 09:06:10 crc kubenswrapper[4778]: E0318 09:06:10.187301 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 09:06:10 crc kubenswrapper[4778]: I0318 09:06:10.186907 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 09:06:10 crc kubenswrapper[4778]: E0318 09:06:10.187400 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 09:06:10 crc kubenswrapper[4778]: I0318 09:06:10.186852 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 09:06:10 crc kubenswrapper[4778]: E0318 09:06:10.187459 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 09:06:11 crc kubenswrapper[4778]: I0318 09:06:11.187158 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s"
Mar 18 09:06:11 crc kubenswrapper[4778]: E0318 09:06:11.187388 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7"
Mar 18 09:06:12 crc kubenswrapper[4778]: I0318 09:06:12.187010 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 09:06:12 crc kubenswrapper[4778]: I0318 09:06:12.187123 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 09:06:12 crc kubenswrapper[4778]: E0318 09:06:12.187801 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 09:06:12 crc kubenswrapper[4778]: E0318 09:06:12.187589 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 09:06:12 crc kubenswrapper[4778]: I0318 09:06:12.187166 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 09:06:12 crc kubenswrapper[4778]: E0318 09:06:12.187919 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 09:06:13 crc kubenswrapper[4778]: I0318 09:06:13.187165 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s"
Mar 18 09:06:13 crc kubenswrapper[4778]: I0318 09:06:13.187822 4778 scope.go:117] "RemoveContainer" containerID="3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562"
Mar 18 09:06:13 crc kubenswrapper[4778]: E0318 09:06:13.188475 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7"
Mar 18 09:06:13 crc kubenswrapper[4778]: I0318 09:06:13.380754 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r2lvf_dce973f3-25e6-4536-87cc-9b46499ad7cf/kube-multus/1.log"
Mar 18 09:06:13 crc kubenswrapper[4778]: I0318 09:06:13.380863 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r2lvf" event={"ID":"dce973f3-25e6-4536-87cc-9b46499ad7cf","Type":"ContainerStarted","Data":"f3ed13c08ae99a4b04465da39cc132336b6c856e9f5e19d24c954fe658b51ed0"}
Mar 18 09:06:14 crc kubenswrapper[4778]: I0318 09:06:14.186879 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 09:06:14 crc kubenswrapper[4778]: I0318 09:06:14.189369 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 09:06:14 crc kubenswrapper[4778]: I0318 09:06:14.189550 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 09:06:14 crc kubenswrapper[4778]: E0318 09:06:14.189651 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 09:06:14 crc kubenswrapper[4778]: E0318 09:06:14.190554 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 09:06:14 crc kubenswrapper[4778]: E0318 09:06:14.193035 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 09:06:14 crc kubenswrapper[4778]: E0318 09:06:14.314661 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 18 09:06:15 crc kubenswrapper[4778]: I0318 09:06:15.186138 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s"
Mar 18 09:06:15 crc kubenswrapper[4778]: E0318 09:06:15.186409 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7"
Mar 18 09:06:16 crc kubenswrapper[4778]: I0318 09:06:16.186640 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 09:06:16 crc kubenswrapper[4778]: I0318 09:06:16.186884 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 09:06:16 crc kubenswrapper[4778]: I0318 09:06:16.186905 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 09:06:16 crc kubenswrapper[4778]: E0318 09:06:16.187239 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 09:06:16 crc kubenswrapper[4778]: E0318 09:06:16.187064 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 09:06:16 crc kubenswrapper[4778]: E0318 09:06:16.187416 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 09:06:17 crc kubenswrapper[4778]: I0318 09:06:17.186277 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s"
Mar 18 09:06:17 crc kubenswrapper[4778]: E0318 09:06:17.186511 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7"
Mar 18 09:06:18 crc kubenswrapper[4778]: I0318 09:06:18.186228 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 09:06:18 crc kubenswrapper[4778]: I0318 09:06:18.186270 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 09:06:18 crc kubenswrapper[4778]: E0318 09:06:18.186493 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 09:06:18 crc kubenswrapper[4778]: E0318 09:06:18.186682 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 09:06:18 crc kubenswrapper[4778]: I0318 09:06:18.186973 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 09:06:18 crc kubenswrapper[4778]: E0318 09:06:18.187117 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 09:06:19 crc kubenswrapper[4778]: I0318 09:06:19.186725 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s"
Mar 18 09:06:19 crc kubenswrapper[4778]: E0318 09:06:19.187007 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bc7s" podUID="a2d5c312-2314-46d7-8ba2-64b621b0c2c7"
Mar 18 09:06:20 crc kubenswrapper[4778]: I0318 09:06:20.186519 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 09:06:20 crc kubenswrapper[4778]: I0318 09:06:20.186584 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 09:06:20 crc kubenswrapper[4778]: I0318 09:06:20.186839 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 09:06:20 crc kubenswrapper[4778]: I0318 09:06:20.190146 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 18 09:06:20 crc kubenswrapper[4778]: I0318 09:06:20.190693 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 18 09:06:20 crc kubenswrapper[4778]: I0318 09:06:20.191422 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 18 09:06:20 crc kubenswrapper[4778]: I0318 09:06:20.191531 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 18 09:06:20 crc kubenswrapper[4778]: I0318 09:06:20.664952 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s"
Mar 18 09:06:20 crc kubenswrapper[4778]: E0318 09:06:20.665135 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 09:06:20 crc kubenswrapper[4778]: E0318 09:06:20.665275 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs podName:a2d5c312-2314-46d7-8ba2-64b621b0c2c7 nodeName:}" failed. No retries permitted until 2026-03-18 09:08:22.665184512 +0000 UTC m=+369.239929372 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs") pod "network-metrics-daemon-9bc7s" (UID: "a2d5c312-2314-46d7-8ba2-64b621b0c2c7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 09:06:21 crc kubenswrapper[4778]: I0318 09:06:21.186402 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s"
Mar 18 09:06:21 crc kubenswrapper[4778]: I0318 09:06:21.189961 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 18 09:06:21 crc kubenswrapper[4778]: I0318 09:06:21.190648 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.253915 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.299554 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz"]
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.300283 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-pgsqh"]
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.300539 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.300731 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb"]
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.301514 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-pgsqh"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.302139 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.303437 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl"]
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.304468 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.305175 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bhmq7"]
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.305808 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.307178 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-q7qs8"]
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.307726 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-q7qs8"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.315505 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.315800 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.316041 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.316348 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.316521 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.316380 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.320356 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.320526 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.320532 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.320594 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.321370 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.321403 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.322646 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.322795 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.323088 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.325185 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.326715 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.327251 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ckp9s"]
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.327463 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.327685 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.328704 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.329872 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ckp9s"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.347246 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.347710 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.348684 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.366700 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.366985 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.369981 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.370294 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.370357 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.370582 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.370799 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 18 09:06:27
crc kubenswrapper[4778]: I0318 09:06:27.371024 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.371167 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.371525 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.372741 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.372994 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.373047 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nkbq4"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.373171 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.373297 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.373641 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.374296 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.374460 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.374815 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.375377 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.377575 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.377633 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.377756 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.377901 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.377597 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.382133 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq"] Mar 18 09:06:27 crc 
kubenswrapper[4778]: I0318 09:06:27.382866 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.405679 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.405905 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.406031 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.406213 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.406314 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.408629 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.408903 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.408990 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.409372 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.410281 4778 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.410528 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.410887 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.410992 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.411069 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.412022 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.414431 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mlr7l"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.414974 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hvvlz"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.415297 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.415410 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.415680 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mlr7l" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.415954 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.417168 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-x5dpv"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.418538 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.420884 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.422630 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.423083 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.423153 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.423229 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.423294 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.423372 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 
09:06:27.423712 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.424086 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.424276 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.438556 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-nnfvg"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.439548 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.439969 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.440403 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.440579 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.440853 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.444756 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.445300 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.447757 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.471661 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.472064 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.472938 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.473603 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-w2lf2"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.473914 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn"] Mar 18 
09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.474937 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.475006 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.475101 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.475520 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.476084 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.476394 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.476616 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.477688 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.477894 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478176 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-smtz9"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478379 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0167d9e-5565-4154-80bb-3856d9b5985f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v92lm\" (UID: \"d0167d9e-5565-4154-80bb-3856d9b5985f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478414 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-image-import-ca\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478437 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-config\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478454 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-console-config\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478468 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7qw9\" (UniqueName: \"kubernetes.io/projected/5f875d21-ddf2-4d41-8be3-819c8836824a-kube-api-access-z7qw9\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478485 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cdf835c-58e4-4297-a247-690f407af22d-serving-cert\") pod \"openshift-config-operator-7777fb866f-8f6hb\" (UID: \"6cdf835c-58e4-4297-a247-690f407af22d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478510 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-registry-tls\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478524 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-etcd-serving-ca\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " 
pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478537 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-trusted-ca-bundle\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478563 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6e93d5ac-22fb-4d53-86c4-3262993f2116-node-pullsecrets\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478578 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35cf99cc-0bae-4b8d-b861-103e3174f081-serving-cert\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478594 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478618 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zpgn\" (UniqueName: 
\"kubernetes.io/projected/a2907797-7fb3-44c0-81cf-783512fd1bf6-kube-api-access-4zpgn\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478636 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6e93d5ac-22fb-4d53-86c4-3262993f2116-encryption-config\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478653 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35cf99cc-0bae-4b8d-b861-103e3174f081-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478671 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-audit\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478684 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5flzf\" (UniqueName: \"kubernetes.io/projected/6e93d5ac-22fb-4d53-86c4-3262993f2116-kube-api-access-5flzf\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478700 
4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5h99\" (UniqueName: \"kubernetes.io/projected/6cdf835c-58e4-4297-a247-690f407af22d-kube-api-access-n5h99\") pod \"openshift-config-operator-7777fb866f-8f6hb\" (UID: \"6cdf835c-58e4-4297-a247-690f407af22d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478716 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-client-ca\") pod \"route-controller-manager-6576b87f9c-dsqlz\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478731 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6e93d5ac-22fb-4d53-86c4-3262993f2116-audit-dir\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478746 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/35cf99cc-0bae-4b8d-b861-103e3174f081-audit-dir\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478762 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/35cf99cc-0bae-4b8d-b861-103e3174f081-audit-policies\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: 
\"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478779 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2907797-7fb3-44c0-81cf-783512fd1bf6-config\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478801 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478819 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/35cf99cc-0bae-4b8d-b861-103e3174f081-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478823 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478836 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-serving-cert\") pod \"route-controller-manager-6576b87f9c-dsqlz\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478851 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llblr\" (UniqueName: \"kubernetes.io/projected/918ba01d-c786-4f9a-ae58-5bcc23684c16-kube-api-access-llblr\") pod \"console-operator-58897d9998-q7qs8\" (UID: \"918ba01d-c786-4f9a-ae58-5bcc23684c16\") " pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478867 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f875d21-ddf2-4d41-8be3-819c8836824a-console-serving-cert\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478883 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e93d5ac-22fb-4d53-86c4-3262993f2116-serving-cert\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478901 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-service-ca\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478917 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2907797-7fb3-44c0-81cf-783512fd1bf6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478941 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc9zp\" (UniqueName: \"kubernetes.io/projected/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-kube-api-access-hc9zp\") pod \"route-controller-manager-6576b87f9c-dsqlz\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478957 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kd5b\" (UniqueName: \"kubernetes.io/projected/034ec244-f99c-4c50-a55a-9b33b8b376c3-kube-api-access-9kd5b\") pod \"openshift-apiserver-operator-796bbdcf4f-xwmwl\" (UID: \"034ec244-f99c-4c50-a55a-9b33b8b376c3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478979 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p7td\" (UniqueName: \"kubernetes.io/projected/35cf99cc-0bae-4b8d-b861-103e3174f081-kube-api-access-5p7td\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: 
\"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478995 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-registry-certificates\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479010 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2907797-7fb3-44c0-81cf-783512fd1bf6-serving-cert\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479027 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/918ba01d-c786-4f9a-ae58-5bcc23684c16-config\") pod \"console-operator-58897d9998-q7qs8\" (UID: \"918ba01d-c786-4f9a-ae58-5bcc23684c16\") " pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479040 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0167d9e-5565-4154-80bb-3856d9b5985f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v92lm\" (UID: \"d0167d9e-5565-4154-80bb-3856d9b5985f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479057 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479072 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-config\") pod \"route-controller-manager-6576b87f9c-dsqlz\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479087 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f875d21-ddf2-4d41-8be3-819c8836824a-console-oauth-config\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479114 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0167d9e-5565-4154-80bb-3856d9b5985f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v92lm\" (UID: \"d0167d9e-5565-4154-80bb-3856d9b5985f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479129 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-trusted-ca\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: 
\"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479144 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034ec244-f99c-4c50-a55a-9b33b8b376c3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xwmwl\" (UID: \"034ec244-f99c-4c50-a55a-9b33b8b376c3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479158 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/35cf99cc-0bae-4b8d-b861-103e3174f081-etcd-client\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479176 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2907797-7fb3-44c0-81cf-783512fd1bf6-service-ca-bundle\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479211 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6e93d5ac-22fb-4d53-86c4-3262993f2116-etcd-client\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479228 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/6cdf835c-58e4-4297-a247-690f407af22d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8f6hb\" (UID: \"6cdf835c-58e4-4297-a247-690f407af22d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479243 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/35cf99cc-0bae-4b8d-b861-103e3174f081-encryption-config\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479261 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-bound-sa-token\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479277 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsq4q\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-kube-api-access-fsq4q\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479293 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " 
pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479308 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/918ba01d-c786-4f9a-ae58-5bcc23684c16-serving-cert\") pod \"console-operator-58897d9998-q7qs8\" (UID: \"918ba01d-c786-4f9a-ae58-5bcc23684c16\") " pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479325 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/918ba01d-c786-4f9a-ae58-5bcc23684c16-trusted-ca\") pod \"console-operator-58897d9998-q7qs8\" (UID: \"918ba01d-c786-4f9a-ae58-5bcc23684c16\") " pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479343 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/034ec244-f99c-4c50-a55a-9b33b8b376c3-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xwmwl\" (UID: \"034ec244-f99c-4c50-a55a-9b33b8b376c3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479361 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-oauth-serving-cert\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.478747 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4"] Mar 18 09:06:27 crc kubenswrapper[4778]: E0318 
09:06:27.479729 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:27.979717959 +0000 UTC m=+254.554462799 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.479967 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.480304 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.480412 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.480690 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.480769 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.481100 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.481108 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.481430 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jbw52"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.481789 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.481918 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.482432 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.482569 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.482611 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.482692 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.482774 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.482911 4778 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.483111 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.483226 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.483296 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.483451 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.484237 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.484349 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.484476 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.484636 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.484814 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.484828 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbw52" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.484930 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.485327 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gkpf4"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.485659 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.486000 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.486589 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.486931 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2hr48"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.487260 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-tnw27"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.487523 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.487580 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-tnw27" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.487732 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.487763 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.488296 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.488420 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-gkpf4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.490327 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.491538 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.491267 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.492274 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-57msj"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.492665 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.493039 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.493226 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.493679 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.496355 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563744-btdt7"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.497005 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6frtc"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.497932 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.498119 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563744-btdt7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.499310 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.499584 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.500389 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.503061 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.503328 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-pgsqh"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.509514 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563746-b66f7"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.511988 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.521532 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.521907 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563746-b66f7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.523972 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.523996 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.524007 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bhmq7"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.524056 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.524071 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.526243 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.528105 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.528693 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wgbcp"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.531544 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5gdpq"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.531800 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wgbcp" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.532990 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.533061 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.533704 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hvvlz"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.535389 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jbw52"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.537081 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.538307 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.539533 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.540945 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.542318 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.542653 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.544037 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gkpf4"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.545470 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-nkbq4"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.547061 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.548671 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.550148 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ckp9s"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.551496 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-qqzxx"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.552382 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qqzxx" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.553190 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jmmm2"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.554045 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jmmm2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.554899 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-tnw27"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.556775 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-x5dpv"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.558037 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-q7qs8"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.559271 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563744-btdt7"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.560226 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2hr48"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.561247 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.562236 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jmmm2"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.562669 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.563418 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mlr7l"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.564469 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-smtz9"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.565870 4778 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.567299 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.568641 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-w2lf2"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.569736 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.570783 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.571888 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-57msj"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.572930 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6frtc"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.574038 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5gdpq"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.575173 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.576041 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.578181 4778 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563746-b66f7"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.580266 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.580627 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc9zp\" (UniqueName: \"kubernetes.io/projected/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-kube-api-access-hc9zp\") pod \"route-controller-manager-6576b87f9c-dsqlz\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.580678 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kd5b\" (UniqueName: \"kubernetes.io/projected/034ec244-f99c-4c50-a55a-9b33b8b376c3-kube-api-access-9kd5b\") pod \"openshift-apiserver-operator-796bbdcf4f-xwmwl\" (UID: \"034ec244-f99c-4c50-a55a-9b33b8b376c3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.580710 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p7td\" (UniqueName: \"kubernetes.io/projected/35cf99cc-0bae-4b8d-b861-103e3174f081-kube-api-access-5p7td\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: E0318 09:06:27.580736 4778 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.080706124 +0000 UTC m=+254.655450964 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.580805 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-registry-certificates\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.581022 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2907797-7fb3-44c0-81cf-783512fd1bf6-serving-cert\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.581233 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/918ba01d-c786-4f9a-ae58-5bcc23684c16-config\") pod \"console-operator-58897d9998-q7qs8\" (UID: \"918ba01d-c786-4f9a-ae58-5bcc23684c16\") " pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 
09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.581273 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0167d9e-5565-4154-80bb-3856d9b5985f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v92lm\" (UID: \"d0167d9e-5565-4154-80bb-3856d9b5985f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.582090 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-registry-certificates\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.582505 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.582571 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wgbcp"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.582570 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/918ba01d-c786-4f9a-ae58-5bcc23684c16-config\") pod \"console-operator-58897d9998-q7qs8\" (UID: \"918ba01d-c786-4f9a-ae58-5bcc23684c16\") " pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.582672 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-config\") pod \"route-controller-manager-6576b87f9c-dsqlz\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.583010 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.583173 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f875d21-ddf2-4d41-8be3-819c8836824a-console-oauth-config\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.584039 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-config\") pod \"route-controller-manager-6576b87f9c-dsqlz\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.584541 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.584770 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.584954 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0167d9e-5565-4154-80bb-3856d9b5985f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v92lm\" (UID: \"d0167d9e-5565-4154-80bb-3856d9b5985f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.585104 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/35cf99cc-0bae-4b8d-b861-103e3174f081-etcd-client\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.585233 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-trusted-ca\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.585352 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034ec244-f99c-4c50-a55a-9b33b8b376c3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xwmwl\" (UID: \"034ec244-f99c-4c50-a55a-9b33b8b376c3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.586979 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2907797-7fb3-44c0-81cf-783512fd1bf6-service-ca-bundle\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 
09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.587502 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6e93d5ac-22fb-4d53-86c4-3262993f2116-etcd-client\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.587573 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/6cdf835c-58e4-4297-a247-690f407af22d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8f6hb\" (UID: \"6cdf835c-58e4-4297-a247-690f407af22d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.587611 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/35cf99cc-0bae-4b8d-b861-103e3174f081-encryption-config\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.588575 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/6cdf835c-58e4-4297-a247-690f407af22d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8f6hb\" (UID: \"6cdf835c-58e4-4297-a247-690f407af22d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.588662 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/918ba01d-c786-4f9a-ae58-5bcc23684c16-serving-cert\") pod \"console-operator-58897d9998-q7qs8\" (UID: 
\"918ba01d-c786-4f9a-ae58-5bcc23684c16\") " pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.588728 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/918ba01d-c786-4f9a-ae58-5bcc23684c16-trusted-ca\") pod \"console-operator-58897d9998-q7qs8\" (UID: \"918ba01d-c786-4f9a-ae58-5bcc23684c16\") " pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.588786 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-bound-sa-token\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.588831 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsq4q\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-kube-api-access-fsq4q\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.589175 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.589258 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/034ec244-f99c-4c50-a55a-9b33b8b376c3-config\") pod 
\"openshift-apiserver-operator-796bbdcf4f-xwmwl\" (UID: \"034ec244-f99c-4c50-a55a-9b33b8b376c3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.589312 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-oauth-serving-cert\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.589377 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0167d9e-5565-4154-80bb-3856d9b5985f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v92lm\" (UID: \"d0167d9e-5565-4154-80bb-3856d9b5985f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.589631 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4"] Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.590948 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.591634 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f875d21-ddf2-4d41-8be3-819c8836824a-console-oauth-config\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 
09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.592086 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/35cf99cc-0bae-4b8d-b861-103e3174f081-etcd-client\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.591965 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/918ba01d-c786-4f9a-ae58-5bcc23684c16-trusted-ca\") pod \"console-operator-58897d9998-q7qs8\" (UID: \"918ba01d-c786-4f9a-ae58-5bcc23684c16\") " pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.592389 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2907797-7fb3-44c0-81cf-783512fd1bf6-serving-cert\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.592469 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-image-import-ca\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.592725 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-trusted-ca\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 
09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.592864 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cdf835c-58e4-4297-a247-690f407af22d-serving-cert\") pod \"openshift-config-operator-7777fb866f-8f6hb\" (UID: \"6cdf835c-58e4-4297-a247-690f407af22d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.592916 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-config\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.592955 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-console-config\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.593131 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0167d9e-5565-4154-80bb-3856d9b5985f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v92lm\" (UID: \"d0167d9e-5565-4154-80bb-3856d9b5985f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.593268 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/034ec244-f99c-4c50-a55a-9b33b8b376c3-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xwmwl\" (UID: \"034ec244-f99c-4c50-a55a-9b33b8b376c3\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.593278 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7qw9\" (UniqueName: \"kubernetes.io/projected/5f875d21-ddf2-4d41-8be3-819c8836824a-kube-api-access-z7qw9\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.593321 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-trusted-ca-bundle\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.593609 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-image-import-ca\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.593902 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/918ba01d-c786-4f9a-ae58-5bcc23684c16-serving-cert\") pod \"console-operator-58897d9998-q7qs8\" (UID: \"918ba01d-c786-4f9a-ae58-5bcc23684c16\") " pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.593970 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0167d9e-5565-4154-80bb-3856d9b5985f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v92lm\" (UID: 
\"d0167d9e-5565-4154-80bb-3856d9b5985f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.594301 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-console-config\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.594364 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-registry-tls\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.594393 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-etcd-serving-ca\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.594439 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2907797-7fb3-44c0-81cf-783512fd1bf6-service-ca-bundle\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.594463 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/6e93d5ac-22fb-4d53-86c4-3262993f2116-node-pullsecrets\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.594492 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35cf99cc-0bae-4b8d-b861-103e3174f081-serving-cert\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.594790 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-config\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.595132 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-etcd-serving-ca\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.595270 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6e93d5ac-22fb-4d53-86c4-3262993f2116-etcd-client\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.595747 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-oauth-serving-cert\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.596041 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034ec244-f99c-4c50-a55a-9b33b8b376c3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xwmwl\" (UID: \"034ec244-f99c-4c50-a55a-9b33b8b376c3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.596132 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-trusted-ca-bundle\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.596262 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6e93d5ac-22fb-4d53-86c4-3262993f2116-node-pullsecrets\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.596390 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.596558 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4zpgn\" (UniqueName: \"kubernetes.io/projected/a2907797-7fb3-44c0-81cf-783512fd1bf6-kube-api-access-4zpgn\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.596644 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6e93d5ac-22fb-4d53-86c4-3262993f2116-encryption-config\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.596699 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35cf99cc-0bae-4b8d-b861-103e3174f081-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.596742 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5h99\" (UniqueName: \"kubernetes.io/projected/6cdf835c-58e4-4297-a247-690f407af22d-kube-api-access-n5h99\") pod \"openshift-config-operator-7777fb866f-8f6hb\" (UID: \"6cdf835c-58e4-4297-a247-690f407af22d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.596791 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-audit\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.596907 
4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5flzf\" (UniqueName: \"kubernetes.io/projected/6e93d5ac-22fb-4d53-86c4-3262993f2116-kube-api-access-5flzf\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.596963 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-client-ca\") pod \"route-controller-manager-6576b87f9c-dsqlz\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597054 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6e93d5ac-22fb-4d53-86c4-3262993f2116-audit-dir\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597098 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/35cf99cc-0bae-4b8d-b861-103e3174f081-audit-dir\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597172 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/35cf99cc-0bae-4b8d-b861-103e3174f081-audit-policies\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 
09:06:27.597268 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2907797-7fb3-44c0-81cf-783512fd1bf6-config\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597347 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597400 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/35cf99cc-0bae-4b8d-b861-103e3174f081-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597435 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-serving-cert\") pod \"route-controller-manager-6576b87f9c-dsqlz\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597490 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llblr\" (UniqueName: \"kubernetes.io/projected/918ba01d-c786-4f9a-ae58-5bcc23684c16-kube-api-access-llblr\") pod \"console-operator-58897d9998-q7qs8\" (UID: 
\"918ba01d-c786-4f9a-ae58-5bcc23684c16\") " pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597646 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f875d21-ddf2-4d41-8be3-819c8836824a-console-serving-cert\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597779 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e93d5ac-22fb-4d53-86c4-3262993f2116-serving-cert\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597830 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/35cf99cc-0bae-4b8d-b861-103e3174f081-encryption-config\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597852 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-service-ca\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597917 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35cf99cc-0bae-4b8d-b861-103e3174f081-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: 
\"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.597936 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2907797-7fb3-44c0-81cf-783512fd1bf6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.598146 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-client-ca\") pod \"route-controller-manager-6576b87f9c-dsqlz\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.598772 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-registry-tls\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: E0318 09:06:27.599082 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.098971598 +0000 UTC m=+254.673716458 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.599095 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/35cf99cc-0bae-4b8d-b861-103e3174f081-audit-dir\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.599450 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cdf835c-58e4-4297-a247-690f407af22d-serving-cert\") pod \"openshift-config-operator-7777fb866f-8f6hb\" (UID: \"6cdf835c-58e4-4297-a247-690f407af22d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.599476 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6e93d5ac-22fb-4d53-86c4-3262993f2116-audit\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.599520 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6e93d5ac-22fb-4d53-86c4-3262993f2116-audit-dir\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " 
pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.599802 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/35cf99cc-0bae-4b8d-b861-103e3174f081-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.599986 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-service-ca\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.600184 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2907797-7fb3-44c0-81cf-783512fd1bf6-config\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.600748 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/35cf99cc-0bae-4b8d-b861-103e3174f081-audit-policies\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.600961 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2907797-7fb3-44c0-81cf-783512fd1bf6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.601961 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f875d21-ddf2-4d41-8be3-819c8836824a-console-serving-cert\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.602087 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e93d5ac-22fb-4d53-86c4-3262993f2116-serving-cert\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.602820 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.604290 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6e93d5ac-22fb-4d53-86c4-3262993f2116-encryption-config\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.604494 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.604967 4778 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35cf99cc-0bae-4b8d-b861-103e3174f081-serving-cert\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.605656 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-serving-cert\") pod \"route-controller-manager-6576b87f9c-dsqlz\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.623734 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.653892 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.663350 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.682810 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.698442 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:27 crc kubenswrapper[4778]: E0318 09:06:27.698611 4778 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.198576475 +0000 UTC m=+254.773321315 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.698671 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55w8z\" (UniqueName: \"kubernetes.io/projected/c3be356e-94af-47db-a182-dd8a57024619-kube-api-access-55w8z\") pod \"auto-csr-approver-29563746-b66f7\" (UID: \"c3be356e-94af-47db-a182-dd8a57024619\") " pod="openshift-infra/auto-csr-approver-29563746-b66f7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.698718 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swmtc\" (UniqueName: \"kubernetes.io/projected/6d6ab3a6-da16-4fc8-9235-2c223661de30-kube-api-access-swmtc\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.698746 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c851677f-703c-404c-801c-064cc6bf3979-proxy-tls\") pod \"machine-config-controller-84d6567774-kfkmc\" (UID: \"c851677f-703c-404c-801c-064cc6bf3979\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.698773 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.698837 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-config\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.698868 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/46167450-7100-4ac9-a9dd-e678eb3d8677-etcd-ca\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.698909 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef74c17c-eb2a-4bef-b948-6b06efd76719-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ktkfs\" (UID: \"ef74c17c-eb2a-4bef-b948-6b06efd76719\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.698954 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46167450-7100-4ac9-a9dd-e678eb3d8677-serving-cert\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.698975 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df4jz\" (UniqueName: \"kubernetes.io/projected/496a64ab-b670-4201-9238-d60415ccba17-kube-api-access-df4jz\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.698993 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97ee6937-a1a5-42ea-a460-29d54478e633-secret-volume\") pod \"collect-profiles-29563740-wbvxl\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699010 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d4d183f7-2762-458d-83f1-a8894c00bb82-srv-cert\") pod \"olm-operator-6b444d44fb-8xcgz\" (UID: \"d4d183f7-2762-458d-83f1-a8894c00bb82\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699031 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjwps\" (UniqueName: \"kubernetes.io/projected/54961f10-93b0-433f-8a7d-b30d69178e9a-kube-api-access-zjwps\") pod \"auto-csr-approver-29563744-btdt7\" (UID: 
\"54961f10-93b0-433f-8a7d-b30d69178e9a\") " pod="openshift-infra/auto-csr-approver-29563744-btdt7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699050 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-client-ca\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699067 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0f3c490-ee49-4a88-893e-132592dd6d59-config\") pod \"service-ca-operator-777779d784-57msj\" (UID: \"c0f3c490-ee49-4a88-893e-132592dd6d59\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699096 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztqc4\" (UniqueName: \"kubernetes.io/projected/f06790e0-cf8c-48f0-8d48-893663fdbd1c-kube-api-access-ztqc4\") pod \"machine-api-operator-5694c8668f-smtz9\" (UID: \"f06790e0-cf8c-48f0-8d48-893663fdbd1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699112 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7467j\" (UniqueName: \"kubernetes.io/projected/0a99ad6c-7819-4b33-8846-26e6ede5ce22-kube-api-access-7467j\") pod \"service-ca-9c57cc56f-6frtc\" (UID: \"0a99ad6c-7819-4b33-8846-26e6ede5ce22\") " pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699138 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699158 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699176 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5gzz\" (UniqueName: \"kubernetes.io/projected/46167450-7100-4ac9-a9dd-e678eb3d8677-kube-api-access-b5gzz\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699207 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46167450-7100-4ac9-a9dd-e678eb3d8677-config\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699224 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/80d2d01c-2b8d-49ff-adad-6b49568293a0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gkpf4\" (UID: \"80d2d01c-2b8d-49ff-adad-6b49568293a0\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-gkpf4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699243 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-default-certificate\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699259 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6d6ab3a6-da16-4fc8-9235-2c223661de30-webhook-cert\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699276 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2hr48\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699293 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c0fd619-d1c2-45e7-a7cf-e784b082428f-trusted-ca\") pod \"ingress-operator-5b745b69d9-t27k4\" (UID: \"6c0fd619-d1c2-45e7-a7cf-e784b082428f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699308 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496a64ab-b670-4201-9238-d60415ccba17-serving-cert\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699329 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699348 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699392 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0b244100-6e88-4ba2-b656-83b6e31d23c8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-nd44t\" (UID: \"0b244100-6e88-4ba2-b656-83b6e31d23c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699451 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxpv5\" (UniqueName: \"kubernetes.io/projected/db81860d-bcb7-4a56-a935-544dbc4be29b-kube-api-access-jxpv5\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699486 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-service-ca-bundle\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699504 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0f3c490-ee49-4a88-893e-132592dd6d59-serving-cert\") pod \"service-ca-operator-777779d784-57msj\" (UID: \"c0f3c490-ee49-4a88-893e-132592dd6d59\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699523 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699546 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6c0fd619-d1c2-45e7-a7cf-e784b082428f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-t27k4\" (UID: \"6c0fd619-d1c2-45e7-a7cf-e784b082428f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699583 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" 
(UniqueName: \"kubernetes.io/configmap/f06790e0-cf8c-48f0-8d48-893663fdbd1c-images\") pod \"machine-api-operator-5694c8668f-smtz9\" (UID: \"f06790e0-cf8c-48f0-8d48-893663fdbd1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699606 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-audit-policies\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699635 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r5f7\" (UniqueName: \"kubernetes.io/projected/80d2d01c-2b8d-49ff-adad-6b49568293a0-kube-api-access-8r5f7\") pod \"multus-admission-controller-857f4d67dd-gkpf4\" (UID: \"80d2d01c-2b8d-49ff-adad-6b49568293a0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gkpf4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699659 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699682 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/db81860d-bcb7-4a56-a935-544dbc4be29b-audit-dir\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699700 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qtbr\" (UniqueName: \"kubernetes.io/projected/ba84f396-0169-4d5e-a126-60ac9d6d49f8-kube-api-access-7qtbr\") pod \"control-plane-machine-set-operator-78cbb6b69f-qtggn\" (UID: \"ba84f396-0169-4d5e-a126-60ac9d6d49f8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699716 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d4d183f7-2762-458d-83f1-a8894c00bb82-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8xcgz\" (UID: \"d4d183f7-2762-458d-83f1-a8894c00bb82\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699741 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699760 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8ed2be9e-a493-4ce8-aee1-83f3ae258fba-machine-approver-tls\") pod \"machine-approver-56656f9798-lbkl2\" (UID: \"8ed2be9e-a493-4ce8-aee1-83f3ae258fba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.699815 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8ed2be9e-a493-4ce8-aee1-83f3ae258fba-auth-proxy-config\") pod \"machine-approver-56656f9798-lbkl2\" (UID: \"8ed2be9e-a493-4ce8-aee1-83f3ae258fba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700111 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700186 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f06790e0-cf8c-48f0-8d48-893663fdbd1c-config\") pod \"machine-api-operator-5694c8668f-smtz9\" (UID: \"f06790e0-cf8c-48f0-8d48-893663fdbd1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700241 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjfwr\" (UniqueName: \"kubernetes.io/projected/8ed2be9e-a493-4ce8-aee1-83f3ae258fba-kube-api-access-xjfwr\") pod \"machine-approver-56656f9798-lbkl2\" (UID: \"8ed2be9e-a493-4ce8-aee1-83f3ae258fba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700353 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0b244100-6e88-4ba2-b656-83b6e31d23c8-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-nd44t\" (UID: \"0b244100-6e88-4ba2-b656-83b6e31d23c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700402 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfcc4e0d-0910-42c0-bcac-44c6aee8b74d-cert\") pod \"ingress-canary-jmmm2\" (UID: \"bfcc4e0d-0910-42c0-bcac-44c6aee8b74d\") " pod="openshift-ingress-canary/ingress-canary-jmmm2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700453 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ed2be9e-a493-4ce8-aee1-83f3ae258fba-config\") pod \"machine-approver-56656f9798-lbkl2\" (UID: \"8ed2be9e-a493-4ce8-aee1-83f3ae258fba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700484 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq5ms\" (UniqueName: \"kubernetes.io/projected/d4d183f7-2762-458d-83f1-a8894c00bb82-kube-api-access-sq5ms\") pod \"olm-operator-6b444d44fb-8xcgz\" (UID: \"d4d183f7-2762-458d-83f1-a8894c00bb82\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700548 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6d6ab3a6-da16-4fc8-9235-2c223661de30-tmpfs\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700589 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-metrics-certs\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700617 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92ctw\" (UniqueName: \"kubernetes.io/projected/bfcc4e0d-0910-42c0-bcac-44c6aee8b74d-kube-api-access-92ctw\") pod \"ingress-canary-jmmm2\" (UID: \"bfcc4e0d-0910-42c0-bcac-44c6aee8b74d\") " pod="openshift-ingress-canary/ingress-canary-jmmm2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700650 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/46167450-7100-4ac9-a9dd-e678eb3d8677-etcd-client\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700704 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700766 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/46167450-7100-4ac9-a9dd-e678eb3d8677-etcd-service-ca\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700799 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0a99ad6c-7819-4b33-8846-26e6ede5ce22-signing-key\") pod \"service-ca-9c57cc56f-6frtc\" (UID: \"0a99ad6c-7819-4b33-8846-26e6ede5ce22\") " pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700829 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fgmb\" (UniqueName: \"kubernetes.io/projected/ef74c17c-eb2a-4bef-b948-6b06efd76719-kube-api-access-2fgmb\") pod \"openshift-controller-manager-operator-756b6f6bc6-ktkfs\" (UID: \"ef74c17c-eb2a-4bef-b948-6b06efd76719\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700871 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njhmj\" (UniqueName: \"kubernetes.io/projected/6c0fd619-d1c2-45e7-a7cf-e784b082428f-kube-api-access-njhmj\") pod \"ingress-operator-5b745b69d9-t27k4\" (UID: \"6c0fd619-d1c2-45e7-a7cf-e784b082428f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700901 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcvjk\" (UniqueName: \"kubernetes.io/projected/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-kube-api-access-xcvjk\") pod \"marketplace-operator-79b997595-2hr48\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700929 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5n2d\" (UniqueName: \"kubernetes.io/projected/08b964cf-bfc5-4b90-83a3-0b358c3ffbc9-kube-api-access-p5n2d\") pod \"downloads-7954f5f757-tnw27\" (UID: \"08b964cf-bfc5-4b90-83a3-0b358c3ffbc9\") " pod="openshift-console/downloads-7954f5f757-tnw27" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.700978 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c0fd619-d1c2-45e7-a7cf-e784b082428f-metrics-tls\") pod \"ingress-operator-5b745b69d9-t27k4\" (UID: \"6c0fd619-d1c2-45e7-a7cf-e784b082428f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701014 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c851677f-703c-404c-801c-064cc6bf3979-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kfkmc\" (UID: \"c851677f-703c-404c-801c-064cc6bf3979\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701037 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2hr48\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701057 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0a99ad6c-7819-4b33-8846-26e6ede5ce22-signing-cabundle\") pod 
\"service-ca-9c57cc56f-6frtc\" (UID: \"0a99ad6c-7819-4b33-8846-26e6ede5ce22\") " pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701079 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701102 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcgvq\" (UniqueName: \"kubernetes.io/projected/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-kube-api-access-jcgvq\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701143 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701171 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba84f396-0169-4d5e-a126-60ac9d6d49f8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qtggn\" (UID: \"ba84f396-0169-4d5e-a126-60ac9d6d49f8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn" Mar 18 09:06:27 
crc kubenswrapper[4778]: I0318 09:06:27.701213 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-stats-auth\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701240 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701260 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9n22\" (UniqueName: \"kubernetes.io/projected/97ee6937-a1a5-42ea-a460-29d54478e633-kube-api-access-h9n22\") pod \"collect-profiles-29563740-wbvxl\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701280 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f06790e0-cf8c-48f0-8d48-893663fdbd1c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-smtz9\" (UID: \"f06790e0-cf8c-48f0-8d48-893663fdbd1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701300 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjbvm\" (UniqueName: 
\"kubernetes.io/projected/0b244100-6e88-4ba2-b656-83b6e31d23c8-kube-api-access-vjbvm\") pod \"cluster-image-registry-operator-dc59b4c8b-nd44t\" (UID: \"0b244100-6e88-4ba2-b656-83b6e31d23c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701329 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f5nq\" (UniqueName: \"kubernetes.io/projected/c0f3c490-ee49-4a88-893e-132592dd6d59-kube-api-access-7f5nq\") pod \"service-ca-operator-777779d784-57msj\" (UID: \"c0f3c490-ee49-4a88-893e-132592dd6d59\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701352 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef74c17c-eb2a-4bef-b948-6b06efd76719-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ktkfs\" (UID: \"ef74c17c-eb2a-4bef-b948-6b06efd76719\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701372 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97ee6937-a1a5-42ea-a460-29d54478e633-config-volume\") pod \"collect-profiles-29563740-wbvxl\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701394 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt67h\" (UniqueName: \"kubernetes.io/projected/e24d15f2-56e5-4fcc-91ab-370d7b4fb41e-kube-api-access-dt67h\") pod \"migrator-59844c95c7-jbw52\" (UID: 
\"e24d15f2-56e5-4fcc-91ab-370d7b4fb41e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbw52" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701411 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6d6ab3a6-da16-4fc8-9235-2c223661de30-apiservice-cert\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701429 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b244100-6e88-4ba2-b656-83b6e31d23c8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-nd44t\" (UID: \"0b244100-6e88-4ba2-b656-83b6e31d23c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.701450 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4699k\" (UniqueName: \"kubernetes.io/projected/c851677f-703c-404c-801c-064cc6bf3979-kube-api-access-4699k\") pod \"machine-config-controller-84d6567774-kfkmc\" (UID: \"c851677f-703c-404c-801c-064cc6bf3979\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" Mar 18 09:06:27 crc kubenswrapper[4778]: E0318 09:06:27.701957 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.201941736 +0000 UTC m=+254.776686566 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.703878 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.723112 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.743114 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.763582 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.783657 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.801993 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:27 crc kubenswrapper[4778]: E0318 09:06:27.802266 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.302186261 +0000 UTC m=+254.876931101 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802339 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztqc4\" (UniqueName: \"kubernetes.io/projected/f06790e0-cf8c-48f0-8d48-893663fdbd1c-kube-api-access-ztqc4\") pod \"machine-api-operator-5694c8668f-smtz9\" (UID: \"f06790e0-cf8c-48f0-8d48-893663fdbd1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802408 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7467j\" (UniqueName: \"kubernetes.io/projected/0a99ad6c-7819-4b33-8846-26e6ede5ce22-kube-api-access-7467j\") pod \"service-ca-9c57cc56f-6frtc\" (UID: \"0a99ad6c-7819-4b33-8846-26e6ede5ce22\") " pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802476 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802518 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5gzz\" (UniqueName: \"kubernetes.io/projected/46167450-7100-4ac9-a9dd-e678eb3d8677-kube-api-access-b5gzz\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802548 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802583 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46167450-7100-4ac9-a9dd-e678eb3d8677-config\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802617 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/80d2d01c-2b8d-49ff-adad-6b49568293a0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gkpf4\" (UID: \"80d2d01c-2b8d-49ff-adad-6b49568293a0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gkpf4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802654 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-default-certificate\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" 
Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802685 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6d6ab3a6-da16-4fc8-9235-2c223661de30-webhook-cert\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802717 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2hr48\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802752 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c0fd619-d1c2-45e7-a7cf-e784b082428f-trusted-ca\") pod \"ingress-operator-5b745b69d9-t27k4\" (UID: \"6c0fd619-d1c2-45e7-a7cf-e784b082428f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802783 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802815 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496a64ab-b670-4201-9238-d60415ccba17-serving-cert\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: 
\"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802848 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802878 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0b244100-6e88-4ba2-b656-83b6e31d23c8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-nd44t\" (UID: \"0b244100-6e88-4ba2-b656-83b6e31d23c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802921 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0f3c490-ee49-4a88-893e-132592dd6d59-serving-cert\") pod \"service-ca-operator-777779d784-57msj\" (UID: \"c0f3c490-ee49-4a88-893e-132592dd6d59\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802950 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxpv5\" (UniqueName: \"kubernetes.io/projected/db81860d-bcb7-4a56-a935-544dbc4be29b-kube-api-access-jxpv5\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.802979 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-service-ca-bundle\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803011 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803037 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6c0fd619-d1c2-45e7-a7cf-e784b082428f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-t27k4\" (UID: \"6c0fd619-d1c2-45e7-a7cf-e784b082428f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803074 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f06790e0-cf8c-48f0-8d48-893663fdbd1c-images\") pod \"machine-api-operator-5694c8668f-smtz9\" (UID: \"f06790e0-cf8c-48f0-8d48-893663fdbd1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803101 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-audit-policies\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803181 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r5f7\" (UniqueName: \"kubernetes.io/projected/80d2d01c-2b8d-49ff-adad-6b49568293a0-kube-api-access-8r5f7\") pod \"multus-admission-controller-857f4d67dd-gkpf4\" (UID: \"80d2d01c-2b8d-49ff-adad-6b49568293a0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gkpf4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803258 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803271 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803284 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803325 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/db81860d-bcb7-4a56-a935-544dbc4be29b-audit-dir\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803363 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7qtbr\" (UniqueName: \"kubernetes.io/projected/ba84f396-0169-4d5e-a126-60ac9d6d49f8-kube-api-access-7qtbr\") pod \"control-plane-machine-set-operator-78cbb6b69f-qtggn\" (UID: \"ba84f396-0169-4d5e-a126-60ac9d6d49f8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803395 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d4d183f7-2762-458d-83f1-a8894c00bb82-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8xcgz\" (UID: \"d4d183f7-2762-458d-83f1-a8894c00bb82\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803436 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803465 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8ed2be9e-a493-4ce8-aee1-83f3ae258fba-machine-approver-tls\") pod \"machine-approver-56656f9798-lbkl2\" (UID: \"8ed2be9e-a493-4ce8-aee1-83f3ae258fba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803494 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") 
" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803525 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8ed2be9e-a493-4ce8-aee1-83f3ae258fba-auth-proxy-config\") pod \"machine-approver-56656f9798-lbkl2\" (UID: \"8ed2be9e-a493-4ce8-aee1-83f3ae258fba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803564 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f06790e0-cf8c-48f0-8d48-893663fdbd1c-config\") pod \"machine-api-operator-5694c8668f-smtz9\" (UID: \"f06790e0-cf8c-48f0-8d48-893663fdbd1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803588 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjfwr\" (UniqueName: \"kubernetes.io/projected/8ed2be9e-a493-4ce8-aee1-83f3ae258fba-kube-api-access-xjfwr\") pod \"machine-approver-56656f9798-lbkl2\" (UID: \"8ed2be9e-a493-4ce8-aee1-83f3ae258fba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803625 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfcc4e0d-0910-42c0-bcac-44c6aee8b74d-cert\") pod \"ingress-canary-jmmm2\" (UID: \"bfcc4e0d-0910-42c0-bcac-44c6aee8b74d\") " pod="openshift-ingress-canary/ingress-canary-jmmm2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803653 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0b244100-6e88-4ba2-b656-83b6e31d23c8-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-nd44t\" (UID: \"0b244100-6e88-4ba2-b656-83b6e31d23c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803689 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ed2be9e-a493-4ce8-aee1-83f3ae258fba-config\") pod \"machine-approver-56656f9798-lbkl2\" (UID: \"8ed2be9e-a493-4ce8-aee1-83f3ae258fba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803716 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq5ms\" (UniqueName: \"kubernetes.io/projected/d4d183f7-2762-458d-83f1-a8894c00bb82-kube-api-access-sq5ms\") pod \"olm-operator-6b444d44fb-8xcgz\" (UID: \"d4d183f7-2762-458d-83f1-a8894c00bb82\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803755 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92ctw\" (UniqueName: \"kubernetes.io/projected/bfcc4e0d-0910-42c0-bcac-44c6aee8b74d-kube-api-access-92ctw\") pod \"ingress-canary-jmmm2\" (UID: \"bfcc4e0d-0910-42c0-bcac-44c6aee8b74d\") " pod="openshift-ingress-canary/ingress-canary-jmmm2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803782 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6d6ab3a6-da16-4fc8-9235-2c223661de30-tmpfs\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803808 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-metrics-certs\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803845 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/46167450-7100-4ac9-a9dd-e678eb3d8677-etcd-client\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803869 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803957 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/46167450-7100-4ac9-a9dd-e678eb3d8677-etcd-service-ca\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.803983 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0a99ad6c-7819-4b33-8846-26e6ede5ce22-signing-key\") pod \"service-ca-9c57cc56f-6frtc\" (UID: \"0a99ad6c-7819-4b33-8846-26e6ede5ce22\") " pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804017 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804032 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fgmb\" (UniqueName: \"kubernetes.io/projected/ef74c17c-eb2a-4bef-b948-6b06efd76719-kube-api-access-2fgmb\") pod \"openshift-controller-manager-operator-756b6f6bc6-ktkfs\" (UID: \"ef74c17c-eb2a-4bef-b948-6b06efd76719\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804074 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njhmj\" (UniqueName: \"kubernetes.io/projected/6c0fd619-d1c2-45e7-a7cf-e784b082428f-kube-api-access-njhmj\") pod \"ingress-operator-5b745b69d9-t27k4\" (UID: \"6c0fd619-d1c2-45e7-a7cf-e784b082428f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804109 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5n2d\" (UniqueName: \"kubernetes.io/projected/08b964cf-bfc5-4b90-83a3-0b358c3ffbc9-kube-api-access-p5n2d\") pod \"downloads-7954f5f757-tnw27\" (UID: \"08b964cf-bfc5-4b90-83a3-0b358c3ffbc9\") " pod="openshift-console/downloads-7954f5f757-tnw27" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804134 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcvjk\" (UniqueName: \"kubernetes.io/projected/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-kube-api-access-xcvjk\") pod \"marketplace-operator-79b997595-2hr48\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804188 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c0fd619-d1c2-45e7-a7cf-e784b082428f-metrics-tls\") pod \"ingress-operator-5b745b69d9-t27k4\" (UID: \"6c0fd619-d1c2-45e7-a7cf-e784b082428f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804253 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c851677f-703c-404c-801c-064cc6bf3979-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kfkmc\" (UID: \"c851677f-703c-404c-801c-064cc6bf3979\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804283 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2hr48\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804330 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0a99ad6c-7819-4b33-8846-26e6ede5ce22-signing-cabundle\") pod \"service-ca-9c57cc56f-6frtc\" (UID: \"0a99ad6c-7819-4b33-8846-26e6ede5ce22\") " pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804362 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804391 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804418 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcgvq\" (UniqueName: \"kubernetes.io/projected/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-kube-api-access-jcgvq\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804455 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-stats-auth\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804491 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba84f396-0169-4d5e-a126-60ac9d6d49f8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qtggn\" (UID: \"ba84f396-0169-4d5e-a126-60ac9d6d49f8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn" Mar 18 09:06:27 
crc kubenswrapper[4778]: I0318 09:06:27.804530 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f06790e0-cf8c-48f0-8d48-893663fdbd1c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-smtz9\" (UID: \"f06790e0-cf8c-48f0-8d48-893663fdbd1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804553 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjbvm\" (UniqueName: \"kubernetes.io/projected/0b244100-6e88-4ba2-b656-83b6e31d23c8-kube-api-access-vjbvm\") pod \"cluster-image-registry-operator-dc59b4c8b-nd44t\" (UID: \"0b244100-6e88-4ba2-b656-83b6e31d23c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804579 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804599 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0b244100-6e88-4ba2-b656-83b6e31d23c8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-nd44t\" (UID: \"0b244100-6e88-4ba2-b656-83b6e31d23c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804604 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9n22\" (UniqueName: 
\"kubernetes.io/projected/97ee6937-a1a5-42ea-a460-29d54478e633-kube-api-access-h9n22\") pod \"collect-profiles-29563740-wbvxl\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804640 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804679 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f5nq\" (UniqueName: \"kubernetes.io/projected/c0f3c490-ee49-4a88-893e-132592dd6d59-kube-api-access-7f5nq\") pod \"service-ca-operator-777779d784-57msj\" (UID: \"c0f3c490-ee49-4a88-893e-132592dd6d59\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804710 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef74c17c-eb2a-4bef-b948-6b06efd76719-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ktkfs\" (UID: \"ef74c17c-eb2a-4bef-b948-6b06efd76719\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804739 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97ee6937-a1a5-42ea-a460-29d54478e633-config-volume\") pod \"collect-profiles-29563740-wbvxl\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" Mar 
18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804761 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt67h\" (UniqueName: \"kubernetes.io/projected/e24d15f2-56e5-4fcc-91ab-370d7b4fb41e-kube-api-access-dt67h\") pod \"migrator-59844c95c7-jbw52\" (UID: \"e24d15f2-56e5-4fcc-91ab-370d7b4fb41e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbw52" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804784 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4699k\" (UniqueName: \"kubernetes.io/projected/c851677f-703c-404c-801c-064cc6bf3979-kube-api-access-4699k\") pod \"machine-config-controller-84d6567774-kfkmc\" (UID: \"c851677f-703c-404c-801c-064cc6bf3979\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804801 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6d6ab3a6-da16-4fc8-9235-2c223661de30-apiservice-cert\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804817 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b244100-6e88-4ba2-b656-83b6e31d23c8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-nd44t\" (UID: \"0b244100-6e88-4ba2-b656-83b6e31d23c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804837 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55w8z\" (UniqueName: 
\"kubernetes.io/projected/c3be356e-94af-47db-a182-dd8a57024619-kube-api-access-55w8z\") pod \"auto-csr-approver-29563746-b66f7\" (UID: \"c3be356e-94af-47db-a182-dd8a57024619\") " pod="openshift-infra/auto-csr-approver-29563746-b66f7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804864 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swmtc\" (UniqueName: \"kubernetes.io/projected/6d6ab3a6-da16-4fc8-9235-2c223661de30-kube-api-access-swmtc\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804885 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/46167450-7100-4ac9-a9dd-e678eb3d8677-etcd-ca\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804904 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c851677f-703c-404c-801c-064cc6bf3979-proxy-tls\") pod \"machine-config-controller-84d6567774-kfkmc\" (UID: \"c851677f-703c-404c-801c-064cc6bf3979\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804921 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804939 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-config\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804960 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef74c17c-eb2a-4bef-b948-6b06efd76719-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ktkfs\" (UID: \"ef74c17c-eb2a-4bef-b948-6b06efd76719\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.804979 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46167450-7100-4ac9-a9dd-e678eb3d8677-serving-cert\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.805007 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df4jz\" (UniqueName: \"kubernetes.io/projected/496a64ab-b670-4201-9238-d60415ccba17-kube-api-access-df4jz\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.805039 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97ee6937-a1a5-42ea-a460-29d54478e633-secret-volume\") pod \"collect-profiles-29563740-wbvxl\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.805066 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjwps\" (UniqueName: \"kubernetes.io/projected/54961f10-93b0-433f-8a7d-b30d69178e9a-kube-api-access-zjwps\") pod \"auto-csr-approver-29563744-btdt7\" (UID: \"54961f10-93b0-433f-8a7d-b30d69178e9a\") " pod="openshift-infra/auto-csr-approver-29563744-btdt7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.805101 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d4d183f7-2762-458d-83f1-a8894c00bb82-srv-cert\") pod \"olm-operator-6b444d44fb-8xcgz\" (UID: \"d4d183f7-2762-458d-83f1-a8894c00bb82\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.805126 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-client-ca\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.805220 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0f3c490-ee49-4a88-893e-132592dd6d59-config\") pod \"service-ca-operator-777779d784-57msj\" (UID: \"c0f3c490-ee49-4a88-893e-132592dd6d59\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.805642 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c851677f-703c-404c-801c-064cc6bf3979-mcc-auth-proxy-config\") pod 
\"machine-config-controller-84d6567774-kfkmc\" (UID: \"c851677f-703c-404c-801c-064cc6bf3979\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.805857 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-audit-policies\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: E0318 09:06:27.806042 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.306030746 +0000 UTC m=+254.880775826 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.806049 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef74c17c-eb2a-4bef-b948-6b06efd76719-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ktkfs\" (UID: \"ef74c17c-eb2a-4bef-b948-6b06efd76719\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.806372 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-service-ca-bundle\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.806964 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/46167450-7100-4ac9-a9dd-e678eb3d8677-etcd-ca\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.807030 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.807081 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/db81860d-bcb7-4a56-a935-544dbc4be29b-audit-dir\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.807144 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.808097 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46167450-7100-4ac9-a9dd-e678eb3d8677-config\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.809239 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-client-ca\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.809779 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6d6ab3a6-da16-4fc8-9235-2c223661de30-tmpfs\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.809870 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.809913 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/46167450-7100-4ac9-a9dd-e678eb3d8677-etcd-service-ca\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: 
I0318 09:06:27.810699 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496a64ab-b670-4201-9238-d60415ccba17-serving-cert\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.811337 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.812970 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-config\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.813368 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/46167450-7100-4ac9-a9dd-e678eb3d8677-etcd-client\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.813556 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.813598 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.813782 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.815335 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.815757 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-default-certificate\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.816054 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/ba84f396-0169-4d5e-a126-60ac9d6d49f8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qtggn\" (UID: \"ba84f396-0169-4d5e-a126-60ac9d6d49f8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.816487 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.816975 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-metrics-certs\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.817334 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-stats-auth\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.818385 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c851677f-703c-404c-801c-064cc6bf3979-proxy-tls\") pod \"machine-config-controller-84d6567774-kfkmc\" (UID: \"c851677f-703c-404c-801c-064cc6bf3979\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.818862 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef74c17c-eb2a-4bef-b948-6b06efd76719-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ktkfs\" (UID: \"ef74c17c-eb2a-4bef-b948-6b06efd76719\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.823617 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.826558 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46167450-7100-4ac9-a9dd-e678eb3d8677-serving-cert\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.845037 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.865314 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.867678 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f06790e0-cf8c-48f0-8d48-893663fdbd1c-images\") pod \"machine-api-operator-5694c8668f-smtz9\" (UID: \"f06790e0-cf8c-48f0-8d48-893663fdbd1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.884896 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.902518 4778 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.906187 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:27 crc kubenswrapper[4778]: E0318 09:06:27.906360 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.406328862 +0000 UTC m=+254.981073742 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.907006 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:27 crc kubenswrapper[4778]: E0318 09:06:27.907416 4778 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.407405311 +0000 UTC m=+254.982150171 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.909909 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f06790e0-cf8c-48f0-8d48-893663fdbd1c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-smtz9\" (UID: \"f06790e0-cf8c-48f0-8d48-893663fdbd1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.922327 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.930377 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f06790e0-cf8c-48f0-8d48-893663fdbd1c-config\") pod \"machine-api-operator-5694c8668f-smtz9\" (UID: \"f06790e0-cf8c-48f0-8d48-893663fdbd1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.943666 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.963467 4778 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.984589 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 18 09:06:27 crc kubenswrapper[4778]: I0318 09:06:27.988957 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c0fd619-d1c2-45e7-a7cf-e784b082428f-metrics-tls\") pod \"ingress-operator-5b745b69d9-t27k4\" (UID: \"6c0fd619-d1c2-45e7-a7cf-e784b082428f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.004074 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.008439 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.008681 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.508616093 +0000 UTC m=+255.083360973 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.009271 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.009836 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.509797514 +0000 UTC m=+255.084542354 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.033691 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.043602 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.045902 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c0fd619-d1c2-45e7-a7cf-e784b082428f-trusted-ca\") pod \"ingress-operator-5b745b69d9-t27k4\" (UID: \"6c0fd619-d1c2-45e7-a7cf-e784b082428f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.063503 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.083151 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.103437 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.110589 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.111038 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.611002455 +0000 UTC m=+255.185747305 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.111384 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.112091 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.612050064 +0000 UTC m=+255.186795094 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.123123 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.143161 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.163903 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.183765 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.202863 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.212852 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.213183 4778 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.712992918 +0000 UTC m=+255.287737798 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.214152 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.214685 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.714665233 +0000 UTC m=+255.289410113 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.225277 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.243897 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.265591 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.276549 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8ed2be9e-a493-4ce8-aee1-83f3ae258fba-machine-approver-tls\") pod \"machine-approver-56656f9798-lbkl2\" (UID: \"8ed2be9e-a493-4ce8-aee1-83f3ae258fba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.283757 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.303429 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.316177 4778 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.316454 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.816375688 +0000 UTC m=+255.391120578 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.317129 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.317863 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.817836067 +0000 UTC m=+255.392580947 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.323175 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.343276 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.350654 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8ed2be9e-a493-4ce8-aee1-83f3ae258fba-auth-proxy-config\") pod \"machine-approver-56656f9798-lbkl2\" (UID: \"8ed2be9e-a493-4ce8-aee1-83f3ae258fba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.362900 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.370375 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ed2be9e-a493-4ce8-aee1-83f3ae258fba-config\") pod \"machine-approver-56656f9798-lbkl2\" (UID: \"8ed2be9e-a493-4ce8-aee1-83f3ae258fba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.383920 4778 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.404116 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.418865 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.419094 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.919061559 +0000 UTC m=+255.493806439 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.420481 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.421113 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:28.921086284 +0000 UTC m=+255.495831164 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.423375 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.443333 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.463813 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.482761 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.501886 4778 request.go:700] Waited for 1.014673143s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/configmaps?fieldSelector=metadata.name%3Dconfig&limit=500&resourceVersion=0 Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.503774 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.522836 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.523129 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.023083427 +0000 UTC m=+255.597828317 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.523541 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.524137 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.524231 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 09:06:29.024214677 +0000 UTC m=+255.598959517 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.543706 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.551668 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b244100-6e88-4ba2-b656-83b6e31d23c8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-nd44t\" (UID: \"0b244100-6e88-4ba2-b656-83b6e31d23c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.564515 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.583633 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.604417 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.623817 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 18 09:06:28 
crc kubenswrapper[4778]: I0318 09:06:28.624381 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.624537 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.124509354 +0000 UTC m=+255.699254204 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.624811 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.625171 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.125160221 +0000 UTC m=+255.699905071 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.632564 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/80d2d01c-2b8d-49ff-adad-6b49568293a0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gkpf4\" (UID: \"80d2d01c-2b8d-49ff-adad-6b49568293a0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gkpf4"
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.644253 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.663010 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.682908 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.690847 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d4d183f7-2762-458d-83f1-a8894c00bb82-srv-cert\") pod \"olm-operator-6b444d44fb-8xcgz\" (UID: \"d4d183f7-2762-458d-83f1-a8894c00bb82\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz"
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.703340 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.710606 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97ee6937-a1a5-42ea-a460-29d54478e633-secret-volume\") pod \"collect-profiles-29563740-wbvxl\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl"
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.710957 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d4d183f7-2762-458d-83f1-a8894c00bb82-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8xcgz\" (UID: \"d4d183f7-2762-458d-83f1-a8894c00bb82\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz"
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.723384 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.725821 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.726003 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.225964502 +0000 UTC m=+255.800709352 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.726124 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.726557 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.226538797 +0000 UTC m=+255.801283637 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.743469 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.763708 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.782848 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.803546 4778 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.803657 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0f3c490-ee49-4a88-893e-132592dd6d59-serving-cert podName:c0f3c490-ee49-4a88-893e-132592dd6d59 nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.303625228 +0000 UTC m=+255.878370078 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c0f3c490-ee49-4a88-893e-132592dd6d59-serving-cert") pod "service-ca-operator-777779d784-57msj" (UID: "c0f3c490-ee49-4a88-893e-132592dd6d59") : failed to sync secret cache: timed out waiting for the condition
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.803985 4778 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.804037 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-trusted-ca podName:f25fe9ee-95f7-4a7c-98f1-7dabbd43527a nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.304022489 +0000 UTC m=+255.878767339 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-trusted-ca") pod "marketplace-operator-79b997595-2hr48" (UID: "f25fe9ee-95f7-4a7c-98f1-7dabbd43527a") : failed to sync configmap cache: timed out waiting for the condition
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.804142 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.806105 4778 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.806167 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0a99ad6c-7819-4b33-8846-26e6ede5ce22-signing-cabundle podName:0a99ad6c-7819-4b33-8846-26e6ede5ce22 nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.306153526 +0000 UTC m=+255.880898376 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/0a99ad6c-7819-4b33-8846-26e6ede5ce22-signing-cabundle") pod "service-ca-9c57cc56f-6frtc" (UID: "0a99ad6c-7819-4b33-8846-26e6ede5ce22") : failed to sync configmap cache: timed out waiting for the condition
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.806400 4778 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.806544 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/97ee6937-a1a5-42ea-a460-29d54478e633-config-volume podName:97ee6937-a1a5-42ea-a460-29d54478e633 nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.306501196 +0000 UTC m=+255.881246136 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/97ee6937-a1a5-42ea-a460-29d54478e633-config-volume") pod "collect-profiles-29563740-wbvxl" (UID: "97ee6937-a1a5-42ea-a460-29d54478e633") : failed to sync configmap cache: timed out waiting for the condition
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.806626 4778 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.806707 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c0f3c490-ee49-4a88-893e-132592dd6d59-config podName:c0f3c490-ee49-4a88-893e-132592dd6d59 nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.306686551 +0000 UTC m=+255.881431681 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/c0f3c490-ee49-4a88-893e-132592dd6d59-config") pod "service-ca-operator-777779d784-57msj" (UID: "c0f3c490-ee49-4a88-893e-132592dd6d59") : failed to sync configmap cache: timed out waiting for the condition
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.806773 4778 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.806866 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d6ab3a6-da16-4fc8-9235-2c223661de30-apiservice-cert podName:6d6ab3a6-da16-4fc8-9235-2c223661de30 nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.306846405 +0000 UTC m=+255.881591275 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/6d6ab3a6-da16-4fc8-9235-2c223661de30-apiservice-cert") pod "packageserver-d55dfcdfc-5vjr4" (UID: "6d6ab3a6-da16-4fc8-9235-2c223661de30") : failed to sync secret cache: timed out waiting for the condition
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.808029 4778 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.808112 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d6ab3a6-da16-4fc8-9235-2c223661de30-webhook-cert podName:6d6ab3a6-da16-4fc8-9235-2c223661de30 nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.308092548 +0000 UTC m=+255.882837418 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/6d6ab3a6-da16-4fc8-9235-2c223661de30-webhook-cert") pod "packageserver-d55dfcdfc-5vjr4" (UID: "6d6ab3a6-da16-4fc8-9235-2c223661de30") : failed to sync secret cache: timed out waiting for the condition
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.809418 4778 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.809483 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfcc4e0d-0910-42c0-bcac-44c6aee8b74d-cert podName:bfcc4e0d-0910-42c0-bcac-44c6aee8b74d nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.309469666 +0000 UTC m=+255.884214506 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfcc4e0d-0910-42c0-bcac-44c6aee8b74d-cert") pod "ingress-canary-jmmm2" (UID: "bfcc4e0d-0910-42c0-bcac-44c6aee8b74d") : failed to sync secret cache: timed out waiting for the condition
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.810171 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2hr48\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hr48"
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.811656 4778 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.811743 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a99ad6c-7819-4b33-8846-26e6ede5ce22-signing-key podName:0a99ad6c-7819-4b33-8846-26e6ede5ce22 nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.311722297 +0000 UTC m=+255.886467347 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/0a99ad6c-7819-4b33-8846-26e6ede5ce22-signing-key") pod "service-ca-9c57cc56f-6frtc" (UID: "0a99ad6c-7819-4b33-8846-26e6ede5ce22") : failed to sync secret cache: timed out waiting for the condition
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.829008 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.829319 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.32927573 +0000 UTC m=+255.904020600 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.830064 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.830545 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.330524493 +0000 UTC m=+255.905269343 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.836893 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.843835 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.864571 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.882827 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.903718 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.924085 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.931826 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.932086 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.432054094 +0000 UTC m=+256.006798974 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.932885 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:28 crc kubenswrapper[4778]: E0318 09:06:28.933286 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.433276547 +0000 UTC m=+256.008021387 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.944684 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.963872 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 18 09:06:28 crc kubenswrapper[4778]: I0318 09:06:28.983955 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.003687 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.023776 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.034160 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.034415 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.534361615 +0000 UTC m=+256.109106505 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.034966 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.035602 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.535581647 +0000 UTC m=+256.110326557 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.044316 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.063360 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.083027 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.104101 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.123656 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.136877 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.137135 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.637099718 +0000 UTC m=+256.211844588 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.138116 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.138673 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.638648789 +0000 UTC m=+256.213393669 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.144245 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.185139 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.203710 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.224441 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.239774 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.240093 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.740047135 +0000 UTC m=+256.314792025 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.241171 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.241704 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.74168055 +0000 UTC m=+256.316425430 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.244736 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.264622 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.284394 4778 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.303624 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.322971 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.342058 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.342248 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.842190413 +0000 UTC m=+256.416935273 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.342372 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0a99ad6c-7819-4b33-8846-26e6ede5ce22-signing-key\") pod \"service-ca-9c57cc56f-6frtc\" (UID: \"0a99ad6c-7819-4b33-8846-26e6ede5ce22\") " pod="openshift-service-ca/service-ca-9c57cc56f-6frtc"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.342685 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0a99ad6c-7819-4b33-8846-26e6ede5ce22-signing-cabundle\") pod \"service-ca-9c57cc56f-6frtc\" (UID: \"0a99ad6c-7819-4b33-8846-26e6ede5ce22\") " pod="openshift-service-ca/service-ca-9c57cc56f-6frtc"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.343483 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.343887 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.843875128 +0000 UTC m=+256.418619978 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.343900 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.343907 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0a99ad6c-7819-4b33-8846-26e6ede5ce22-signing-cabundle\") pod \"service-ca-9c57cc56f-6frtc\" (UID: \"0a99ad6c-7819-4b33-8846-26e6ede5ce22\") " pod="openshift-service-ca/service-ca-9c57cc56f-6frtc"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.344758 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97ee6937-a1a5-42ea-a460-29d54478e633-config-volume\") pod \"collect-profiles-29563740-wbvxl\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.344868 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6d6ab3a6-da16-4fc8-9235-2c223661de30-apiservice-cert\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.345375 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0f3c490-ee49-4a88-893e-132592dd6d59-config\") pod \"service-ca-operator-777779d784-57msj\" (UID: \"c0f3c490-ee49-4a88-893e-132592dd6d59\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.345608 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2hr48\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hr48"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.345664 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6d6ab3a6-da16-4fc8-9235-2c223661de30-webhook-cert\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.345738 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0f3c490-ee49-4a88-893e-132592dd6d59-serving-cert\") pod \"service-ca-operator-777779d784-57msj\" (UID: \"c0f3c490-ee49-4a88-893e-132592dd6d59\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj"
Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.345999 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName:
\"kubernetes.io/configmap/97ee6937-a1a5-42ea-a460-29d54478e633-config-volume\") pod \"collect-profiles-29563740-wbvxl\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.346057 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfcc4e0d-0910-42c0-bcac-44c6aee8b74d-cert\") pod \"ingress-canary-jmmm2\" (UID: \"bfcc4e0d-0910-42c0-bcac-44c6aee8b74d\") " pod="openshift-ingress-canary/ingress-canary-jmmm2" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.348584 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2hr48\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.348704 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0f3c490-ee49-4a88-893e-132592dd6d59-config\") pod \"service-ca-operator-777779d784-57msj\" (UID: \"c0f3c490-ee49-4a88-893e-132592dd6d59\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.350781 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6d6ab3a6-da16-4fc8-9235-2c223661de30-apiservice-cert\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.357263 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/c0f3c490-ee49-4a88-893e-132592dd6d59-serving-cert\") pod \"service-ca-operator-777779d784-57msj\" (UID: \"c0f3c490-ee49-4a88-893e-132592dd6d59\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.359558 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6d6ab3a6-da16-4fc8-9235-2c223661de30-webhook-cert\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.361524 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0a99ad6c-7819-4b33-8846-26e6ede5ce22-signing-key\") pod \"service-ca-9c57cc56f-6frtc\" (UID: \"0a99ad6c-7819-4b33-8846-26e6ede5ce22\") " pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.364062 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.384144 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.404263 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.412715 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfcc4e0d-0910-42c0-bcac-44c6aee8b74d-cert\") pod \"ingress-canary-jmmm2\" (UID: \"bfcc4e0d-0910-42c0-bcac-44c6aee8b74d\") " pod="openshift-ingress-canary/ingress-canary-jmmm2" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 
09:06:29.423655 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.443584 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.446996 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.447190 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.947156135 +0000 UTC m=+256.521900985 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.447327 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.447701 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:29.94768606 +0000 UTC m=+256.522430910 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.492061 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p7td\" (UniqueName: \"kubernetes.io/projected/35cf99cc-0bae-4b8d-b861-103e3174f081-kube-api-access-5p7td\") pod \"apiserver-7bbb656c7d-7xphq\" (UID: \"35cf99cc-0bae-4b8d-b861-103e3174f081\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.501682 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc9zp\" (UniqueName: \"kubernetes.io/projected/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-kube-api-access-hc9zp\") pod \"route-controller-manager-6576b87f9c-dsqlz\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.520746 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kd5b\" (UniqueName: \"kubernetes.io/projected/034ec244-f99c-4c50-a55a-9b33b8b376c3-kube-api-access-9kd5b\") pod \"openshift-apiserver-operator-796bbdcf4f-xwmwl\" (UID: \"034ec244-f99c-4c50-a55a-9b33b8b376c3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.522051 4778 request.go:700] Waited for 1.934565618s due to client-side throttling, not priority and fairness, request: 
POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/serviceaccounts/openshift-kube-scheduler-operator/token Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.549091 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.549347 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.049305502 +0000 UTC m=+256.624050382 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.549849 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.550615 4778 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.050588446 +0000 UTC m=+256.625333396 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.553664 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0167d9e-5565-4154-80bb-3856d9b5985f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v92lm\" (UID: \"d0167d9e-5565-4154-80bb-3856d9b5985f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.560783 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-bound-sa-token\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.580028 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsq4q\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-kube-api-access-fsq4q\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:29 crc 
kubenswrapper[4778]: I0318 09:06:29.606554 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.611867 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.616106 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7qw9\" (UniqueName: \"kubernetes.io/projected/5f875d21-ddf2-4d41-8be3-819c8836824a-kube-api-access-z7qw9\") pod \"console-f9d7485db-pgsqh\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") " pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.621435 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5flzf\" (UniqueName: \"kubernetes.io/projected/6e93d5ac-22fb-4d53-86c4-3262993f2116-kube-api-access-5flzf\") pod \"apiserver-76f77b778f-ckp9s\" (UID: \"6e93d5ac-22fb-4d53-86c4-3262993f2116\") " pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.644947 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llblr\" (UniqueName: \"kubernetes.io/projected/918ba01d-c786-4f9a-ae58-5bcc23684c16-kube-api-access-llblr\") pod \"console-operator-58897d9998-q7qs8\" (UID: \"918ba01d-c786-4f9a-ae58-5bcc23684c16\") " pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.651046 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.651492 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.151455868 +0000 UTC m=+256.726200748 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.651556 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.652816 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.152797625 +0000 UTC m=+256.727542495 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.662753 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5h99\" (UniqueName: \"kubernetes.io/projected/6cdf835c-58e4-4297-a247-690f407af22d-kube-api-access-n5h99\") pod \"openshift-config-operator-7777fb866f-8f6hb\" (UID: \"6cdf835c-58e4-4297-a247-690f407af22d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.683243 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zpgn\" (UniqueName: \"kubernetes.io/projected/a2907797-7fb3-44c0-81cf-783512fd1bf6-kube-api-access-4zpgn\") pod \"authentication-operator-69f744f599-bhmq7\" (UID: \"a2907797-7fb3-44c0-81cf-783512fd1bf6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.720974 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztqc4\" (UniqueName: \"kubernetes.io/projected/f06790e0-cf8c-48f0-8d48-893663fdbd1c-kube-api-access-ztqc4\") pod \"machine-api-operator-5694c8668f-smtz9\" (UID: \"f06790e0-cf8c-48f0-8d48-893663fdbd1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.745148 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.748133 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7467j\" (UniqueName: \"kubernetes.io/projected/0a99ad6c-7819-4b33-8846-26e6ede5ce22-kube-api-access-7467j\") pod \"service-ca-9c57cc56f-6frtc\" (UID: \"0a99ad6c-7819-4b33-8846-26e6ede5ce22\") " pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.753651 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.753817 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.253790891 +0000 UTC m=+256.828535741 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.754060 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.754435 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.254418557 +0000 UTC m=+256.829163417 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.769942 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5gzz\" (UniqueName: \"kubernetes.io/projected/46167450-7100-4ac9-a9dd-e678eb3d8677-kube-api-access-b5gzz\") pod \"etcd-operator-b45778765-w2lf2\" (UID: \"46167450-7100-4ac9-a9dd-e678eb3d8677\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.772592 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.777574 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.786958 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.788330 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fgmb\" (UniqueName: \"kubernetes.io/projected/ef74c17c-eb2a-4bef-b948-6b06efd76719-kube-api-access-2fgmb\") pod \"openshift-controller-manager-operator-756b6f6bc6-ktkfs\" (UID: \"ef74c17c-eb2a-4bef-b948-6b06efd76719\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.796823 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.808298 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r5f7\" (UniqueName: \"kubernetes.io/projected/80d2d01c-2b8d-49ff-adad-6b49568293a0-kube-api-access-8r5f7\") pod \"multus-admission-controller-857f4d67dd-gkpf4\" (UID: \"80d2d01c-2b8d-49ff-adad-6b49568293a0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gkpf4" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.809499 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.811619 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-gkpf4" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.819618 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9n22\" (UniqueName: \"kubernetes.io/projected/97ee6937-a1a5-42ea-a460-29d54478e633-kube-api-access-h9n22\") pod \"collect-profiles-29563740-wbvxl\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.838066 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.841924 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njhmj\" (UniqueName: \"kubernetes.io/projected/6c0fd619-d1c2-45e7-a7cf-e784b082428f-kube-api-access-njhmj\") pod \"ingress-operator-5b745b69d9-t27k4\" (UID: \"6c0fd619-d1c2-45e7-a7cf-e784b082428f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.844997 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.855265 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.856525 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.356507023 +0000 UTC m=+256.931251863 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.857367 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.859928 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5n2d\" (UniqueName: \"kubernetes.io/projected/08b964cf-bfc5-4b90-83a3-0b358c3ffbc9-kube-api-access-p5n2d\") pod \"downloads-7954f5f757-tnw27\" (UID: \"08b964cf-bfc5-4b90-83a3-0b358c3ffbc9\") " pod="openshift-console/downloads-7954f5f757-tnw27" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.874065 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.880447 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcvjk\" (UniqueName: \"kubernetes.io/projected/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-kube-api-access-xcvjk\") pod \"marketplace-operator-79b997595-2hr48\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.901496 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxpv5\" (UniqueName: \"kubernetes.io/projected/db81860d-bcb7-4a56-a935-544dbc4be29b-kube-api-access-jxpv5\") pod \"oauth-openshift-558db77b4-x5dpv\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.921469 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f5nq\" (UniqueName: \"kubernetes.io/projected/c0f3c490-ee49-4a88-893e-132592dd6d59-kube-api-access-7f5nq\") pod \"service-ca-operator-777779d784-57msj\" (UID: \"c0f3c490-ee49-4a88-893e-132592dd6d59\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" Mar 18 09:06:29 crc 
kubenswrapper[4778]: I0318 09:06:29.940010 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6c0fd619-d1c2-45e7-a7cf-e784b082428f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-t27k4\" (UID: \"6c0fd619-d1c2-45e7-a7cf-e784b082428f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.945946 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.959226 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:29 crc kubenswrapper[4778]: E0318 09:06:29.959698 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.459685817 +0000 UTC m=+257.034430657 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.973755 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt67h\" (UniqueName: \"kubernetes.io/projected/e24d15f2-56e5-4fcc-91ab-370d7b4fb41e-kube-api-access-dt67h\") pod \"migrator-59844c95c7-jbw52\" (UID: \"e24d15f2-56e5-4fcc-91ab-370d7b4fb41e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbw52" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.976794 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.984978 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4699k\" (UniqueName: \"kubernetes.io/projected/c851677f-703c-404c-801c-064cc6bf3979-kube-api-access-4699k\") pod \"machine-config-controller-84d6567774-kfkmc\" (UID: \"c851677f-703c-404c-801c-064cc6bf3979\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" Mar 18 09:06:29 crc kubenswrapper[4778]: I0318 09:06:29.999728 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-smtz9"] Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.006977 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55w8z\" (UniqueName: 
\"kubernetes.io/projected/c3be356e-94af-47db-a182-dd8a57024619-kube-api-access-55w8z\") pod \"auto-csr-approver-29563746-b66f7\" (UID: \"c3be356e-94af-47db-a182-dd8a57024619\") " pod="openshift-infra/auto-csr-approver-29563746-b66f7" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.020667 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swmtc\" (UniqueName: \"kubernetes.io/projected/6d6ab3a6-da16-4fc8-9235-2c223661de30-kube-api-access-swmtc\") pod \"packageserver-d55dfcdfc-5vjr4\" (UID: \"6d6ab3a6-da16-4fc8-9235-2c223661de30\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.033685 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" Mar 18 09:06:30 crc kubenswrapper[4778]: W0318 09:06:30.038501 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf06790e0_cf8c_48f0_8d48_893663fdbd1c.slice/crio-afb06ad3e0bcd75590f17c823bc1aff440fe73faa5a7fd84f4912126237879bd WatchSource:0}: Error finding container afb06ad3e0bcd75590f17c823bc1aff440fe73faa5a7fd84f4912126237879bd: Status 404 returned error can't find the container with id afb06ad3e0bcd75590f17c823bc1aff440fe73faa5a7fd84f4912126237879bd Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.038958 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjbvm\" (UniqueName: \"kubernetes.io/projected/0b244100-6e88-4ba2-b656-83b6e31d23c8-kube-api-access-vjbvm\") pod \"cluster-image-registry-operator-dc59b4c8b-nd44t\" (UID: \"0b244100-6e88-4ba2-b656-83b6e31d23c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.052341 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.060366 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.060902 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.560882288 +0000 UTC m=+257.135627128 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.065358 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df4jz\" (UniqueName: \"kubernetes.io/projected/496a64ab-b670-4201-9238-d60415ccba17-kube-api-access-df4jz\") pod \"controller-manager-879f6c89f-hvvlz\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.073124 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbw52" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.085892 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjwps\" (UniqueName: \"kubernetes.io/projected/54961f10-93b0-433f-8a7d-b30d69178e9a-kube-api-access-zjwps\") pod \"auto-csr-approver-29563744-btdt7\" (UID: \"54961f10-93b0-433f-8a7d-b30d69178e9a\") " pod="openshift-infra/auto-csr-approver-29563744-btdt7" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.101988 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-tnw27" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.103318 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qtbr\" (UniqueName: \"kubernetes.io/projected/ba84f396-0169-4d5e-a126-60ac9d6d49f8-kube-api-access-7qtbr\") pod \"control-plane-machine-set-operator-78cbb6b69f-qtggn\" (UID: \"ba84f396-0169-4d5e-a126-60ac9d6d49f8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.118714 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.120002 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm"] Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.120059 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq"] Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.128846 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcgvq\" (UniqueName: \"kubernetes.io/projected/9b31b04d-28d1-4397-88b3-b26a4bb6ede9-kube-api-access-jcgvq\") pod \"router-default-5444994796-nnfvg\" (UID: \"9b31b04d-28d1-4397-88b3-b26a4bb6ede9\") " pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.129170 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.138028 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.146263 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq5ms\" (UniqueName: \"kubernetes.io/projected/d4d183f7-2762-458d-83f1-a8894c00bb82-kube-api-access-sq5ms\") pod \"olm-operator-6b444d44fb-8xcgz\" (UID: \"d4d183f7-2762-458d-83f1-a8894c00bb82\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.151423 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.151510 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.161758 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.162383 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 09:06:30.662371957 +0000 UTC m=+257.237116797 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.164520 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563744-btdt7" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.178785 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0b244100-6e88-4ba2-b656-83b6e31d23c8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-nd44t\" (UID: \"0b244100-6e88-4ba2-b656-83b6e31d23c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.179771 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjfwr\" (UniqueName: \"kubernetes.io/projected/8ed2be9e-a493-4ce8-aee1-83f3ae258fba-kube-api-access-xjfwr\") pod \"machine-approver-56656f9798-lbkl2\" (UID: \"8ed2be9e-a493-4ce8-aee1-83f3ae258fba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.181630 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563746-b66f7" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.224214 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.241417 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-pgsqh"] Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.241767 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz"] Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.252277 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.260150 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263095 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263312 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75e6cce3-f879-4c6b-8ef3-8d2a4feecae1-metrics-tls\") pod \"dns-operator-744455d44c-mlr7l\" (UID: \"75e6cce3-f879-4c6b-8ef3-8d2a4feecae1\") " pod="openshift-dns-operator/dns-operator-744455d44c-mlr7l" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263339 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx9dg\" (UniqueName: \"kubernetes.io/projected/5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c-kube-api-access-qx9dg\") pod 
\"machine-config-operator-74547568cd-fr65n\" (UID: \"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263362 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm74g\" (UniqueName: \"kubernetes.io/projected/53e4e3b9-bd77-47d8-98d7-f79849a3fc4a-kube-api-access-tm74g\") pod \"dns-default-wgbcp\" (UID: \"53e4e3b9-bd77-47d8-98d7-f79849a3fc4a\") " pod="openshift-dns/dns-default-wgbcp" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263389 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-mountpoint-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263418 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd030212-5b03-4555-b885-388260b53588-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5wjt2\" (UID: \"dd030212-5b03-4555-b885-388260b53588\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263434 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/533b6e54-efa5-4032-bebd-eedc39a834b8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x8bgh\" (UID: \"533b6e54-efa5-4032-bebd-eedc39a834b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" Mar 18 09:06:30 crc kubenswrapper[4778]: 
I0318 09:06:30.263468 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsjnk\" (UniqueName: \"kubernetes.io/projected/de938bf1-1696-46c9-b6af-9a3766846e8d-kube-api-access-tsjnk\") pod \"machine-config-server-qqzxx\" (UID: \"de938bf1-1696-46c9-b6af-9a3766846e8d\") " pod="openshift-machine-config-operator/machine-config-server-qqzxx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263486 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b84z\" (UniqueName: \"kubernetes.io/projected/533b6e54-efa5-4032-bebd-eedc39a834b8-kube-api-access-8b84z\") pod \"kube-storage-version-migrator-operator-b67b599dd-x8bgh\" (UID: \"533b6e54-efa5-4032-bebd-eedc39a834b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263512 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9c6e16fb-c90d-4e0d-a57e-90a778a52f97-profile-collector-cert\") pod \"catalog-operator-68c6474976-2kmb2\" (UID: \"9c6e16fb-c90d-4e0d-a57e-90a778a52f97\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263529 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/577d365f-ec95-4de4-a6a4-6752b2f0de56-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-txb7s\" (UID: \"577d365f-ec95-4de4-a6a4-6752b2f0de56\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263544 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd030212-5b03-4555-b885-388260b53588-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-5wjt2\" (UID: \"dd030212-5b03-4555-b885-388260b53588\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263568 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/533b6e54-efa5-4032-bebd-eedc39a834b8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x8bgh\" (UID: \"533b6e54-efa5-4032-bebd-eedc39a834b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263587 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-registration-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263602 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/de938bf1-1696-46c9-b6af-9a3766846e8d-certs\") pod \"machine-config-server-qqzxx\" (UID: \"de938bf1-1696-46c9-b6af-9a3766846e8d\") " pod="openshift-machine-config-operator/machine-config-server-qqzxx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263618 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/de938bf1-1696-46c9-b6af-9a3766846e8d-node-bootstrap-token\") pod \"machine-config-server-qqzxx\" (UID: 
\"de938bf1-1696-46c9-b6af-9a3766846e8d\") " pod="openshift-machine-config-operator/machine-config-server-qqzxx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263634 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fr65n\" (UID: \"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263651 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c-proxy-tls\") pod \"machine-config-operator-74547568cd-fr65n\" (UID: \"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263675 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqc9n\" (UniqueName: \"kubernetes.io/projected/577d365f-ec95-4de4-a6a4-6752b2f0de56-kube-api-access-pqc9n\") pod \"cluster-samples-operator-665b6dd947-txb7s\" (UID: \"577d365f-ec95-4de4-a6a4-6752b2f0de56\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263693 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk4m9\" (UniqueName: \"kubernetes.io/projected/75e6cce3-f879-4c6b-8ef3-8d2a4feecae1-kube-api-access-fk4m9\") pod \"dns-operator-744455d44c-mlr7l\" (UID: \"75e6cce3-f879-4c6b-8ef3-8d2a4feecae1\") " pod="openshift-dns-operator/dns-operator-744455d44c-mlr7l" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263707 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd030212-5b03-4555-b885-388260b53588-config\") pod \"kube-controller-manager-operator-78b949d7b-5wjt2\" (UID: \"dd030212-5b03-4555-b885-388260b53588\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263742 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/53e4e3b9-bd77-47d8-98d7-f79849a3fc4a-config-volume\") pod \"dns-default-wgbcp\" (UID: \"53e4e3b9-bd77-47d8-98d7-f79849a3fc4a\") " pod="openshift-dns/dns-default-wgbcp" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263779 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-plugins-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263796 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z725k\" (UniqueName: \"kubernetes.io/projected/bead92a8-42de-4171-9c0c-790d64a6d14a-kube-api-access-z725k\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263812 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2gwc\" (UniqueName: \"kubernetes.io/projected/9c6e16fb-c90d-4e0d-a57e-90a778a52f97-kube-api-access-l2gwc\") pod \"catalog-operator-68c6474976-2kmb2\" (UID: \"9c6e16fb-c90d-4e0d-a57e-90a778a52f97\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263827 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-csi-data-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263864 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c-images\") pod \"machine-config-operator-74547568cd-fr65n\" (UID: \"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263887 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/760c21cc-aac0-45ad-9d41-94ff93b92c44-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vgscx\" (UID: \"760c21cc-aac0-45ad-9d41-94ff93b92c44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263904 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/650a32b4-d961-4805-8521-f1f24de6ad4a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2zf5g\" (UID: \"650a32b4-d961-4805-8521-f1f24de6ad4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263922 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-socket-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263937 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/650a32b4-d961-4805-8521-f1f24de6ad4a-config\") pod \"kube-apiserver-operator-766d6c64bb-2zf5g\" (UID: \"650a32b4-d961-4805-8521-f1f24de6ad4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.263956 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53e4e3b9-bd77-47d8-98d7-f79849a3fc4a-metrics-tls\") pod \"dns-default-wgbcp\" (UID: \"53e4e3b9-bd77-47d8-98d7-f79849a3fc4a\") " pod="openshift-dns/dns-default-wgbcp" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.264005 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/650a32b4-d961-4805-8521-f1f24de6ad4a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2zf5g\" (UID: \"650a32b4-d961-4805-8521-f1f24de6ad4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.264038 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55g67\" (UniqueName: \"kubernetes.io/projected/760c21cc-aac0-45ad-9d41-94ff93b92c44-kube-api-access-55g67\") pod \"package-server-manager-789f6589d5-vgscx\" (UID: \"760c21cc-aac0-45ad-9d41-94ff93b92c44\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.264081 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9c6e16fb-c90d-4e0d-a57e-90a778a52f97-srv-cert\") pod \"catalog-operator-68c6474976-2kmb2\" (UID: \"9c6e16fb-c90d-4e0d-a57e-90a778a52f97\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.264793 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.764698708 +0000 UTC m=+257.339443548 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.283019 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.297072 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92ctw\" (UniqueName: \"kubernetes.io/projected/bfcc4e0d-0910-42c0-bcac-44c6aee8b74d-kube-api-access-92ctw\") pod \"ingress-canary-jmmm2\" (UID: \"bfcc4e0d-0910-42c0-bcac-44c6aee8b74d\") " pod="openshift-ingress-canary/ingress-canary-jmmm2" Mar 18 09:06:30 crc kubenswrapper[4778]: W0318 09:06:30.357789 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a8d0909_d7da_49bd_bd5b_0f3ca5a61637.slice/crio-d169670b40d53da31764b4cc4e5df28c3ae9d827ab8db481e9ff9be957e4d6a9 WatchSource:0}: Error finding container d169670b40d53da31764b4cc4e5df28c3ae9d827ab8db481e9ff9be957e4d6a9: Status 404 returned error can't find the container with id d169670b40d53da31764b4cc4e5df28c3ae9d827ab8db481e9ff9be957e4d6a9 Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.364869 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c-images\") pod \"machine-config-operator-74547568cd-fr65n\" (UID: \"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365013 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/760c21cc-aac0-45ad-9d41-94ff93b92c44-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vgscx\" (UID: \"760c21cc-aac0-45ad-9d41-94ff93b92c44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 
09:06:30.365044 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/650a32b4-d961-4805-8521-f1f24de6ad4a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2zf5g\" (UID: \"650a32b4-d961-4805-8521-f1f24de6ad4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365068 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-socket-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365083 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/650a32b4-d961-4805-8521-f1f24de6ad4a-config\") pod \"kube-apiserver-operator-766d6c64bb-2zf5g\" (UID: \"650a32b4-d961-4805-8521-f1f24de6ad4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365133 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53e4e3b9-bd77-47d8-98d7-f79849a3fc4a-metrics-tls\") pod \"dns-default-wgbcp\" (UID: \"53e4e3b9-bd77-47d8-98d7-f79849a3fc4a\") " pod="openshift-dns/dns-default-wgbcp" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365170 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/650a32b4-d961-4805-8521-f1f24de6ad4a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2zf5g\" (UID: \"650a32b4-d961-4805-8521-f1f24de6ad4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" Mar 18 
09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365243 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55g67\" (UniqueName: \"kubernetes.io/projected/760c21cc-aac0-45ad-9d41-94ff93b92c44-kube-api-access-55g67\") pod \"package-server-manager-789f6589d5-vgscx\" (UID: \"760c21cc-aac0-45ad-9d41-94ff93b92c44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365321 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9c6e16fb-c90d-4e0d-a57e-90a778a52f97-srv-cert\") pod \"catalog-operator-68c6474976-2kmb2\" (UID: \"9c6e16fb-c90d-4e0d-a57e-90a778a52f97\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365417 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75e6cce3-f879-4c6b-8ef3-8d2a4feecae1-metrics-tls\") pod \"dns-operator-744455d44c-mlr7l\" (UID: \"75e6cce3-f879-4c6b-8ef3-8d2a4feecae1\") " pod="openshift-dns-operator/dns-operator-744455d44c-mlr7l" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365438 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm74g\" (UniqueName: \"kubernetes.io/projected/53e4e3b9-bd77-47d8-98d7-f79849a3fc4a-kube-api-access-tm74g\") pod \"dns-default-wgbcp\" (UID: \"53e4e3b9-bd77-47d8-98d7-f79849a3fc4a\") " pod="openshift-dns/dns-default-wgbcp" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365487 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx9dg\" (UniqueName: \"kubernetes.io/projected/5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c-kube-api-access-qx9dg\") pod \"machine-config-operator-74547568cd-fr65n\" (UID: 
\"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365513 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-mountpoint-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365550 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd030212-5b03-4555-b885-388260b53588-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5wjt2\" (UID: \"dd030212-5b03-4555-b885-388260b53588\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365569 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/533b6e54-efa5-4032-bebd-eedc39a834b8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x8bgh\" (UID: \"533b6e54-efa5-4032-bebd-eedc39a834b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365598 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsjnk\" (UniqueName: \"kubernetes.io/projected/de938bf1-1696-46c9-b6af-9a3766846e8d-kube-api-access-tsjnk\") pod \"machine-config-server-qqzxx\" (UID: \"de938bf1-1696-46c9-b6af-9a3766846e8d\") " pod="openshift-machine-config-operator/machine-config-server-qqzxx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365633 4778 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-8b84z\" (UniqueName: \"kubernetes.io/projected/533b6e54-efa5-4032-bebd-eedc39a834b8-kube-api-access-8b84z\") pod \"kube-storage-version-migrator-operator-b67b599dd-x8bgh\" (UID: \"533b6e54-efa5-4032-bebd-eedc39a834b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365654 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365672 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9c6e16fb-c90d-4e0d-a57e-90a778a52f97-profile-collector-cert\") pod \"catalog-operator-68c6474976-2kmb2\" (UID: \"9c6e16fb-c90d-4e0d-a57e-90a778a52f97\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365693 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/577d365f-ec95-4de4-a6a4-6752b2f0de56-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-txb7s\" (UID: \"577d365f-ec95-4de4-a6a4-6752b2f0de56\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365714 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd030212-5b03-4555-b885-388260b53588-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-5wjt2\" (UID: \"dd030212-5b03-4555-b885-388260b53588\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365731 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/533b6e54-efa5-4032-bebd-eedc39a834b8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x8bgh\" (UID: \"533b6e54-efa5-4032-bebd-eedc39a834b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365785 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-registration-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365803 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/de938bf1-1696-46c9-b6af-9a3766846e8d-certs\") pod \"machine-config-server-qqzxx\" (UID: \"de938bf1-1696-46c9-b6af-9a3766846e8d\") " pod="openshift-machine-config-operator/machine-config-server-qqzxx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365840 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/de938bf1-1696-46c9-b6af-9a3766846e8d-node-bootstrap-token\") pod \"machine-config-server-qqzxx\" (UID: \"de938bf1-1696-46c9-b6af-9a3766846e8d\") " pod="openshift-machine-config-operator/machine-config-server-qqzxx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365879 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fr65n\" (UID: \"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365908 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c-proxy-tls\") pod \"machine-config-operator-74547568cd-fr65n\" (UID: \"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365925 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqc9n\" (UniqueName: \"kubernetes.io/projected/577d365f-ec95-4de4-a6a4-6752b2f0de56-kube-api-access-pqc9n\") pod \"cluster-samples-operator-665b6dd947-txb7s\" (UID: \"577d365f-ec95-4de4-a6a4-6752b2f0de56\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365966 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk4m9\" (UniqueName: \"kubernetes.io/projected/75e6cce3-f879-4c6b-8ef3-8d2a4feecae1-kube-api-access-fk4m9\") pod \"dns-operator-744455d44c-mlr7l\" (UID: \"75e6cce3-f879-4c6b-8ef3-8d2a4feecae1\") " pod="openshift-dns-operator/dns-operator-744455d44c-mlr7l" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.365997 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd030212-5b03-4555-b885-388260b53588-config\") pod \"kube-controller-manager-operator-78b949d7b-5wjt2\" (UID: \"dd030212-5b03-4555-b885-388260b53588\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.366056 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/53e4e3b9-bd77-47d8-98d7-f79849a3fc4a-config-volume\") pod \"dns-default-wgbcp\" (UID: \"53e4e3b9-bd77-47d8-98d7-f79849a3fc4a\") " pod="openshift-dns/dns-default-wgbcp" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.366073 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-plugins-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.366090 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z725k\" (UniqueName: \"kubernetes.io/projected/bead92a8-42de-4171-9c0c-790d64a6d14a-kube-api-access-z725k\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.366127 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2gwc\" (UniqueName: \"kubernetes.io/projected/9c6e16fb-c90d-4e0d-a57e-90a778a52f97-kube-api-access-l2gwc\") pod \"catalog-operator-68c6474976-2kmb2\" (UID: \"9c6e16fb-c90d-4e0d-a57e-90a778a52f97\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.366142 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-csi-data-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: 
\"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.373248 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/650a32b4-d961-4805-8521-f1f24de6ad4a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2zf5g\" (UID: \"650a32b4-d961-4805-8521-f1f24de6ad4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.380727 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-socket-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.381443 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/650a32b4-d961-4805-8521-f1f24de6ad4a-config\") pod \"kube-apiserver-operator-766d6c64bb-2zf5g\" (UID: \"650a32b4-d961-4805-8521-f1f24de6ad4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.383627 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.384211 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-mountpoint-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.387086 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl"] Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.387424 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-csi-data-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.387928 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-registration-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.391352 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/53e4e3b9-bd77-47d8-98d7-f79849a3fc4a-config-volume\") pod \"dns-default-wgbcp\" (UID: \"53e4e3b9-bd77-47d8-98d7-f79849a3fc4a\") " pod="openshift-dns/dns-default-wgbcp" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.392372 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/760c21cc-aac0-45ad-9d41-94ff93b92c44-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vgscx\" (UID: \"760c21cc-aac0-45ad-9d41-94ff93b92c44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.393104 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.393751 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.8937253 +0000 UTC m=+257.468470130 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.393787 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/533b6e54-efa5-4032-bebd-eedc39a834b8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x8bgh\" (UID: \"533b6e54-efa5-4032-bebd-eedc39a834b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.393892 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c-images\") pod \"machine-config-operator-74547568cd-fr65n\" (UID: \"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.394956 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bead92a8-42de-4171-9c0c-790d64a6d14a-plugins-dir\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.396997 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd030212-5b03-4555-b885-388260b53588-config\") pod \"kube-controller-manager-operator-78b949d7b-5wjt2\" (UID: \"dd030212-5b03-4555-b885-388260b53588\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.397215 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fr65n\" (UID: \"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.407031 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/de938bf1-1696-46c9-b6af-9a3766846e8d-certs\") pod \"machine-config-server-qqzxx\" (UID: \"de938bf1-1696-46c9-b6af-9a3766846e8d\") " pod="openshift-machine-config-operator/machine-config-server-qqzxx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.413409 4778 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bhmq7"] Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.415085 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9c6e16fb-c90d-4e0d-a57e-90a778a52f97-srv-cert\") pod \"catalog-operator-68c6474976-2kmb2\" (UID: \"9c6e16fb-c90d-4e0d-a57e-90a778a52f97\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.417864 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53e4e3b9-bd77-47d8-98d7-f79849a3fc4a-metrics-tls\") pod \"dns-default-wgbcp\" (UID: \"53e4e3b9-bd77-47d8-98d7-f79849a3fc4a\") " pod="openshift-dns/dns-default-wgbcp" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.418500 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.423803 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c-proxy-tls\") pod \"machine-config-operator-74547568cd-fr65n\" (UID: \"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.427838 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9c6e16fb-c90d-4e0d-a57e-90a778a52f97-profile-collector-cert\") pod \"catalog-operator-68c6474976-2kmb2\" (UID: \"9c6e16fb-c90d-4e0d-a57e-90a778a52f97\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.428771 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/650a32b4-d961-4805-8521-f1f24de6ad4a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2zf5g\" (UID: \"650a32b4-d961-4805-8521-f1f24de6ad4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.430690 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/533b6e54-efa5-4032-bebd-eedc39a834b8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x8bgh\" (UID: \"533b6e54-efa5-4032-bebd-eedc39a834b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.431846 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd030212-5b03-4555-b885-388260b53588-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-5wjt2\" (UID: \"dd030212-5b03-4555-b885-388260b53588\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.432222 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/577d365f-ec95-4de4-a6a4-6752b2f0de56-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-txb7s\" (UID: \"577d365f-ec95-4de4-a6a4-6752b2f0de56\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.434731 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75e6cce3-f879-4c6b-8ef3-8d2a4feecae1-metrics-tls\") pod \"dns-operator-744455d44c-mlr7l\" (UID: 
\"75e6cce3-f879-4c6b-8ef3-8d2a4feecae1\") " pod="openshift-dns-operator/dns-operator-744455d44c-mlr7l" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.440712 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx9dg\" (UniqueName: \"kubernetes.io/projected/5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c-kube-api-access-qx9dg\") pod \"machine-config-operator-74547568cd-fr65n\" (UID: \"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.441066 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/de938bf1-1696-46c9-b6af-9a3766846e8d-node-bootstrap-token\") pod \"machine-config-server-qqzxx\" (UID: \"de938bf1-1696-46c9-b6af-9a3766846e8d\") " pod="openshift-machine-config-operator/machine-config-server-qqzxx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.460265 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd030212-5b03-4555-b885-388260b53588-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5wjt2\" (UID: \"dd030212-5b03-4555-b885-388260b53588\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.468172 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.468367 4778 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.968333214 +0000 UTC m=+257.543078054 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.469400 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.469861 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:30.969850324 +0000 UTC m=+257.544595244 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.479125 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55g67\" (UniqueName: \"kubernetes.io/projected/760c21cc-aac0-45ad-9d41-94ff93b92c44-kube-api-access-55g67\") pod \"package-server-manager-789f6589d5-vgscx\" (UID: \"760c21cc-aac0-45ad-9d41-94ff93b92c44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.490308 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" event={"ID":"8ed2be9e-a493-4ce8-aee1-83f3ae258fba","Type":"ContainerStarted","Data":"384012e39609968e40f00b85401f9373f0fcd56486e57e57ec0ce3bc4aaa8163"} Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.491590 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" event={"ID":"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637","Type":"ContainerStarted","Data":"d169670b40d53da31764b4cc4e5df28c3ae9d827ab8db481e9ff9be957e4d6a9"} Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.493392 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pgsqh" event={"ID":"5f875d21-ddf2-4d41-8be3-819c8836824a","Type":"ContainerStarted","Data":"0526269f3752c495953fd88d5da903a92103220f8039ec4c7dde34390b5f6401"} Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.494917 4778 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" event={"ID":"f06790e0-cf8c-48f0-8d48-893663fdbd1c","Type":"ContainerStarted","Data":"399d10bfc0ba0e60f2a6e2b19d51a033e4c3420b3329b501676c4889b0d1fe5e"} Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.494945 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" event={"ID":"f06790e0-cf8c-48f0-8d48-893663fdbd1c","Type":"ContainerStarted","Data":"afb06ad3e0bcd75590f17c823bc1aff440fe73faa5a7fd84f4912126237879bd"} Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.495868 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-nnfvg" event={"ID":"9b31b04d-28d1-4397-88b3-b26a4bb6ede9","Type":"ContainerStarted","Data":"953c1bb92c70403222e509c171da1d740d9321c8dc5b21025da97debd2f8ca77"} Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.497665 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" event={"ID":"35cf99cc-0bae-4b8d-b861-103e3174f081","Type":"ContainerStarted","Data":"0f3e2cd60735c7ed34516ddbf5173175e86cd453f29e51936da898bfa27f9a01"} Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.499542 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" event={"ID":"d0167d9e-5565-4154-80bb-3856d9b5985f","Type":"ContainerStarted","Data":"248e2d30296bd1e1d92d5277ca75c197cf7655345299c525ecce91477b057af8"} Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.500563 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqc9n\" (UniqueName: \"kubernetes.io/projected/577d365f-ec95-4de4-a6a4-6752b2f0de56-kube-api-access-pqc9n\") pod \"cluster-samples-operator-665b6dd947-txb7s\" (UID: \"577d365f-ec95-4de4-a6a4-6752b2f0de56\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.511005 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.522016 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z725k\" (UniqueName: \"kubernetes.io/projected/bead92a8-42de-4171-9c0c-790d64a6d14a-kube-api-access-z725k\") pod \"csi-hostpathplugin-5gdpq\" (UID: \"bead92a8-42de-4171-9c0c-790d64a6d14a\") " pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.541746 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.558185 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.559994 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.570646 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.570976 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsjnk\" (UniqueName: \"kubernetes.io/projected/de938bf1-1696-46c9-b6af-9a3766846e8d-kube-api-access-tsjnk\") pod \"machine-config-server-qqzxx\" (UID: \"de938bf1-1696-46c9-b6af-9a3766846e8d\") " pod="openshift-machine-config-operator/machine-config-server-qqzxx" Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.571093 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.071071057 +0000 UTC m=+257.645815987 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.578315 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm74g\" (UniqueName: \"kubernetes.io/projected/53e4e3b9-bd77-47d8-98d7-f79849a3fc4a-kube-api-access-tm74g\") pod \"dns-default-wgbcp\" (UID: \"53e4e3b9-bd77-47d8-98d7-f79849a3fc4a\") " pod="openshift-dns/dns-default-wgbcp" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.580056 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jmmm2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.599646 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b84z\" (UniqueName: \"kubernetes.io/projected/533b6e54-efa5-4032-bebd-eedc39a834b8-kube-api-access-8b84z\") pod \"kube-storage-version-migrator-operator-b67b599dd-x8bgh\" (UID: \"533b6e54-efa5-4032-bebd-eedc39a834b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.622142 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2gwc\" (UniqueName: \"kubernetes.io/projected/9c6e16fb-c90d-4e0d-a57e-90a778a52f97-kube-api-access-l2gwc\") pod \"catalog-operator-68c6474976-2kmb2\" (UID: \"9c6e16fb-c90d-4e0d-a57e-90a778a52f97\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.626635 
4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk4m9\" (UniqueName: \"kubernetes.io/projected/75e6cce3-f879-4c6b-8ef3-8d2a4feecae1-kube-api-access-fk4m9\") pod \"dns-operator-744455d44c-mlr7l\" (UID: \"75e6cce3-f879-4c6b-8ef3-8d2a4feecae1\") " pod="openshift-dns-operator/dns-operator-744455d44c-mlr7l" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.660227 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.665951 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.672867 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.673824 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.173812109 +0000 UTC m=+257.748556949 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.681720 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.687359 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.734963 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.774508 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.774724 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.274685391 +0000 UTC m=+257.849430231 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.774917 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.777813 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.277790475 +0000 UTC m=+257.852535315 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.832475 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mlr7l" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.852532 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wgbcp" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.870933 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qqzxx" Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.876644 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.877005 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.376983303 +0000 UTC m=+257.951728133 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.877130 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.877481 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.377474076 +0000 UTC m=+257.952218916 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.978152 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.978493 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.478444881 +0000 UTC m=+258.053189721 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:30 crc kubenswrapper[4778]: I0318 09:06:30.978975 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:30 crc kubenswrapper[4778]: E0318 09:06:30.979370 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.479349895 +0000 UTC m=+258.054094735 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.090917 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:31 crc kubenswrapper[4778]: E0318 09:06:31.091388 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.591358118 +0000 UTC m=+258.166102958 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.091627 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:31 crc kubenswrapper[4778]: E0318 09:06:31.091947 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.591939624 +0000 UTC m=+258.166684464 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.152946 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-q7qs8"] Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.161396 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb"] Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.178064 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gkpf4"] Mar 18 09:06:31 crc kubenswrapper[4778]: E0318 09:06:31.193766 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.693734881 +0000 UTC m=+258.268479721 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.192183 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.196720 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:31 crc kubenswrapper[4778]: E0318 09:06:31.197048 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.69703583 +0000 UTC m=+258.271780670 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.213414 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6frtc"] Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.301631 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:31 crc kubenswrapper[4778]: E0318 09:06:31.303030 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.80300753 +0000 UTC m=+258.377752370 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.354812 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ckp9s"] Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.373279 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl"] Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.404585 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:31 crc kubenswrapper[4778]: E0318 09:06:31.405002 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:31.904989882 +0000 UTC m=+258.479734722 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.510211 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:31 crc kubenswrapper[4778]: E0318 09:06:31.510564 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:32.010546781 +0000 UTC m=+258.585291621 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.571092 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" event={"ID":"6cdf835c-58e4-4297-a247-690f407af22d","Type":"ContainerStarted","Data":"ac90e6e39e7a5f446606c85ba56fd804f2a5f81ecff1fa3d0efe068009b15849"} Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.582479 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" event={"ID":"8ed2be9e-a493-4ce8-aee1-83f3ae258fba","Type":"ContainerStarted","Data":"e4732188769d810ad5cc0b5c077b4adea2c1620a874dd07a965859f325d7cd4e"} Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.588587 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" event={"ID":"97ee6937-a1a5-42ea-a460-29d54478e633","Type":"ContainerStarted","Data":"ea27939aa8b795cbb05c9ef86fd0c0cedd8519701e620ecc91f86b4b95a08fc2"} Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.598272 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" event={"ID":"0a99ad6c-7819-4b33-8846-26e6ede5ce22","Type":"ContainerStarted","Data":"e27e08ba37ccea769f54c163cfe33287adcf772ae3e9703fa688112dc81941d0"} Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.607214 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" 
event={"ID":"f06790e0-cf8c-48f0-8d48-893663fdbd1c","Type":"ContainerStarted","Data":"83ed3cd2c88577c4a8ab7718302c6c52e3fe63ca877b71c85b312c9ae9680692"} Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.615437 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:31 crc kubenswrapper[4778]: E0318 09:06:31.615929 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:32.115911915 +0000 UTC m=+258.690656755 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.619264 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gkpf4" event={"ID":"80d2d01c-2b8d-49ff-adad-6b49568293a0","Type":"ContainerStarted","Data":"11d027c4ff144e592c2325bf8ed82a69537026248fc53f326a958432c477becd"} Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.625647 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-nnfvg" 
event={"ID":"9b31b04d-28d1-4397-88b3-b26a4bb6ede9","Type":"ContainerStarted","Data":"d7ad50204a6e2cd1c8da5c24755ef358bd43e98cde4c732ef742b2d21292299a"} Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.637714 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" event={"ID":"034ec244-f99c-4c50-a55a-9b33b8b376c3","Type":"ContainerStarted","Data":"47bde142c5dd33c66938dfe740b25e3625d59cc1c96c745b6bc2929fed2803f2"} Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.637773 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" event={"ID":"034ec244-f99c-4c50-a55a-9b33b8b376c3","Type":"ContainerStarted","Data":"a39d0575f5d97e39ab3f8a698d70640fed87fbd9c05fe240af2a6f1ce398a1fc"} Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.640477 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pgsqh" event={"ID":"5f875d21-ddf2-4d41-8be3-819c8836824a","Type":"ContainerStarted","Data":"f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0"} Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.644620 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qqzxx" event={"ID":"de938bf1-1696-46c9-b6af-9a3766846e8d","Type":"ContainerStarted","Data":"a00499b5b2a5d6bcc09c5e31a017b82e8c9cc19431c4f72d82b3e51c6f842948"} Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.644689 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qqzxx" event={"ID":"de938bf1-1696-46c9-b6af-9a3766846e8d","Type":"ContainerStarted","Data":"bf99e3d12d16acdd22819c226785c08c6b53925cef7c467eb21d02d826f550b2"} Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.647265 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" event={"ID":"6e93d5ac-22fb-4d53-86c4-3262993f2116","Type":"ContainerStarted","Data":"6c27a915fd14451a692cf576af43979df23ddbd5e61d6a727e34a13eea5af811"} Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.655967 4778 generic.go:334] "Generic (PLEG): container finished" podID="35cf99cc-0bae-4b8d-b861-103e3174f081" containerID="86393bf25c39867325647f99ee03a38d004b1fe8ea252a33b08e0f4463a8615e" exitCode=0 Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.656787 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" event={"ID":"35cf99cc-0bae-4b8d-b861-103e3174f081","Type":"ContainerDied","Data":"86393bf25c39867325647f99ee03a38d004b1fe8ea252a33b08e0f4463a8615e"} Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.658758 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-q7qs8" event={"ID":"918ba01d-c786-4f9a-ae58-5bcc23684c16","Type":"ContainerStarted","Data":"1c73c96ec7be59ec8398b94c1b9eef7bba25efc85f846fdcd30027fb38409c08"} Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.659335 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.660947 4778 patch_prober.go:28] interesting pod/console-operator-58897d9998-q7qs8 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.661466 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-q7qs8" podUID="918ba01d-c786-4f9a-ae58-5bcc23684c16" containerName="console-operator" probeResult="failure" output="Get 
\"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.669248 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" event={"ID":"a2907797-7fb3-44c0-81cf-783512fd1bf6","Type":"ContainerStarted","Data":"afbb41bf570d58818b80b3649704135d217cda0a15059129ddd38187b552fed1"} Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.669286 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" event={"ID":"a2907797-7fb3-44c0-81cf-783512fd1bf6","Type":"ContainerStarted","Data":"034d3d04512c50babfedd8291845f3b40baf1bac18820ecad81e04f9283eddf2"} Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.693184 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" event={"ID":"d0167d9e-5565-4154-80bb-3856d9b5985f","Type":"ContainerStarted","Data":"c54ebb09077fd76f31682fa7bdc8047b8510156410fb4d70d68ac82f9573d624"} Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.698379 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" event={"ID":"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637","Type":"ContainerStarted","Data":"e9257ea8202297fa6d8a75a0ad4f78bf4d5ee04b58490bb83b63980f5daf0cc8"} Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.698989 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.700182 4778 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-dsqlz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.700248 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" podUID="6a8d0909-d7da-49bd-bd5b-0f3ca5a61637" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.717265 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:31 crc kubenswrapper[4778]: E0318 09:06:31.717546 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:32.217526677 +0000 UTC m=+258.792271517 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.717990 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:31 crc kubenswrapper[4778]: E0318 09:06:31.719452 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:32.219431648 +0000 UTC m=+258.794176498 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.755940 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs"] Mar 18 09:06:31 crc kubenswrapper[4778]: W0318 09:06:31.782792 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef74c17c_eb2a_4bef_b948_6b06efd76719.slice/crio-1f29d37f1112fc081e73de16be0afe9b8e06f130904067805bb112ee4fe30e1c WatchSource:0}: Error finding container 1f29d37f1112fc081e73de16be0afe9b8e06f130904067805bb112ee4fe30e1c: Status 404 returned error can't find the container with id 1f29d37f1112fc081e73de16be0afe9b8e06f130904067805bb112ee4fe30e1c Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.823316 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx"] Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.823821 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:31 crc kubenswrapper[4778]: E0318 09:06:31.824187 4778 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:32.324165044 +0000 UTC m=+258.898909884 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.850780 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hvvlz"] Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.883069 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-tnw27"] Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.908312 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563744-btdt7"] Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.929419 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:31 crc kubenswrapper[4778]: E0318 09:06:31.931213 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 09:06:32.431181142 +0000 UTC m=+259.005925982 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.959956 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-x5dpv"] Mar 18 09:06:31 crc kubenswrapper[4778]: W0318 09:06:31.966387 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08b964cf_bfc5_4b90_83a3_0b358c3ffbc9.slice/crio-375a8c353b029593bb91ef23f773207b5d24c55a6f2c8d8b5b46e3882a7bf7d3 WatchSource:0}: Error finding container 375a8c353b029593bb91ef23f773207b5d24c55a6f2c8d8b5b46e3882a7bf7d3: Status 404 returned error can't find the container with id 375a8c353b029593bb91ef23f773207b5d24c55a6f2c8d8b5b46e3882a7bf7d3 Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.984305 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-57msj"] Mar 18 09:06:31 crc kubenswrapper[4778]: I0318 09:06:31.994913 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jbw52"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.006286 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563746-b66f7"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.030096 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.030578 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:32.530559375 +0000 UTC m=+259.105304215 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.032367 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.036260 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.052115 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-w2lf2"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.063598 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.095819 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-2hr48"] Mar 18 09:06:32 crc kubenswrapper[4778]: W0318 09:06:32.120913 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba84f396_0169_4d5e_a126_60ac9d6d49f8.slice/crio-e9cf4ed56d51f329fa6bbc3fc0a2995ce8217e5643b5fbead6c9478307157c13 WatchSource:0}: Error finding container e9cf4ed56d51f329fa6bbc3fc0a2995ce8217e5643b5fbead6c9478307157c13: Status 404 returned error can't find the container with id e9cf4ed56d51f329fa6bbc3fc0a2995ce8217e5643b5fbead6c9478307157c13 Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.133575 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.136671 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.140370 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:32.640352588 +0000 UTC m=+259.215097428 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.146682 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.146764 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.158016 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.175791 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jmmm2"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.177182 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.213620 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-pgsqh" podStartSLOduration=188.213585324 podStartE2EDuration="3m8.213585324s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.187444599 +0000 UTC m=+258.762189449" watchObservedRunningTime="2026-03-18 09:06:32.213585324 
+0000 UTC m=+258.788330164" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.219165 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v92lm" podStartSLOduration=188.219148334 podStartE2EDuration="3m8.219148334s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.215103134 +0000 UTC m=+258.789847974" watchObservedRunningTime="2026-03-18 09:06:32.219148334 +0000 UTC m=+258.793893174" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.238187 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.240362 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:32.740331755 +0000 UTC m=+259.315076595 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.243110 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.243446 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:32.74343168 +0000 UTC m=+259.318176510 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.247562 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.247675 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.247789 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5gdpq"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.247853 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.247931 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.247998 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.254343 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mlr7l"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.254553 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.261147 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-qqzxx" podStartSLOduration=5.261120467 podStartE2EDuration="5.261120467s" podCreationTimestamp="2026-03-18 09:06:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.25234496 +0000 UTC m=+258.827089800" watchObservedRunningTime="2026-03-18 09:06:32.261120467 +0000 UTC m=+258.835865317" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.267879 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:32 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:32 crc kubenswrapper[4778]: [+]process-running ok Mar 18 09:06:32 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.267963 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.276318 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wgbcp"] Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.298909 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xwmwl" podStartSLOduration=188.298886355 podStartE2EDuration="3m8.298886355s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.297703794 +0000 UTC m=+258.872448634" watchObservedRunningTime="2026-03-18 09:06:32.298886355 +0000 UTC m=+258.873631195" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.344777 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.345177 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:32.845156755 +0000 UTC m=+259.419901585 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:32 crc kubenswrapper[4778]: W0318 09:06:32.371736 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75e6cce3_f879_4c6b_8ef3_8d2a4feecae1.slice/crio-5f8e6fb57b55e014fff4bf0ebda0361d7586a0fec3186d8e1e4e58a550358063 WatchSource:0}: Error finding container 5f8e6fb57b55e014fff4bf0ebda0361d7586a0fec3186d8e1e4e58a550358063: Status 404 returned error can't find the container with id 5f8e6fb57b55e014fff4bf0ebda0361d7586a0fec3186d8e1e4e58a550358063 Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.379686 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" podStartSLOduration=187.379667155 podStartE2EDuration="3m7.379667155s" podCreationTimestamp="2026-03-18 09:03:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.379299086 +0000 UTC m=+258.954043936" watchObservedRunningTime="2026-03-18 09:06:32.379667155 +0000 UTC m=+258.954411995" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.381006 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-nnfvg" podStartSLOduration=188.381000832 podStartE2EDuration="3m8.381000832s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.338176016 +0000 UTC m=+258.912920866" watchObservedRunningTime="2026-03-18 09:06:32.381000832 +0000 UTC m=+258.955745672" Mar 18 09:06:32 crc kubenswrapper[4778]: W0318 09:06:32.389578 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53e4e3b9_bd77_47d8_98d7_f79849a3fc4a.slice/crio-27677d79c821d5cb3c27fcc6e8291c62636ee81cb359205c816e39f5589c41d5 WatchSource:0}: Error finding container 27677d79c821d5cb3c27fcc6e8291c62636ee81cb359205c816e39f5589c41d5: Status 404 returned error can't find the container with id 27677d79c821d5cb3c27fcc6e8291c62636ee81cb359205c816e39f5589c41d5 Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.446854 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.447475 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:32.947462525 +0000 UTC m=+259.522207365 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.456091 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-q7qs8" podStartSLOduration=188.456073577 podStartE2EDuration="3m8.456073577s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.42170786 +0000 UTC m=+258.996452730" watchObservedRunningTime="2026-03-18 09:06:32.456073577 +0000 UTC m=+259.030818417" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.456873 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-smtz9" podStartSLOduration=188.456869509 podStartE2EDuration="3m8.456869509s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.455829351 +0000 UTC m=+259.030574211" watchObservedRunningTime="2026-03-18 09:06:32.456869509 +0000 UTC m=+259.031614349" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.540812 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-bhmq7" podStartSLOduration=188.540793194 podStartE2EDuration="3m8.540793194s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.536328353 +0000 UTC m=+259.111073203" watchObservedRunningTime="2026-03-18 09:06:32.540793194 +0000 UTC m=+259.115538034" Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.547663 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.548164 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:33.048143242 +0000 UTC m=+259.622888082 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.650491 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.651357 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:33.151308166 +0000 UTC m=+259.726053026 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.708706 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" event={"ID":"97ee6937-a1a5-42ea-a460-29d54478e633","Type":"ContainerStarted","Data":"f1aaa8a2c1f96baaee4b7353f353a9b567635ea9eb73df19ffa50153f00a757d"}
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.728264 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" event={"ID":"c0f3c490-ee49-4a88-893e-132592dd6d59","Type":"ContainerStarted","Data":"e15756b13b34411edcc3f2b0d1a8832dacf6a1d93648315a7091e044b914508a"}
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.732945 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" podStartSLOduration=188.732931979 podStartE2EDuration="3m8.732931979s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.730497973 +0000 UTC m=+259.305242823" watchObservedRunningTime="2026-03-18 09:06:32.732931979 +0000 UTC m=+259.307676819"
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.741640 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbw52" event={"ID":"e24d15f2-56e5-4fcc-91ab-370d7b4fb41e","Type":"ContainerStarted","Data":"d956f9a5d6b254e8d728e6b23e7ef10dfcb87defb8a9a961c3869358ea36fed7"}
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.752569 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.752784 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:33.252744984 +0000 UTC m=+259.827489824 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.753415 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.755385 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:33.255367324 +0000 UTC m=+259.830112154 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.790906 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" event={"ID":"6d6ab3a6-da16-4fc8-9235-2c223661de30","Type":"ContainerStarted","Data":"6f9d22309b6e8be92e6bf6a2a403f7867a0fd457314b02c7511d1124370512b5"}
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.790972 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" event={"ID":"6d6ab3a6-da16-4fc8-9235-2c223661de30","Type":"ContainerStarted","Data":"4f106af8b6fde6e29b85df853162f89730389f4a7ad45e58e6bc697664ae04b5"}
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.822922 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4"
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.824032 4778 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-5vjr4 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused" start-of-body=
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.824092 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" podUID="6d6ab3a6-da16-4fc8-9235-2c223661de30" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused"
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.831416 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" event={"ID":"650a32b4-d961-4805-8521-f1f24de6ad4a","Type":"ContainerStarted","Data":"6ba048943d6cee093680f080207cfd2aadce6c1b8841ea5a02e9fd700cadc182"}
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.852031 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" podStartSLOduration=188.852015203 podStartE2EDuration="3m8.852015203s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.851111598 +0000 UTC m=+259.425856448" watchObservedRunningTime="2026-03-18 09:06:32.852015203 +0000 UTC m=+259.426760043"
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.854253 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" event={"ID":"496a64ab-b670-4201-9238-d60415ccba17","Type":"ContainerStarted","Data":"f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0"}
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.854324 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" event={"ID":"496a64ab-b670-4201-9238-d60415ccba17","Type":"ContainerStarted","Data":"eca604f85fcfe59c73e6a0d9a12120a2d10108fa64f5bc3107b7c718f96ed398"}
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.854893 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz"
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.855157 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.855285 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:33.355266211 +0000 UTC m=+259.930011051 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.855908 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.857075 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:33.35706186 +0000 UTC m=+259.931806700 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.859797 4778 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-hvvlz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.859842 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" podUID="496a64ab-b670-4201-9238-d60415ccba17" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused"
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.860717 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" event={"ID":"d4d183f7-2762-458d-83f1-a8894c00bb82","Type":"ContainerStarted","Data":"2c72c2692ffbed3ea96bf7a41ea070b572a03640ab6550fc6a2c2b0a29b92e70"}
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.866211 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-q7qs8" event={"ID":"918ba01d-c786-4f9a-ae58-5bcc23684c16","Type":"ContainerStarted","Data":"2850f8d396e9ed13b8edb495152385907e96097a7180aa00b5627e55e496bedf"}
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.874831 4778 patch_prober.go:28] interesting pod/console-operator-58897d9998-q7qs8 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body=
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.875275 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-q7qs8" podUID="918ba01d-c786-4f9a-ae58-5bcc23684c16" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused"
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.877772 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" podStartSLOduration=188.877751918 podStartE2EDuration="3m8.877751918s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.87524902 +0000 UTC m=+259.449993870" watchObservedRunningTime="2026-03-18 09:06:32.877751918 +0000 UTC m=+259.452496758"
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.887084 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wgbcp" event={"ID":"53e4e3b9-bd77-47d8-98d7-f79849a3fc4a","Type":"ContainerStarted","Data":"27677d79c821d5cb3c27fcc6e8291c62636ee81cb359205c816e39f5589c41d5"}
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.915385 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gkpf4" event={"ID":"80d2d01c-2b8d-49ff-adad-6b49568293a0","Type":"ContainerStarted","Data":"8405f133084eb28be0cb40e1d5cc155d1b1a741b67ea35d7d14894293fadf775"}
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.915435 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gkpf4" event={"ID":"80d2d01c-2b8d-49ff-adad-6b49568293a0","Type":"ContainerStarted","Data":"72a70033df350c0b2a2f582a79ba90cb64bdc98a6c08453c09e0ca225d0b9999"}
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.929396 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" event={"ID":"533b6e54-efa5-4032-bebd-eedc39a834b8","Type":"ContainerStarted","Data":"5e4f46b76ad29fc18d28d273fdddf8d428764dfc6d0c15265d6d306626152cee"}
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.944528 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563746-b66f7" event={"ID":"c3be356e-94af-47db-a182-dd8a57024619","Type":"ContainerStarted","Data":"4d957b42c20ebb120c0681574b93e0b852f5977f6c96c78d95883a927b1e8844"}
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.959314 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:32 crc kubenswrapper[4778]: E0318 09:06:32.960662 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:33.460642345 +0000 UTC m=+260.035387185 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.966758 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563744-btdt7" event={"ID":"54961f10-93b0-433f-8a7d-b30d69178e9a","Type":"ContainerStarted","Data":"df77c4671fb6dc8dc3716ac3d7733190f2f2696ab30319657174e00cec76ec77"}
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.999844 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" event={"ID":"6c0fd619-d1c2-45e7-a7cf-e784b082428f","Type":"ContainerStarted","Data":"3c28da9a3ac748f9028b1eea9e0023b277c9fe45e8f3ddff6c1db4a612169392"}
Mar 18 09:06:32 crc kubenswrapper[4778]: I0318 09:06:32.999902 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" event={"ID":"6c0fd619-d1c2-45e7-a7cf-e784b082428f","Type":"ContainerStarted","Data":"1e993fccca6419958be3b799f77590edcfea874d835f2a499222ed4fbcb22114"}
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.002441 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" event={"ID":"9c6e16fb-c90d-4e0d-a57e-90a778a52f97","Type":"ContainerStarted","Data":"cb5c5b7566c5f874889b71b840058c068af4116eeb732133e641ee9b9473101b"}
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.004256 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" event={"ID":"ef74c17c-eb2a-4bef-b948-6b06efd76719","Type":"ContainerStarted","Data":"b06a880707bff57bc2b5d0e7f2a62ea0d46656d370ddbaf3e6424f3e8c4c804a"}
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.004322 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" event={"ID":"ef74c17c-eb2a-4bef-b948-6b06efd76719","Type":"ContainerStarted","Data":"1f29d37f1112fc081e73de16be0afe9b8e06f130904067805bb112ee4fe30e1c"}
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.008169 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mlr7l" event={"ID":"75e6cce3-f879-4c6b-8ef3-8d2a4feecae1","Type":"ContainerStarted","Data":"5f8e6fb57b55e014fff4bf0ebda0361d7586a0fec3186d8e1e4e58a550358063"}
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.028265 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-gkpf4" podStartSLOduration=189.028247089 podStartE2EDuration="3m9.028247089s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:32.944600302 +0000 UTC m=+259.519345162" watchObservedRunningTime="2026-03-18 09:06:33.028247089 +0000 UTC m=+259.602991929"
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.028435 4778 generic.go:334] "Generic (PLEG): container finished" podID="6e93d5ac-22fb-4d53-86c4-3262993f2116" containerID="6062ede6f155532278065abaf189fa02e386986f37b7ff7b5d5da49d03d768cd" exitCode=0
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.029480 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ktkfs" podStartSLOduration=189.029472012 podStartE2EDuration="3m9.029472012s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:33.026304527 +0000 UTC m=+259.601049367" watchObservedRunningTime="2026-03-18 09:06:33.029472012 +0000 UTC m=+259.604216862"
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.029681 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" event={"ID":"6e93d5ac-22fb-4d53-86c4-3262993f2116","Type":"ContainerDied","Data":"6062ede6f155532278065abaf189fa02e386986f37b7ff7b5d5da49d03d768cd"}
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.044632 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" event={"ID":"35cf99cc-0bae-4b8d-b861-103e3174f081","Type":"ContainerStarted","Data":"a5ca2616e67739d3bb0e01c9dac1e0fb59930ed790b9bd6756bfce05da151ac6"}
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.046834 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jmmm2" event={"ID":"bfcc4e0d-0910-42c0-bcac-44c6aee8b74d","Type":"ContainerStarted","Data":"89b13df349cffa3c715e5e9ed7a7c27a7e88cddbd2209e8f5bc039de0402bdf6"}
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.049910 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" event={"ID":"0a99ad6c-7819-4b33-8846-26e6ede5ce22","Type":"ContainerStarted","Data":"95ac5b77eda346ee57bd444c0e14e698dddd7a3b6ad33e441397cb407c1eb72b"}
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.051163 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn" event={"ID":"ba84f396-0169-4d5e-a126-60ac9d6d49f8","Type":"ContainerStarted","Data":"e9cf4ed56d51f329fa6bbc3fc0a2995ce8217e5643b5fbead6c9478307157c13"}
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.060850 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" event={"ID":"bead92a8-42de-4171-9c0c-790d64a6d14a","Type":"ContainerStarted","Data":"45df2c1880ea6d858e0415030f4d35fcd594de872512718e5959a7fc551f1216"}
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.061773 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:33 crc kubenswrapper[4778]: E0318 09:06:33.062503 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:33.562488393 +0000 UTC m=+260.137233233 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.066613 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tnw27" event={"ID":"08b964cf-bfc5-4b90-83a3-0b358c3ffbc9","Type":"ContainerStarted","Data":"375a8c353b029593bb91ef23f773207b5d24c55a6f2c8d8b5b46e3882a7bf7d3"}
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.067231 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-tnw27"
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.070983 4778 patch_prober.go:28] interesting pod/downloads-7954f5f757-tnw27 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body=
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.071026 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tnw27" podUID="08b964cf-bfc5-4b90-83a3-0b358c3ffbc9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused"
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.078869 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" event={"ID":"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a","Type":"ContainerStarted","Data":"44fcaa7d9066c5bc322cc3c475c2c95ffa382825c1c11ff0bdf59ba686b15693"}
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.080109 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48"
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.082870 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" event={"ID":"760c21cc-aac0-45ad-9d41-94ff93b92c44","Type":"ContainerStarted","Data":"4e2c9a289270adf41b0f5b354960fc59b8ab509feb18883435be1d79f378229a"}
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.082923 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" event={"ID":"760c21cc-aac0-45ad-9d41-94ff93b92c44","Type":"ContainerStarted","Data":"c032f33bdab4c47b70a3c3fb83a78557194d3681d2fe59a6004d2e5e8bf1bbc5"}
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.087793 4778 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2hr48 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body=
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.087862 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" podUID="f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused"
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.097291 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" event={"ID":"db81860d-bcb7-4a56-a935-544dbc4be29b","Type":"ContainerStarted","Data":"233e58e62c8d40d87963329725284bd0d629e6646b095fe46a9712b711f0c101"}
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.104980 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" event={"ID":"dd030212-5b03-4555-b885-388260b53588","Type":"ContainerStarted","Data":"9db719568e55e34bbbcd8687188ce7d4fc887351f3b43cbdeabef0886d1784d1"}
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.108357 4778 generic.go:334] "Generic (PLEG): container finished" podID="6cdf835c-58e4-4297-a247-690f407af22d" containerID="460b0e8d587363cdd40956bfc336c4113dd9579cf07f3fe9c6ec33feb5cbe8d0" exitCode=0
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.108415 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" event={"ID":"6cdf835c-58e4-4297-a247-690f407af22d","Type":"ContainerDied","Data":"460b0e8d587363cdd40956bfc336c4113dd9579cf07f3fe9c6ec33feb5cbe8d0"}
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.117168 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" podStartSLOduration=188.117153328 podStartE2EDuration="3m8.117153328s" podCreationTimestamp="2026-03-18 09:03:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:33.116576153 +0000 UTC m=+259.691320993" watchObservedRunningTime="2026-03-18 09:06:33.117153328 +0000 UTC m=+259.691898168"
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.142058 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-6frtc" podStartSLOduration=188.1420402 podStartE2EDuration="3m8.1420402s" podCreationTimestamp="2026-03-18 09:03:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:33.141909146 +0000 UTC m=+259.716654006" watchObservedRunningTime="2026-03-18 09:06:33.1420402 +0000 UTC m=+259.716785040"
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.149774 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" event={"ID":"46167450-7100-4ac9-a9dd-e678eb3d8677","Type":"ContainerStarted","Data":"c5603b8c9cf3c8d4db15b72d831907b4f4e8bf169cb9a4b076037f037d6f770e"}
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.160739 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" event={"ID":"c851677f-703c-404c-801c-064cc6bf3979","Type":"ContainerStarted","Data":"a98098a4667e740b4622df5432b1ded5001b5d2b904ba90ebd391d3be8a341e8"}
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.163376 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:33 crc kubenswrapper[4778]: E0318 09:06:33.166309 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:33.666262824 +0000 UTC m=+260.241007674 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.182248 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" event={"ID":"8ed2be9e-a493-4ce8-aee1-83f3ae258fba","Type":"ContainerStarted","Data":"3e5630c30cd6b602b691ad31aec81b3cbbbd0649c5615b1bdc162de71319ba73"}
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.224277 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-tnw27" podStartSLOduration=189.224259519 podStartE2EDuration="3m9.224259519s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:33.192904172 +0000 UTC m=+259.767649022" watchObservedRunningTime="2026-03-18 09:06:33.224259519 +0000 UTC m=+259.799004359"
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.225477 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lbkl2" podStartSLOduration=189.225470121 podStartE2EDuration="3m9.225470121s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:33.223250521 +0000 UTC m=+259.797995381" watchObservedRunningTime="2026-03-18 09:06:33.225470121 +0000 UTC m=+259.800214961"
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.225753 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" event={"ID":"0b244100-6e88-4ba2-b656-83b6e31d23c8","Type":"ContainerStarted","Data":"5cdf4d5e0704cfdc77fd7807adc7324c637483d9eaea5640cc19ba4e5e95b39b"}
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.248622 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" podStartSLOduration=189.248603375 podStartE2EDuration="3m9.248603375s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:33.248142443 +0000 UTC m=+259.822887293" watchObservedRunningTime="2026-03-18 09:06:33.248603375 +0000 UTC m=+259.823348215"
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.258664 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 09:06:33 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld
Mar 18 09:06:33 crc kubenswrapper[4778]: [+]process-running ok
Mar 18 09:06:33 crc kubenswrapper[4778]: healthz check failed
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.258715 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.259941 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s" event={"ID":"577d365f-ec95-4de4-a6a4-6752b2f0de56","Type":"ContainerStarted","Data":"657cdd1a7127ceffd000c9baa751dc82fbbc67cf208f9eae19b01b841a3c1ce2"}
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.264688 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" event={"ID":"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c","Type":"ContainerStarted","Data":"0c85e5526ff3d3aad837f9a0130f64206266064ce5fbb75b5de406069d316289"}
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.268149 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:33 crc kubenswrapper[4778]: E0318 09:06:33.271018 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:33.77099826 +0000 UTC m=+260.345743100 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.276682 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz"
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.303491 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" podStartSLOduration=189.303439516 podStartE2EDuration="3m9.303439516s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:33.29728644 +0000 UTC m=+259.872031280" watchObservedRunningTime="2026-03-18 09:06:33.303439516 +0000 UTC m=+259.878184356"
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.369495 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:33 crc kubenswrapper[4778]: E0318 09:06:33.370912 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:33.870864145 +0000 UTC m=+260.445608985 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.471367 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:33 crc kubenswrapper[4778]: E0318 09:06:33.472147 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:33.972120048 +0000 UTC m=+260.546864888 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.572723 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:33 crc kubenswrapper[4778]: E0318 09:06:33.573526 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:34.073501073 +0000 UTC m=+260.648245923 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.574977 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:33 crc kubenswrapper[4778]: E0318 09:06:33.575611 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:34.0755983 +0000 UTC m=+260.650343140 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.682873 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:33 crc kubenswrapper[4778]: E0318 09:06:33.683404 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:34.18338436 +0000 UTC m=+260.758129200 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.784407 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:33 crc kubenswrapper[4778]: E0318 09:06:33.785476 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:34.285459104 +0000 UTC m=+260.860203954 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.892025 4778 ???:1] "http: TLS handshake error from 192.168.126.11:51694: no serving certificate available for the kubelet" Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.892874 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:33 crc kubenswrapper[4778]: E0318 09:06:33.893046 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:34.393014336 +0000 UTC m=+260.967759176 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.893221 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:33 crc kubenswrapper[4778]: E0318 09:06:33.893537 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:34.393525091 +0000 UTC m=+260.968269931 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.935393 4778 ???:1] "http: TLS handshake error from 192.168.126.11:51700: no serving certificate available for the kubelet" Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.988467 4778 ???:1] "http: TLS handshake error from 192.168.126.11:51714: no serving certificate available for the kubelet" Mar 18 09:06:33 crc kubenswrapper[4778]: I0318 09:06:33.994177 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:33 crc kubenswrapper[4778]: E0318 09:06:33.994562 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:34.494542007 +0000 UTC m=+261.069286847 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.095976 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:34 crc kubenswrapper[4778]: E0318 09:06:34.096440 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:34.596420146 +0000 UTC m=+261.171165006 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.098635 4778 ???:1] "http: TLS handshake error from 192.168.126.11:51724: no serving certificate available for the kubelet" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.196823 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:34 crc kubenswrapper[4778]: E0318 09:06:34.197389 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:34.697147495 +0000 UTC m=+261.271892335 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.198387 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:34 crc kubenswrapper[4778]: E0318 09:06:34.199402 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:34.699390815 +0000 UTC m=+261.274135655 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.208725 4778 ???:1] "http: TLS handshake error from 192.168.126.11:51732: no serving certificate available for the kubelet" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.263174 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:34 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:34 crc kubenswrapper[4778]: [+]process-running ok Mar 18 09:06:34 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.263244 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.300917 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:34 crc kubenswrapper[4778]: E0318 09:06:34.301650 4778 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:34.801626384 +0000 UTC m=+261.376371224 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.353513 4778 ???:1] "http: TLS handshake error from 192.168.126.11:51746: no serving certificate available for the kubelet" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.368370 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn" event={"ID":"ba84f396-0169-4d5e-a126-60ac9d6d49f8","Type":"ContainerStarted","Data":"f48719cbac1747bc224535092fc3d4ad7429d42e033cd378c6065bf7d1519bf0"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.403861 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:34 crc kubenswrapper[4778]: E0318 09:06:34.405027 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 09:06:34.905014514 +0000 UTC m=+261.479759354 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.426792 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" event={"ID":"6c0fd619-d1c2-45e7-a7cf-e784b082428f","Type":"ContainerStarted","Data":"e2a2e8f9c8a6f8bc53374676c3f047ded50753ac8397e7b209e932507b5d89ff"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.451034 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" event={"ID":"dd030212-5b03-4555-b885-388260b53588","Type":"ContainerStarted","Data":"ab5434042669231d2fdb14802eec8a0590443d2bd3a9e65c7809088a2d527275"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.458676 4778 ???:1] "http: TLS handshake error from 192.168.126.11:51754: no serving certificate available for the kubelet" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.491510 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" event={"ID":"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a","Type":"ContainerStarted","Data":"90afc0123cf376859383c4f842d0587910af2413023c90f78391ff8bd752a9d6"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.492684 4778 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2hr48 container/marketplace-operator namespace/openshift-marketplace: 
Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.492724 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" podUID="f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.506877 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:34 crc kubenswrapper[4778]: E0318 09:06:34.512634 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:35.012611008 +0000 UTC m=+261.587355848 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.540903 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" event={"ID":"760c21cc-aac0-45ad-9d41-94ff93b92c44","Type":"ContainerStarted","Data":"f2e77ec04f3d30cbfdb3b3ad7a2a1c69dced720b5b7ef0ff2e17a63a024918aa"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.541266 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.573960 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jmmm2" event={"ID":"bfcc4e0d-0910-42c0-bcac-44c6aee8b74d","Type":"ContainerStarted","Data":"74c70a3ef48017cd5b04af0e4eaab181c5cbc27e8548f84ceffd37b86c746659"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.581299 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" event={"ID":"46167450-7100-4ac9-a9dd-e678eb3d8677","Type":"ContainerStarted","Data":"a3257953897e0bdf8b7d813090d9374915df516a5d6d321cb7c31e54733e6c34"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.609024 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" 
event={"ID":"9c6e16fb-c90d-4e0d-a57e-90a778a52f97","Type":"ContainerStarted","Data":"744bb67aaefa8b621cba0537144fe11c395b132cc4169af137251761fd3444de"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.610345 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.614990 4778 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-2kmb2 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.615043 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" podUID="9c6e16fb-c90d-4e0d-a57e-90a778a52f97" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.615445 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.615694 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.615708 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:34 crc 
kubenswrapper[4778]: E0318 09:06:34.617702 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:35.117689134 +0000 UTC m=+261.692433974 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.629337 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" event={"ID":"6e93d5ac-22fb-4d53-86c4-3262993f2116","Type":"ContainerStarted","Data":"81d08bac258f86cd207d727ee8984390039e2d5e7db40deb32f16fc2e0383e62"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.645146 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" event={"ID":"6cdf835c-58e4-4297-a247-690f407af22d","Type":"ContainerStarted","Data":"d6435614874e1ed1701b0989ff36d1da587619d778214ac17a7fda1ce1a089d0"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.645250 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.648251 4778 ???:1] "http: TLS handshake error from 192.168.126.11:51762: no serving certificate available for the kubelet" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.675293 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbw52" event={"ID":"e24d15f2-56e5-4fcc-91ab-370d7b4fb41e","Type":"ContainerStarted","Data":"f149547c282b14060ac7f4f965632cabec14a5eedeb495fbdb314b9a9c453412"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.675356 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbw52" event={"ID":"e24d15f2-56e5-4fcc-91ab-370d7b4fb41e","Type":"ContainerStarted","Data":"76f95dceb354314ec5566a2edabbce64ad7d245b0d1999d6af744d80ece10efc"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.701289 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" event={"ID":"d4d183f7-2762-458d-83f1-a8894c00bb82","Type":"ContainerStarted","Data":"d8e3b8f1eeb4e7cc07f6bcdb3906e975c0dd5306fa698d44586fa07838ab827b"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.702320 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.705494 4778 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-8xcgz container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.705546 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" podUID="d4d183f7-2762-458d-83f1-a8894c00bb82" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.716970 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:34 crc kubenswrapper[4778]: E0318 09:06:34.717530 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:35.217493547 +0000 UTC m=+261.792238387 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.728445 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nd44t" event={"ID":"0b244100-6e88-4ba2-b656-83b6e31d23c8","Type":"ContainerStarted","Data":"4d6f482e5067ec90c4ea0074fb58358982b91995483dd47289604a192d768c4e"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.734047 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wgbcp" event={"ID":"53e4e3b9-bd77-47d8-98d7-f79849a3fc4a","Type":"ContainerStarted","Data":"9085a1b00021acac631526ffa76ad8618e99f3800c908b6b2383d08dc568f119"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.734462 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-wgbcp" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 
09:06:34.758910 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" event={"ID":"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c","Type":"ContainerStarted","Data":"8e1aa81dfc8b447ee04c3b269f8dcbf8b941fbb0ec6570237b3b56acc075ba55"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.758994 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" event={"ID":"5d71c2f9-dbff-4dfc-a4eb-fa8e4bfcff9c","Type":"ContainerStarted","Data":"ac68bb4122316bd4d7fe7980d410d3c58a422fe23f763a5e088cd2b1cc0e1a91"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.770263 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" event={"ID":"db81860d-bcb7-4a56-a935-544dbc4be29b","Type":"ContainerStarted","Data":"fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.771272 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.773446 4778 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-x5dpv container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused" start-of-body= Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.773494 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" podUID="db81860d-bcb7-4a56-a935-544dbc4be29b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.780622 4778 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mlr7l" event={"ID":"75e6cce3-f879-4c6b-8ef3-8d2a4feecae1","Type":"ContainerStarted","Data":"79704d94de9aa153dd57524e4d810f280451536dd38cc2b552fcc48a1f9872f6"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.785101 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s" event={"ID":"577d365f-ec95-4de4-a6a4-6752b2f0de56","Type":"ContainerStarted","Data":"0176028c7410d8701508c5ab47491d104223de9feb14dbb0bacf2f787d088492"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.785131 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s" event={"ID":"577d365f-ec95-4de4-a6a4-6752b2f0de56","Type":"ContainerStarted","Data":"29451a09a06a417a40c1535b5d09b9a0a017c7f54f13ec834c02a89e57f76aa2"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.795406 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" event={"ID":"c851677f-703c-404c-801c-064cc6bf3979","Type":"ContainerStarted","Data":"c2a4c69de94a95a7dbd5dc126644dda94b3a295c364349e9470f5db1da3a11af"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.795484 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" event={"ID":"c851677f-703c-404c-801c-064cc6bf3979","Type":"ContainerStarted","Data":"3500daf72b80b03301eaa0dcbba09ffb87b9e843a02d241b91b303efcf884998"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.814050 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tnw27" event={"ID":"08b964cf-bfc5-4b90-83a3-0b358c3ffbc9","Type":"ContainerStarted","Data":"87ff1b2ebb4914fb3d5293578eb6aa3548e770b06889ecab84b124d550e03bd7"} Mar 
18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.815384 4778 patch_prober.go:28] interesting pod/downloads-7954f5f757-tnw27 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.815449 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tnw27" podUID="08b964cf-bfc5-4b90-83a3-0b358c3ffbc9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.819498 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:34 crc kubenswrapper[4778]: E0318 09:06:34.821100 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:35.321064363 +0000 UTC m=+261.895809383 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.826700 4778 generic.go:334] "Generic (PLEG): container finished" podID="97ee6937-a1a5-42ea-a460-29d54478e633" containerID="f1aaa8a2c1f96baaee4b7353f353a9b567635ea9eb73df19ffa50153f00a757d" exitCode=0 Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.826799 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" event={"ID":"97ee6937-a1a5-42ea-a460-29d54478e633","Type":"ContainerDied","Data":"f1aaa8a2c1f96baaee4b7353f353a9b567635ea9eb73df19ffa50153f00a757d"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.837539 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.853799 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" event={"ID":"533b6e54-efa5-4032-bebd-eedc39a834b8","Type":"ContainerStarted","Data":"de648b1451437bfd2cf21099e5ef5411d37ad2f6b7961bc2fa181e102884c69e"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.875288 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" event={"ID":"c0f3c490-ee49-4a88-893e-132592dd6d59","Type":"ContainerStarted","Data":"9e81a63a52fef7b87060ca33b437dfff7b2ec9de44df0d6374004cb744d97639"} Mar 18 09:06:34 crc 
kubenswrapper[4778]: I0318 09:06:34.902310 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" event={"ID":"650a32b4-d961-4805-8521-f1f24de6ad4a","Type":"ContainerStarted","Data":"30ddffa2e12edfbd4af225fd9e9418503669f077fddea1f2d41d174cad195c40"} Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.921885 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-q7qs8" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.922007 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7xphq" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.922652 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:34 crc kubenswrapper[4778]: E0318 09:06:34.924604 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:35.424575156 +0000 UTC m=+261.999319996 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.928617 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.937678 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-57msj" podStartSLOduration=189.937657918 podStartE2EDuration="3m9.937657918s" podCreationTimestamp="2026-03-18 09:03:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:34.929502638 +0000 UTC m=+261.504247498" watchObservedRunningTime="2026-03-18 09:06:34.937657918 +0000 UTC m=+261.512402758" Mar 18 09:06:34 crc kubenswrapper[4778]: I0318 09:06:34.976598 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" podStartSLOduration=190.976577659 podStartE2EDuration="3m10.976577659s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:34.970705571 +0000 UTC m=+261.545450421" watchObservedRunningTime="2026-03-18 09:06:34.976577659 +0000 UTC m=+261.551322499" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.009952 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb" podStartSLOduration=191.009925459 podStartE2EDuration="3m11.009925459s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.007185755 +0000 UTC m=+261.581930595" watchObservedRunningTime="2026-03-18 09:06:35.009925459 +0000 UTC m=+261.584670299" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.025100 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:35 crc kubenswrapper[4778]: E0318 09:06:35.027071 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:35.527052181 +0000 UTC m=+262.101797021 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.106952 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qtggn" podStartSLOduration=191.106933627 podStartE2EDuration="3m11.106933627s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.055479428 +0000 UTC m=+261.630224268" watchObservedRunningTime="2026-03-18 09:06:35.106933627 +0000 UTC m=+261.681678467" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.130810 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:35 crc kubenswrapper[4778]: E0318 09:06:35.131356 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:35.631332425 +0000 UTC m=+262.206077265 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.194032 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x8bgh" podStartSLOduration=191.194014888 podStartE2EDuration="3m11.194014888s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.141667354 +0000 UTC m=+261.716412204" watchObservedRunningTime="2026-03-18 09:06:35.194014888 +0000 UTC m=+261.768759718" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.221954 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jmmm2" podStartSLOduration=8.221936079 podStartE2EDuration="8.221936079s" podCreationTimestamp="2026-03-18 09:06:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.193514114 +0000 UTC m=+261.768258954" watchObservedRunningTime="2026-03-18 09:06:35.221936079 +0000 UTC m=+261.796680919" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.224724 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fr65n" podStartSLOduration=191.224711488 podStartE2EDuration="3m11.224711488s" podCreationTimestamp="2026-03-18 09:03:24 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.22374662 +0000 UTC m=+261.798491490" watchObservedRunningTime="2026-03-18 09:06:35.224711488 +0000 UTC m=+261.799456328" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.237022 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:35 crc kubenswrapper[4778]: E0318 09:06:35.237543 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:35.737528781 +0000 UTC m=+262.312273621 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.265810 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:35 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:35 crc kubenswrapper[4778]: [+]process-running ok Mar 18 09:06:35 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.266219 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.268671 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbw52" podStartSLOduration=191.268647613 podStartE2EDuration="3m11.268647613s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.26640881 +0000 UTC m=+261.841153670" watchObservedRunningTime="2026-03-18 09:06:35.268647613 +0000 UTC m=+261.843392453" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.325496 4778 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wjt2" podStartSLOduration=191.325474504 podStartE2EDuration="3m11.325474504s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.322650764 +0000 UTC m=+261.897395614" watchObservedRunningTime="2026-03-18 09:06:35.325474504 +0000 UTC m=+261.900219344" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.338660 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:35 crc kubenswrapper[4778]: E0318 09:06:35.339135 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:35.839117021 +0000 UTC m=+262.413861861 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.349643 4778 ???:1] "http: TLS handshake error from 192.168.126.11:51774: no serving certificate available for the kubelet" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.379444 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wgbcp" podStartSLOduration=8.379419842 podStartE2EDuration="8.379419842s" podCreationTimestamp="2026-03-18 09:06:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.364678805 +0000 UTC m=+261.939423655" watchObservedRunningTime="2026-03-18 09:06:35.379419842 +0000 UTC m=+261.954164682" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.440762 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:35 crc kubenswrapper[4778]: E0318 09:06:35.441160 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 09:06:35.941144422 +0000 UTC m=+262.515889262 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.465626 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-txb7s" podStartSLOduration=191.465594465 podStartE2EDuration="3m11.465594465s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.396540847 +0000 UTC m=+261.971285707" watchObservedRunningTime="2026-03-18 09:06:35.465594465 +0000 UTC m=+262.040339305" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.468352 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-t27k4" podStartSLOduration=191.468339223 podStartE2EDuration="3m11.468339223s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.464151863 +0000 UTC m=+262.038896723" watchObservedRunningTime="2026-03-18 09:06:35.468339223 +0000 UTC m=+262.043084063" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.549066 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:35 crc kubenswrapper[4778]: E0318 09:06:35.549525 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:36.049502023 +0000 UTC m=+262.624246863 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.629109 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2" podStartSLOduration=191.629085658 podStartE2EDuration="3m11.629085658s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.55012759 +0000 UTC m=+262.124872450" watchObservedRunningTime="2026-03-18 09:06:35.629085658 +0000 UTC m=+262.203830498" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.650761 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: 
\"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:35 crc kubenswrapper[4778]: E0318 09:06:35.651182 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:36.151162184 +0000 UTC m=+262.725907114 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.752252 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:35 crc kubenswrapper[4778]: E0318 09:06:35.752699 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:36.252678231 +0000 UTC m=+262.827423071 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.785247 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-mlr7l" podStartSLOduration=191.785226603 podStartE2EDuration="3m11.785226603s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.741538565 +0000 UTC m=+262.316283415" watchObservedRunningTime="2026-03-18 09:06:35.785226603 +0000 UTC m=+262.359971453" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.848958 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" podStartSLOduration=191.848937399 podStartE2EDuration="3m11.848937399s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.846344356 +0000 UTC m=+262.421089216" watchObservedRunningTime="2026-03-18 09:06:35.848937399 +0000 UTC m=+262.423682239" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.854083 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: 
\"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:35 crc kubenswrapper[4778]: E0318 09:06:35.854510 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:36.354497857 +0000 UTC m=+262.929242697 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.903040 4778 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-5vjr4 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.903095 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4" podUID="6d6ab3a6-da16-4fc8-9235-2c223661de30" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.951271 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kfkmc" 
podStartSLOduration=191.951251248 podStartE2EDuration="3m11.951251248s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:35.949607382 +0000 UTC m=+262.524352212" watchObservedRunningTime="2026-03-18 09:06:35.951251248 +0000 UTC m=+262.525996098" Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.960484 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:35 crc kubenswrapper[4778]: E0318 09:06:35.960876 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:36.460859501 +0000 UTC m=+263.035604341 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.961740 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mlr7l" event={"ID":"75e6cce3-f879-4c6b-8ef3-8d2a4feecae1","Type":"ContainerStarted","Data":"1586c0f8095bd4b0778e1eaa506a94ee357b4e41e6cce14f56aeb6979aa14e9e"}
Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.964073 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" event={"ID":"bead92a8-42de-4171-9c0c-790d64a6d14a","Type":"ContainerStarted","Data":"a73e7a24ecc52d2a54699601ef8af95aa314f9c148ba5b1bfe507fec80d5d712"}
Mar 18 09:06:35 crc kubenswrapper[4778]: I0318 09:06:35.994046 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wgbcp" event={"ID":"53e4e3b9-bd77-47d8-98d7-f79849a3fc4a","Type":"ContainerStarted","Data":"0f4f22aa2e3ff30f359fe0bdbfef9cbdb9bcb5f527d4e861914eba630b255cfa"}
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.013450 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" event={"ID":"6e93d5ac-22fb-4d53-86c4-3262993f2116","Type":"ContainerStarted","Data":"6b1790a943bd13c2e8ea1bfdef6a3662c2a1ab842f958c1845c6e6743882c61a"}
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.017525 4778 patch_prober.go:28] interesting pod/downloads-7954f5f757-tnw27 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body=
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.036138 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tnw27" podUID="08b964cf-bfc5-4b90-83a3-0b358c3ffbc9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused"
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.052995 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48"
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.056564 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2kmb2"
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.058397 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz"
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.059573 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv"
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.062810 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:36 crc kubenswrapper[4778]: E0318 09:06:36.077883 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:36.577859496 +0000 UTC m=+263.152604336 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.081607 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-w2lf2" podStartSLOduration=192.081569242 podStartE2EDuration="3m12.081569242s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:36.040612891 +0000 UTC m=+262.615357731" watchObservedRunningTime="2026-03-18 09:06:36.081569242 +0000 UTC m=+262.656314072"
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.158799 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xcgz" podStartSLOduration=192.15877778 podStartE2EDuration="3m12.15877778s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:36.108474224 +0000 UTC m=+262.683219074" watchObservedRunningTime="2026-03-18 09:06:36.15877778 +0000 UTC m=+262.733522620"
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.171641 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:36 crc kubenswrapper[4778]: E0318 09:06:36.173769 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:36.673742115 +0000 UTC m=+263.248486965 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.235889 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vjr4"
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.268497 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 09:06:36 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld
Mar 18 09:06:36 crc kubenswrapper[4778]: [+]process-running ok
Mar 18 09:06:36 crc kubenswrapper[4778]: healthz check failed
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.268563 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.273767 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:36 crc kubenswrapper[4778]: E0318 09:06:36.274533 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:36.774521691 +0000 UTC m=+263.349266531 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.381543 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:36 crc kubenswrapper[4778]: E0318 09:06:36.382577 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:36.882556242 +0000 UTC m=+263.457301082 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.473453 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zf5g" podStartSLOduration=192.473434948 podStartE2EDuration="3m12.473434948s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:36.365762516 +0000 UTC m=+262.940507356" watchObservedRunningTime="2026-03-18 09:06:36.473434948 +0000 UTC m=+263.048179788"
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.486706 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:36 crc kubenswrapper[4778]: E0318 09:06:36.487331 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:36.987320251 +0000 UTC m=+263.562065091 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.491278 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" podStartSLOduration=192.491257133 podStartE2EDuration="3m12.491257133s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:36.485769457 +0000 UTC m=+263.060514307" watchObservedRunningTime="2026-03-18 09:06:36.491257133 +0000 UTC m=+263.066001973"
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.494449 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hvvlz"]
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.587803 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:36 crc kubenswrapper[4778]: E0318 09:06:36.588262 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:37.088243472 +0000 UTC m=+263.662988312 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.601153 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz"]
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.601352 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" podUID="6a8d0909-d7da-49bd-bd5b-0f3ca5a61637" containerName="route-controller-manager" containerID="cri-o://e9257ea8202297fa6d8a75a0ad4f78bf4d5ee04b58490bb83b63980f5daf0cc8" gracePeriod=30
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.672909 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl"
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.690225 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:36 crc kubenswrapper[4778]: E0318 09:06:36.690649 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:37.190632363 +0000 UTC m=+263.765377213 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.774764 4778 ???:1] "http: TLS handshake error from 192.168.126.11:51778: no serving certificate available for the kubelet"
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.790734 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9n22\" (UniqueName: \"kubernetes.io/projected/97ee6937-a1a5-42ea-a460-29d54478e633-kube-api-access-h9n22\") pod \"97ee6937-a1a5-42ea-a460-29d54478e633\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") "
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.790832 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97ee6937-a1a5-42ea-a460-29d54478e633-secret-volume\") pod \"97ee6937-a1a5-42ea-a460-29d54478e633\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") "
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.790900 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97ee6937-a1a5-42ea-a460-29d54478e633-config-volume\") pod \"97ee6937-a1a5-42ea-a460-29d54478e633\" (UID: \"97ee6937-a1a5-42ea-a460-29d54478e633\") "
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.791075 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:36 crc kubenswrapper[4778]: E0318 09:06:36.791429 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:37.291411759 +0000 UTC m=+263.866156589 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.792154 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97ee6937-a1a5-42ea-a460-29d54478e633-config-volume" (OuterVolumeSpecName: "config-volume") pod "97ee6937-a1a5-42ea-a460-29d54478e633" (UID: "97ee6937-a1a5-42ea-a460-29d54478e633"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.815911 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97ee6937-a1a5-42ea-a460-29d54478e633-kube-api-access-h9n22" (OuterVolumeSpecName: "kube-api-access-h9n22") pod "97ee6937-a1a5-42ea-a460-29d54478e633" (UID: "97ee6937-a1a5-42ea-a460-29d54478e633"). InnerVolumeSpecName "kube-api-access-h9n22". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.817133 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97ee6937-a1a5-42ea-a460-29d54478e633-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "97ee6937-a1a5-42ea-a460-29d54478e633" (UID: "97ee6937-a1a5-42ea-a460-29d54478e633"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.893311 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.893681 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97ee6937-a1a5-42ea-a460-29d54478e633-config-volume\") on node \"crc\" DevicePath \"\""
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.893697 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9n22\" (UniqueName: \"kubernetes.io/projected/97ee6937-a1a5-42ea-a460-29d54478e633-kube-api-access-h9n22\") on node \"crc\" DevicePath \"\""
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.893710 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97ee6937-a1a5-42ea-a460-29d54478e633-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 18 09:06:36 crc kubenswrapper[4778]: E0318 09:06:36.893984 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:37.393972516 +0000 UTC m=+263.968717356 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.894485 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8f6hb"
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.977021 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tbbtb"]
Mar 18 09:06:36 crc kubenswrapper[4778]: E0318 09:06:36.977242 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ee6937-a1a5-42ea-a460-29d54478e633" containerName="collect-profiles"
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.977255 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="97ee6937-a1a5-42ea-a460-29d54478e633" containerName="collect-profiles"
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.977344 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="97ee6937-a1a5-42ea-a460-29d54478e633" containerName="collect-profiles"
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.977950 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tbbtb"
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.992244 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 18 09:06:36 crc kubenswrapper[4778]: I0318 09:06:36.995949 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:36 crc kubenswrapper[4778]: E0318 09:06:36.996325 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:37.496304996 +0000 UTC m=+264.071049836 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.021429 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tbbtb"]
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.027995 4778 generic.go:334] "Generic (PLEG): container finished" podID="6a8d0909-d7da-49bd-bd5b-0f3ca5a61637" containerID="e9257ea8202297fa6d8a75a0ad4f78bf4d5ee04b58490bb83b63980f5daf0cc8" exitCode=0
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.028058 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" event={"ID":"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637","Type":"ContainerDied","Data":"e9257ea8202297fa6d8a75a0ad4f78bf4d5ee04b58490bb83b63980f5daf0cc8"}
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.035691 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl"
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.037561 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl" event={"ID":"97ee6937-a1a5-42ea-a460-29d54478e633","Type":"ContainerDied","Data":"ea27939aa8b795cbb05c9ef86fd0c0cedd8519701e620ecc91f86b4b95a08fc2"}
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.037609 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea27939aa8b795cbb05c9ef86fd0c0cedd8519701e620ecc91f86b4b95a08fc2"
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.062897 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" podUID="496a64ab-b670-4201-9238-d60415ccba17" containerName="controller-manager" containerID="cri-o://f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0" gracePeriod=30
Mar 18 09:06:37 crc kubenswrapper[4778]: E0318 09:06:37.106754 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:37.606735576 +0000 UTC m=+264.181480416 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.109281 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.109755 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-utilities\") pod \"community-operators-tbbtb\" (UID: \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\") " pod="openshift-marketplace/community-operators-tbbtb"
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.110014 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6c42\" (UniqueName: \"kubernetes.io/projected/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-kube-api-access-p6c42\") pod \"community-operators-tbbtb\" (UID: \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\") " pod="openshift-marketplace/community-operators-tbbtb"
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.117694 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-catalog-content\") pod \"community-operators-tbbtb\" (UID: \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\") " pod="openshift-marketplace/community-operators-tbbtb"
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.145008 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qvn4w"]
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.146172 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qvn4w"
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.149460 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.166671 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qvn4w"]
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.220751 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.221022 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-catalog-content\") pod \"community-operators-tbbtb\" (UID: \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\") " pod="openshift-marketplace/community-operators-tbbtb"
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.221069 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-utilities\") pod \"community-operators-tbbtb\" (UID: \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\") " pod="openshift-marketplace/community-operators-tbbtb"
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.221115 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6c42\" (UniqueName: \"kubernetes.io/projected/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-kube-api-access-p6c42\") pod \"community-operators-tbbtb\" (UID: \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\") " pod="openshift-marketplace/community-operators-tbbtb"
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.221139 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjz8f\" (UniqueName: \"kubernetes.io/projected/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-kube-api-access-vjz8f\") pod \"certified-operators-qvn4w\" (UID: \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\") " pod="openshift-marketplace/certified-operators-qvn4w"
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.221158 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-catalog-content\") pod \"certified-operators-qvn4w\" (UID: \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\") " pod="openshift-marketplace/certified-operators-qvn4w"
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.221177 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-utilities\") pod \"certified-operators-qvn4w\" (UID: \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\") " pod="openshift-marketplace/certified-operators-qvn4w"
Mar 18 09:06:37 crc kubenswrapper[4778]: E0318 09:06:37.221335 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:37.721316303 +0000 UTC m=+264.296061143 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.221710 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-catalog-content\") pod \"community-operators-tbbtb\" (UID: \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\") " pod="openshift-marketplace/community-operators-tbbtb"
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.221915 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-utilities\") pod \"community-operators-tbbtb\" (UID: \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\") " pod="openshift-marketplace/community-operators-tbbtb"
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.266761 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6c42\" (UniqueName: \"kubernetes.io/projected/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-kube-api-access-p6c42\") pod \"community-operators-tbbtb\" (UID: \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\") " pod="openshift-marketplace/community-operators-tbbtb"
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.269342 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 09:06:37 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld
Mar 18 09:06:37 crc kubenswrapper[4778]: [+]process-running ok
Mar 18 09:06:37 crc kubenswrapper[4778]: healthz check failed
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.269396 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.290818 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lzrtd"]
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.291870 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lzrtd"
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.296467 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lzrtd"]
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.308557 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tbbtb"
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.343063 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.343374 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjz8f\" (UniqueName: \"kubernetes.io/projected/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-kube-api-access-vjz8f\") pod \"certified-operators-qvn4w\" (UID: \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\") " pod="openshift-marketplace/certified-operators-qvn4w"
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.343393 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-catalog-content\") pod \"certified-operators-qvn4w\" (UID: \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\") " pod="openshift-marketplace/certified-operators-qvn4w"
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.343425 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-utilities\") pod \"certified-operators-qvn4w\" (UID: \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\") " pod="openshift-marketplace/certified-operators-qvn4w"
Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.343821 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-utilities\") pod \"certified-operators-qvn4w\"
(UID: \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\") " pod="openshift-marketplace/certified-operators-qvn4w" Mar 18 09:06:37 crc kubenswrapper[4778]: E0318 09:06:37.344092 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:37.844075552 +0000 UTC m=+264.418820392 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.355871 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-catalog-content\") pod \"certified-operators-qvn4w\" (UID: \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\") " pod="openshift-marketplace/certified-operators-qvn4w" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.384693 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjz8f\" (UniqueName: \"kubernetes.io/projected/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-kube-api-access-vjz8f\") pod \"certified-operators-qvn4w\" (UID: \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\") " pod="openshift-marketplace/certified-operators-qvn4w" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.448911 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.449126 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3618fc0f-e8b2-4476-a24d-662165a04ecc-utilities\") pod \"community-operators-lzrtd\" (UID: \"3618fc0f-e8b2-4476-a24d-662165a04ecc\") " pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.449172 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfhg5\" (UniqueName: \"kubernetes.io/projected/3618fc0f-e8b2-4476-a24d-662165a04ecc-kube-api-access-nfhg5\") pod \"community-operators-lzrtd\" (UID: \"3618fc0f-e8b2-4476-a24d-662165a04ecc\") " pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.449210 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3618fc0f-e8b2-4476-a24d-662165a04ecc-catalog-content\") pod \"community-operators-lzrtd\" (UID: \"3618fc0f-e8b2-4476-a24d-662165a04ecc\") " pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:06:37 crc kubenswrapper[4778]: E0318 09:06:37.449377 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:37.949360546 +0000 UTC m=+264.524105376 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.485647 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zwknx"] Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.492812 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.492891 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.501011 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zwknx"] Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.540664 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qvn4w" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.551134 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfhg5\" (UniqueName: \"kubernetes.io/projected/3618fc0f-e8b2-4476-a24d-662165a04ecc-kube-api-access-nfhg5\") pod \"community-operators-lzrtd\" (UID: \"3618fc0f-e8b2-4476-a24d-662165a04ecc\") " pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.551179 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3618fc0f-e8b2-4476-a24d-662165a04ecc-catalog-content\") pod \"community-operators-lzrtd\" (UID: \"3618fc0f-e8b2-4476-a24d-662165a04ecc\") " pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.551306 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.551337 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3618fc0f-e8b2-4476-a24d-662165a04ecc-utilities\") pod \"community-operators-lzrtd\" (UID: \"3618fc0f-e8b2-4476-a24d-662165a04ecc\") " pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.551736 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3618fc0f-e8b2-4476-a24d-662165a04ecc-utilities\") pod \"community-operators-lzrtd\" 
(UID: \"3618fc0f-e8b2-4476-a24d-662165a04ecc\") " pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.552423 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3618fc0f-e8b2-4476-a24d-662165a04ecc-catalog-content\") pod \"community-operators-lzrtd\" (UID: \"3618fc0f-e8b2-4476-a24d-662165a04ecc\") " pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:06:37 crc kubenswrapper[4778]: E0318 09:06:37.552637 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.052618813 +0000 UTC m=+264.627363643 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.582146 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfhg5\" (UniqueName: \"kubernetes.io/projected/3618fc0f-e8b2-4476-a24d-662165a04ecc-kube-api-access-nfhg5\") pod \"community-operators-lzrtd\" (UID: \"3618fc0f-e8b2-4476-a24d-662165a04ecc\") " pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.660788 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc9zp\" (UniqueName: \"kubernetes.io/projected/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-kube-api-access-hc9zp\") pod 
\"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.660897 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-config\") pod \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.660930 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-client-ca\") pod \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.660960 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-serving-cert\") pod \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\" (UID: \"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637\") " Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.661097 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.661289 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-utilities\") pod \"certified-operators-zwknx\" (UID: \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\") " pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.661401 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mff6k\" (UniqueName: \"kubernetes.io/projected/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-kube-api-access-mff6k\") pod \"certified-operators-zwknx\" (UID: \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\") " pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.661445 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-catalog-content\") pod \"certified-operators-zwknx\" (UID: \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\") " pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:06:37 crc kubenswrapper[4778]: E0318 09:06:37.661601 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.161579991 +0000 UTC m=+264.736324831 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.662544 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-client-ca" (OuterVolumeSpecName: "client-ca") pod "6a8d0909-d7da-49bd-bd5b-0f3ca5a61637" (UID: "6a8d0909-d7da-49bd-bd5b-0f3ca5a61637"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.673890 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-kube-api-access-hc9zp" (OuterVolumeSpecName: "kube-api-access-hc9zp") pod "6a8d0909-d7da-49bd-bd5b-0f3ca5a61637" (UID: "6a8d0909-d7da-49bd-bd5b-0f3ca5a61637"). InnerVolumeSpecName "kube-api-access-hc9zp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.674185 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.680599 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-config" (OuterVolumeSpecName: "config") pod "6a8d0909-d7da-49bd-bd5b-0f3ca5a61637" (UID: "6a8d0909-d7da-49bd-bd5b-0f3ca5a61637"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.692450 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6a8d0909-d7da-49bd-bd5b-0f3ca5a61637" (UID: "6a8d0909-d7da-49bd-bd5b-0f3ca5a61637"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.719250 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.768684 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-config\") pod \"496a64ab-b670-4201-9238-d60415ccba17\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.768767 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-proxy-ca-bundles\") pod \"496a64ab-b670-4201-9238-d60415ccba17\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.768792 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496a64ab-b670-4201-9238-d60415ccba17-serving-cert\") pod \"496a64ab-b670-4201-9238-d60415ccba17\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.768835 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-df4jz\" (UniqueName: \"kubernetes.io/projected/496a64ab-b670-4201-9238-d60415ccba17-kube-api-access-df4jz\") pod \"496a64ab-b670-4201-9238-d60415ccba17\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.768880 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-client-ca\") pod \"496a64ab-b670-4201-9238-d60415ccba17\" (UID: \"496a64ab-b670-4201-9238-d60415ccba17\") " Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.769040 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-catalog-content\") pod \"certified-operators-zwknx\" (UID: \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\") " pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.769083 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.769119 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-utilities\") pod \"certified-operators-zwknx\" (UID: \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\") " pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.769222 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mff6k\" (UniqueName: \"kubernetes.io/projected/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-kube-api-access-mff6k\") pod \"certified-operators-zwknx\" (UID: \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\") " pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.769262 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.769273 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-client-ca\") on node \"crc\" 
DevicePath \"\"" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.769282 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.769291 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc9zp\" (UniqueName: \"kubernetes.io/projected/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637-kube-api-access-hc9zp\") on node \"crc\" DevicePath \"\"" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.770459 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-config" (OuterVolumeSpecName: "config") pod "496a64ab-b670-4201-9238-d60415ccba17" (UID: "496a64ab-b670-4201-9238-d60415ccba17"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.770936 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "496a64ab-b670-4201-9238-d60415ccba17" (UID: "496a64ab-b670-4201-9238-d60415ccba17"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.773751 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-catalog-content\") pod \"certified-operators-zwknx\" (UID: \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\") " pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:06:37 crc kubenswrapper[4778]: E0318 09:06:37.774128 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.27410925 +0000 UTC m=+264.848854090 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.774208 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-client-ca" (OuterVolumeSpecName: "client-ca") pod "496a64ab-b670-4201-9238-d60415ccba17" (UID: "496a64ab-b670-4201-9238-d60415ccba17"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.774490 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-utilities\") pod \"certified-operators-zwknx\" (UID: \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\") " pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.780742 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496a64ab-b670-4201-9238-d60415ccba17-kube-api-access-df4jz" (OuterVolumeSpecName: "kube-api-access-df4jz") pod "496a64ab-b670-4201-9238-d60415ccba17" (UID: "496a64ab-b670-4201-9238-d60415ccba17"). InnerVolumeSpecName "kube-api-access-df4jz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.781619 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496a64ab-b670-4201-9238-d60415ccba17-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496a64ab-b670-4201-9238-d60415ccba17" (UID: "496a64ab-b670-4201-9238-d60415ccba17"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.781893 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx"] Mar 18 09:06:37 crc kubenswrapper[4778]: E0318 09:06:37.782122 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a8d0909-d7da-49bd-bd5b-0f3ca5a61637" containerName="route-controller-manager" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.782135 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a8d0909-d7da-49bd-bd5b-0f3ca5a61637" containerName="route-controller-manager" Mar 18 09:06:37 crc kubenswrapper[4778]: E0318 09:06:37.782150 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="496a64ab-b670-4201-9238-d60415ccba17" containerName="controller-manager" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.782156 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="496a64ab-b670-4201-9238-d60415ccba17" containerName="controller-manager" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.795764 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a8d0909-d7da-49bd-bd5b-0f3ca5a61637" containerName="route-controller-manager" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.795830 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="496a64ab-b670-4201-9238-d60415ccba17" containerName="controller-manager" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.796363 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.812568 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx"] Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.826829 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mff6k\" (UniqueName: \"kubernetes.io/projected/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-kube-api-access-mff6k\") pod \"certified-operators-zwknx\" (UID: \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\") " pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.867590 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tbbtb"] Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.871829 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.872112 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d5kq\" (UniqueName: \"kubernetes.io/projected/b379a820-627d-403c-b50b-b6fbea94aa65-kube-api-access-6d5kq\") pod \"route-controller-manager-67677f775c-zxrmx\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.872139 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b379a820-627d-403c-b50b-b6fbea94aa65-config\") pod \"route-controller-manager-67677f775c-zxrmx\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.872159 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b379a820-627d-403c-b50b-b6fbea94aa65-serving-cert\") pod \"route-controller-manager-67677f775c-zxrmx\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.872296 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b379a820-627d-403c-b50b-b6fbea94aa65-client-ca\") pod \"route-controller-manager-67677f775c-zxrmx\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.872337 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.872353 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.872362 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496a64ab-b670-4201-9238-d60415ccba17-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 
09:06:37.872371 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-df4jz\" (UniqueName: \"kubernetes.io/projected/496a64ab-b670-4201-9238-d60415ccba17-kube-api-access-df4jz\") on node \"crc\" DevicePath \"\"" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.872382 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/496a64ab-b670-4201-9238-d60415ccba17-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:06:37 crc kubenswrapper[4778]: E0318 09:06:37.872460 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.372443777 +0000 UTC m=+264.947188617 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:37 crc kubenswrapper[4778]: W0318 09:06:37.932513 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4cfa2f4_0114_46ae_a89f_3b2eac3ea0fa.slice/crio-a612db4db085f0867c6a8718b7e590183f11921029875dedeb88f07468c98457 WatchSource:0}: Error finding container a612db4db085f0867c6a8718b7e590183f11921029875dedeb88f07468c98457: Status 404 returned error can't find the container with id a612db4db085f0867c6a8718b7e590183f11921029875dedeb88f07468c98457 Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.940819 4778 plugin_watcher.go:194] "Adding socket path or 
updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.972916 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b379a820-627d-403c-b50b-b6fbea94aa65-client-ca\") pod \"route-controller-manager-67677f775c-zxrmx\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.972968 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d5kq\" (UniqueName: \"kubernetes.io/projected/b379a820-627d-403c-b50b-b6fbea94aa65-kube-api-access-6d5kq\") pod \"route-controller-manager-67677f775c-zxrmx\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.972993 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b379a820-627d-403c-b50b-b6fbea94aa65-config\") pod \"route-controller-manager-67677f775c-zxrmx\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.973014 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b379a820-627d-403c-b50b-b6fbea94aa65-serving-cert\") pod \"route-controller-manager-67677f775c-zxrmx\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.973061 4778 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:37 crc kubenswrapper[4778]: E0318 09:06:37.973342 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.473330796 +0000 UTC m=+265.048075636 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.974750 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b379a820-627d-403c-b50b-b6fbea94aa65-client-ca\") pod \"route-controller-manager-67677f775c-zxrmx\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.988913 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b379a820-627d-403c-b50b-b6fbea94aa65-serving-cert\") pod \"route-controller-manager-67677f775c-zxrmx\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:37 
crc kubenswrapper[4778]: I0318 09:06:37.989158 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b379a820-627d-403c-b50b-b6fbea94aa65-config\") pod \"route-controller-manager-67677f775c-zxrmx\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:37 crc kubenswrapper[4778]: I0318 09:06:37.991469 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d5kq\" (UniqueName: \"kubernetes.io/projected/b379a820-627d-403c-b50b-b6fbea94aa65-kube-api-access-6d5kq\") pod \"route-controller-manager-67677f775c-zxrmx\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") " pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.028267 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lzrtd"] Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.055698 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qvn4w"] Mar 18 09:06:38 crc kubenswrapper[4778]: W0318 09:06:38.058669 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3618fc0f_e8b2_4476_a24d_662165a04ecc.slice/crio-f49ddb65c41adfb45d65de1198b0f82574583119a6d4929d1ffa55ce9f770dcc WatchSource:0}: Error finding container f49ddb65c41adfb45d65de1198b0f82574583119a6d4929d1ffa55ce9f770dcc: Status 404 returned error can't find the container with id f49ddb65c41adfb45d65de1198b0f82574583119a6d4929d1ffa55ce9f770dcc Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.064489 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbbtb" 
event={"ID":"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa","Type":"ContainerStarted","Data":"a612db4db085f0867c6a8718b7e590183f11921029875dedeb88f07468c98457"} Mar 18 09:06:38 crc kubenswrapper[4778]: W0318 09:06:38.067788 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded8eaf37_d7fe_43d1_8d20_fffdd71748cc.slice/crio-5c2b4b3fdbc29641b0fd4b628d894c39c32fe66c2451fed6064a04a8d6f0eddd WatchSource:0}: Error finding container 5c2b4b3fdbc29641b0fd4b628d894c39c32fe66c2451fed6064a04a8d6f0eddd: Status 404 returned error can't find the container with id 5c2b4b3fdbc29641b0fd4b628d894c39c32fe66c2451fed6064a04a8d6f0eddd Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.069805 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" event={"ID":"6a8d0909-d7da-49bd-bd5b-0f3ca5a61637","Type":"ContainerDied","Data":"d169670b40d53da31764b4cc4e5df28c3ae9d827ab8db481e9ff9be957e4d6a9"} Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.069849 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.069878 4778 scope.go:117] "RemoveContainer" containerID="e9257ea8202297fa6d8a75a0ad4f78bf4d5ee04b58490bb83b63980f5daf0cc8" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.073703 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:38 crc kubenswrapper[4778]: E0318 09:06:38.073976 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.573944147 +0000 UTC m=+265.148688987 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.074047 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:38 crc kubenswrapper[4778]: E0318 09:06:38.075234 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.575144911 +0000 UTC m=+265.149889741 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.077666 4778 generic.go:334] "Generic (PLEG): container finished" podID="496a64ab-b670-4201-9238-d60415ccba17" containerID="f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0" exitCode=0 Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.077857 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.077863 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" event={"ID":"496a64ab-b670-4201-9238-d60415ccba17","Type":"ContainerDied","Data":"f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0"} Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.077933 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hvvlz" event={"ID":"496a64ab-b670-4201-9238-d60415ccba17","Type":"ContainerDied","Data":"eca604f85fcfe59c73e6a0d9a12120a2d10108fa64f5bc3107b7c718f96ed398"} Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.091500 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" event={"ID":"bead92a8-42de-4171-9c0c-790d64a6d14a","Type":"ContainerStarted","Data":"12ed8de0e78c44d58d65f6ebd30bafe5fcfe8674978d824ea5bd048d38599130"} Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.116462 4778 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.121537 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz"] Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.124000 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsqlz"] Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.136069 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hvvlz"] Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.136643 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hvvlz"] Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.146177 4778 scope.go:117] "RemoveContainer" containerID="f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.148382 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.175787 4778 scope.go:117] "RemoveContainer" containerID="f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.175869 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:38 crc kubenswrapper[4778]: E0318 09:06:38.176090 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.676064511 +0000 UTC m=+265.250809351 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.176250 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:38 crc kubenswrapper[4778]: E0318 09:06:38.177209 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0\": container with ID starting with f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0 not found: ID does not exist" containerID="f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0" Mar 18 09:06:38 crc kubenswrapper[4778]: E0318 09:06:38.177269 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.677256005 +0000 UTC m=+265.252000845 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.177318 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0"} err="failed to get container status \"f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0\": rpc error: code = NotFound desc = could not find container \"f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0\": container with ID starting with f25176deafd4eee42c27d316f399c7b168c2e90f46a65e8baeeb1be52dac1df0 not found: ID does not exist" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.218314 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496a64ab-b670-4201-9238-d60415ccba17" path="/var/lib/kubelet/pods/496a64ab-b670-4201-9238-d60415ccba17/volumes" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.218835 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a8d0909-d7da-49bd-bd5b-0f3ca5a61637" path="/var/lib/kubelet/pods/6a8d0909-d7da-49bd-bd5b-0f3ca5a61637/volumes" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.257255 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:38 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:38 crc kubenswrapper[4778]: 
[+]process-running ok Mar 18 09:06:38 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.257309 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.278378 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:38 crc kubenswrapper[4778]: E0318 09:06:38.278574 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.778544566 +0000 UTC m=+265.353289406 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.278877 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:38 crc kubenswrapper[4778]: E0318 09:06:38.279324 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.779306037 +0000 UTC m=+265.354050877 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.379924 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:38 crc kubenswrapper[4778]: E0318 09:06:38.380164 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.880118394 +0000 UTC m=+265.454863224 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.381040 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:38 crc kubenswrapper[4778]: E0318 09:06:38.381495 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 09:06:38.881485004 +0000 UTC m=+265.456229844 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nkbq4" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.438739 4778 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.438775 4778 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.433038 4778 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-18T09:06:37.940856685Z","Handler":null,"Name":""} Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.445141 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zwknx"] Mar 18 09:06:38 crc kubenswrapper[4778]: W0318 09:06:38.461955 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23b7607d_fa16_45c1_a0cb_c5ec39a288fb.slice/crio-fd97a190415e4d1219ea6675fa83baba068c5459e6a2c0b39450f9d62ba9f267 WatchSource:0}: Error finding container fd97a190415e4d1219ea6675fa83baba068c5459e6a2c0b39450f9d62ba9f267: Status 404 returned error can't find the container with id fd97a190415e4d1219ea6675fa83baba068c5459e6a2c0b39450f9d62ba9f267 Mar 18 09:06:38 crc kubenswrapper[4778]: 
I0318 09:06:38.481131 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx"] Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.481779 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.488671 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.583627 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.586270 4778 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.586312 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.608803 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nkbq4\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.763793 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8bc989ddd-wh99s"] Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.765275 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.770425 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.770958 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.771082 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.771275 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.772656 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.774668 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.779274 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8bc989ddd-wh99s"] Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.780034 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.789899 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2110837-0c54-448f-8b94-68bdea470d14-serving-cert\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " 
pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.789954 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-config\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.789986 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb5z6\" (UniqueName: \"kubernetes.io/projected/c2110837-0c54-448f-8b94-68bdea470d14-kube-api-access-xb5z6\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.790097 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-client-ca\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.790131 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-proxy-ca-bundles\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.882334 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.892251 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-client-ca\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.892327 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-proxy-ca-bundles\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.892375 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2110837-0c54-448f-8b94-68bdea470d14-serving-cert\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.892409 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-config\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.892489 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb5z6\" (UniqueName: 
\"kubernetes.io/projected/c2110837-0c54-448f-8b94-68bdea470d14-kube-api-access-xb5z6\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.894815 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-client-ca\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.899111 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-proxy-ca-bundles\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.900363 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-config\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.920756 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2110837-0c54-448f-8b94-68bdea470d14-serving-cert\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:38 crc kubenswrapper[4778]: I0318 09:06:38.933465 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-xb5z6\" (UniqueName: \"kubernetes.io/projected/c2110837-0c54-448f-8b94-68bdea470d14-kube-api-access-xb5z6\") pod \"controller-manager-8bc989ddd-wh99s\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") " pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.074996 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6qgm2"] Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.077118 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.081217 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.107937 4778 generic.go:334] "Generic (PLEG): container finished" podID="b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" containerID="e143a776ed51bb64025b24b3e1cc128e2a2ca67730b9a34f438ed6857f8be065" exitCode=0 Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.108048 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbbtb" event={"ID":"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa","Type":"ContainerDied","Data":"e143a776ed51bb64025b24b3e1cc128e2a2ca67730b9a34f438ed6857f8be065"} Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.116143 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" event={"ID":"b379a820-627d-403c-b50b-b6fbea94aa65","Type":"ContainerStarted","Data":"c52ef847baaab8c966dfa357e5150a1446e9ecf64dd0cf070d104b1c9769de57"} Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.116309 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" 
event={"ID":"b379a820-627d-403c-b50b-b6fbea94aa65","Type":"ContainerStarted","Data":"6e0b29785a4e6e0a0788e6fe250c71b8bb1b8d6f56dea6368383dd453f7f7456"} Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.116743 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.120874 4778 generic.go:334] "Generic (PLEG): container finished" podID="ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" containerID="9e9b1baa8deb4596f595ec2a830346f2addf7d69c909efa6643ba0c90cdd01c7" exitCode=0 Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.121054 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qvn4w" event={"ID":"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc","Type":"ContainerDied","Data":"9e9b1baa8deb4596f595ec2a830346f2addf7d69c909efa6643ba0c90cdd01c7"} Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.121106 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qvn4w" event={"ID":"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc","Type":"ContainerStarted","Data":"5c2b4b3fdbc29641b0fd4b628d894c39c32fe66c2451fed6064a04a8d6f0eddd"} Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.133781 4778 generic.go:334] "Generic (PLEG): container finished" podID="3618fc0f-e8b2-4476-a24d-662165a04ecc" containerID="33f31c9dc67a1137fd25116299f097d08a1ad1b4a3924bc1eb5ff8d0db0c9727" exitCode=0 Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.133927 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzrtd" event={"ID":"3618fc0f-e8b2-4476-a24d-662165a04ecc","Type":"ContainerDied","Data":"33f31c9dc67a1137fd25116299f097d08a1ad1b4a3924bc1eb5ff8d0db0c9727"} Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.134002 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-lzrtd" event={"ID":"3618fc0f-e8b2-4476-a24d-662165a04ecc","Type":"ContainerStarted","Data":"f49ddb65c41adfb45d65de1198b0f82574583119a6d4929d1ffa55ce9f770dcc"} Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.142285 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6qgm2"] Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.151319 4778 generic.go:334] "Generic (PLEG): container finished" podID="23b7607d-fa16-45c1-a0cb-c5ec39a288fb" containerID="44d1d0b1ecaf0bd45db18a8ca3c0502c00748ea75b870a51131c12eecf1aa1f8" exitCode=0 Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.151395 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zwknx" event={"ID":"23b7607d-fa16-45c1-a0cb-c5ec39a288fb","Type":"ContainerDied","Data":"44d1d0b1ecaf0bd45db18a8ca3c0502c00748ea75b870a51131c12eecf1aa1f8"} Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.151423 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zwknx" event={"ID":"23b7607d-fa16-45c1-a0cb-c5ec39a288fb","Type":"ContainerStarted","Data":"fd97a190415e4d1219ea6675fa83baba068c5459e6a2c0b39450f9d62ba9f267"} Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.159343 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" podStartSLOduration=2.159328988 podStartE2EDuration="2.159328988s" podCreationTimestamp="2026-03-18 09:06:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:39.157249009 +0000 UTC m=+265.731993879" watchObservedRunningTime="2026-03-18 09:06:39.159328988 +0000 UTC m=+265.734073848" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.162682 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.167841 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" event={"ID":"bead92a8-42de-4171-9c0c-790d64a6d14a","Type":"ContainerStarted","Data":"274b846862188c4e07460e1a5f36c4313d3d9dfe5281291dc27aa0c023580054"} Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.167902 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" event={"ID":"bead92a8-42de-4171-9c0c-790d64a6d14a","Type":"ContainerStarted","Data":"da66d81f59ca90001e0fc3ec7bd56b07c7dfea1eaff294030a6c458e89f889d6"} Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.188075 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nkbq4"] Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.196315 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf8tc\" (UniqueName: \"kubernetes.io/projected/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-kube-api-access-qf8tc\") pod \"redhat-marketplace-6qgm2\" (UID: \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\") " pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.196364 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-utilities\") pod \"redhat-marketplace-6qgm2\" (UID: \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\") " pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.196479 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-catalog-content\") pod \"redhat-marketplace-6qgm2\" (UID: \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\") " pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.206688 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-5gdpq" podStartSLOduration=12.206670899 podStartE2EDuration="12.206670899s" podCreationTimestamp="2026-03-18 09:06:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:39.200983308 +0000 UTC m=+265.775728138" watchObservedRunningTime="2026-03-18 09:06:39.206670899 +0000 UTC m=+265.781415729" Mar 18 09:06:39 crc kubenswrapper[4778]: W0318 09:06:39.223113 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2faf8fa8_d474_4c7d_8566_8abc58d7d5ad.slice/crio-c662bfec2aa4a3db7809425b04cf847a1727f28e0c070e4d7298a20004e2533f WatchSource:0}: Error finding container c662bfec2aa4a3db7809425b04cf847a1727f28e0c070e4d7298a20004e2533f: Status 404 returned error can't find the container with id c662bfec2aa4a3db7809425b04cf847a1727f28e0c070e4d7298a20004e2533f Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.248072 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.258958 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:39 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:39 crc kubenswrapper[4778]: 
[+]process-running ok Mar 18 09:06:39 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.259026 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.300150 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf8tc\" (UniqueName: \"kubernetes.io/projected/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-kube-api-access-qf8tc\") pod \"redhat-marketplace-6qgm2\" (UID: \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\") " pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.300242 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-utilities\") pod \"redhat-marketplace-6qgm2\" (UID: \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\") " pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.300419 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-catalog-content\") pod \"redhat-marketplace-6qgm2\" (UID: \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\") " pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.300736 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-utilities\") pod \"redhat-marketplace-6qgm2\" (UID: \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\") " pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:06:39 crc kubenswrapper[4778]: 
I0318 09:06:39.300856 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-catalog-content\") pod \"redhat-marketplace-6qgm2\" (UID: \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\") " pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.343492 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf8tc\" (UniqueName: \"kubernetes.io/projected/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-kube-api-access-qf8tc\") pod \"redhat-marketplace-6qgm2\" (UID: \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\") " pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.375122 4778 ???:1] "http: TLS handshake error from 192.168.126.11:51792: no serving certificate available for the kubelet" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.401695 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.473382 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8bc989ddd-wh99s"] Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.480713 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rscg9"] Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.481977 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:06:39 crc kubenswrapper[4778]: W0318 09:06:39.495504 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2110837_0c54_448f_8b94_68bdea470d14.slice/crio-1f4f69aa812ccf6e175392ab8a7bd2a35678566f0ef2ecd7701be29d6dc1d5ce WatchSource:0}: Error finding container 1f4f69aa812ccf6e175392ab8a7bd2a35678566f0ef2ecd7701be29d6dc1d5ce: Status 404 returned error can't find the container with id 1f4f69aa812ccf6e175392ab8a7bd2a35678566f0ef2ecd7701be29d6dc1d5ce Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.497003 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rscg9"] Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.506792 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnfhp\" (UniqueName: \"kubernetes.io/projected/3aedbf59-d23d-409e-9742-09824ed6ef2a-kube-api-access-gnfhp\") pod \"redhat-marketplace-rscg9\" (UID: \"3aedbf59-d23d-409e-9742-09824ed6ef2a\") " pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.506881 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aedbf59-d23d-409e-9742-09824ed6ef2a-catalog-content\") pod \"redhat-marketplace-rscg9\" (UID: \"3aedbf59-d23d-409e-9742-09824ed6ef2a\") " pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.506923 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aedbf59-d23d-409e-9742-09824ed6ef2a-utilities\") pod \"redhat-marketplace-rscg9\" (UID: \"3aedbf59-d23d-409e-9742-09824ed6ef2a\") " 
pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.607903 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aedbf59-d23d-409e-9742-09824ed6ef2a-catalog-content\") pod \"redhat-marketplace-rscg9\" (UID: \"3aedbf59-d23d-409e-9742-09824ed6ef2a\") " pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.607983 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aedbf59-d23d-409e-9742-09824ed6ef2a-utilities\") pod \"redhat-marketplace-rscg9\" (UID: \"3aedbf59-d23d-409e-9742-09824ed6ef2a\") " pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.608033 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnfhp\" (UniqueName: \"kubernetes.io/projected/3aedbf59-d23d-409e-9742-09824ed6ef2a-kube-api-access-gnfhp\") pod \"redhat-marketplace-rscg9\" (UID: \"3aedbf59-d23d-409e-9742-09824ed6ef2a\") " pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.609105 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aedbf59-d23d-409e-9742-09824ed6ef2a-catalog-content\") pod \"redhat-marketplace-rscg9\" (UID: \"3aedbf59-d23d-409e-9742-09824ed6ef2a\") " pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.609354 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aedbf59-d23d-409e-9742-09824ed6ef2a-utilities\") pod \"redhat-marketplace-rscg9\" (UID: \"3aedbf59-d23d-409e-9742-09824ed6ef2a\") " pod="openshift-marketplace/redhat-marketplace-rscg9" 
Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.628812 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnfhp\" (UniqueName: \"kubernetes.io/projected/3aedbf59-d23d-409e-9742-09824ed6ef2a-kube-api-access-gnfhp\") pod \"redhat-marketplace-rscg9\" (UID: \"3aedbf59-d23d-409e-9742-09824ed6ef2a\") " pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.777822 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.778532 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-pgsqh" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.782615 4778 patch_prober.go:28] interesting pod/console-f9d7485db-pgsqh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.5:8443/health\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.782698 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-pgsqh" podUID="5f875d21-ddf2-4d41-8be3-819c8836824a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.5:8443/health\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.819845 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.855601 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6qgm2"] Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.857606 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.858989 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:39 crc kubenswrapper[4778]: W0318 09:06:39.865293 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57f9c6f6_c20e_4e28_aec4_f0104ddb2b47.slice/crio-a8ae2f2fdacacfb271851ed06f03f49aedc9a4cd93513577d22ad1180d7345ef WatchSource:0}: Error finding container a8ae2f2fdacacfb271851ed06f03f49aedc9a4cd93513577d22ad1180d7345ef: Status 404 returned error can't find the container with id a8ae2f2fdacacfb271851ed06f03f49aedc9a4cd93513577d22ad1180d7345ef Mar 18 09:06:39 crc kubenswrapper[4778]: I0318 09:06:39.872626 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.087495 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6kvnk"] Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.089045 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.101239 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.103230 4778 patch_prober.go:28] interesting pod/downloads-7954f5f757-tnw27 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.103266 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tnw27" podUID="08b964cf-bfc5-4b90-83a3-0b358c3ffbc9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.103669 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6kvnk"] Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.107093 4778 patch_prober.go:28] interesting pod/downloads-7954f5f757-tnw27 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.107148 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-tnw27" podUID="08b964cf-bfc5-4b90-83a3-0b358c3ffbc9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.123610 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fp6w4\" (UniqueName: \"kubernetes.io/projected/938982a6-57b0-4870-abed-a98c42196ae6-kube-api-access-fp6w4\") pod \"redhat-operators-6kvnk\" (UID: \"938982a6-57b0-4870-abed-a98c42196ae6\") " pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.123703 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/938982a6-57b0-4870-abed-a98c42196ae6-catalog-content\") pod \"redhat-operators-6kvnk\" (UID: \"938982a6-57b0-4870-abed-a98c42196ae6\") " pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.123744 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/938982a6-57b0-4870-abed-a98c42196ae6-utilities\") pod \"redhat-operators-6kvnk\" (UID: \"938982a6-57b0-4870-abed-a98c42196ae6\") " pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.153043 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rscg9"] Mar 18 09:06:40 crc kubenswrapper[4778]: W0318 09:06:40.172116 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aedbf59_d23d_409e_9742_09824ed6ef2a.slice/crio-535b754da1c090611910ad3734bf58ec9a371b36592f999639d434308dca9ac6 WatchSource:0}: Error finding container 535b754da1c090611910ad3734bf58ec9a371b36592f999639d434308dca9ac6: Status 404 returned error can't find the container with id 535b754da1c090611910ad3734bf58ec9a371b36592f999639d434308dca9ac6 Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.187375 4778 generic.go:334] "Generic (PLEG): container finished" podID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" 
containerID="113dc27ffd2ebd355aaf8e22c8a148444f799a56c796af33fbc9fe643673da94" exitCode=0 Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.227409 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/938982a6-57b0-4870-abed-a98c42196ae6-utilities\") pod \"redhat-operators-6kvnk\" (UID: \"938982a6-57b0-4870-abed-a98c42196ae6\") " pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.227493 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp6w4\" (UniqueName: \"kubernetes.io/projected/938982a6-57b0-4870-abed-a98c42196ae6-kube-api-access-fp6w4\") pod \"redhat-operators-6kvnk\" (UID: \"938982a6-57b0-4870-abed-a98c42196ae6\") " pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.227580 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/938982a6-57b0-4870-abed-a98c42196ae6-catalog-content\") pod \"redhat-operators-6kvnk\" (UID: \"938982a6-57b0-4870-abed-a98c42196ae6\") " pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.228175 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/938982a6-57b0-4870-abed-a98c42196ae6-catalog-content\") pod \"redhat-operators-6kvnk\" (UID: \"938982a6-57b0-4870-abed-a98c42196ae6\") " pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.228727 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/938982a6-57b0-4870-abed-a98c42196ae6-utilities\") pod \"redhat-operators-6kvnk\" (UID: \"938982a6-57b0-4870-abed-a98c42196ae6\") " 
pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.234701 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.247398 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qgm2" event={"ID":"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47","Type":"ContainerDied","Data":"113dc27ffd2ebd355aaf8e22c8a148444f799a56c796af33fbc9fe643673da94"} Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.247443 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qgm2" event={"ID":"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47","Type":"ContainerStarted","Data":"a8ae2f2fdacacfb271851ed06f03f49aedc9a4cd93513577d22ad1180d7345ef"} Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.247459 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" event={"ID":"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad","Type":"ContainerStarted","Data":"a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b"} Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.247477 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" event={"ID":"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad","Type":"ContainerStarted","Data":"c662bfec2aa4a3db7809425b04cf847a1727f28e0c070e4d7298a20004e2533f"} Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.247494 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.257109 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 
09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.263798 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp6w4\" (UniqueName: \"kubernetes.io/projected/938982a6-57b0-4870-abed-a98c42196ae6-kube-api-access-fp6w4\") pod \"redhat-operators-6kvnk\" (UID: \"938982a6-57b0-4870-abed-a98c42196ae6\") " pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.291512 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:40 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:40 crc kubenswrapper[4778]: [+]process-running ok Mar 18 09:06:40 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.292036 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.302387 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" event={"ID":"c2110837-0c54-448f-8b94-68bdea470d14","Type":"ContainerStarted","Data":"f7325e2f133bd8a9b84c58690841dede1fe1ff86d10cd0f0298df6312d40064a"} Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.303014 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.303048 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" 
event={"ID":"c2110837-0c54-448f-8b94-68bdea470d14","Type":"ContainerStarted","Data":"1f4f69aa812ccf6e175392ab8a7bd2a35678566f0ef2ecd7701be29d6dc1d5ce"} Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.312495 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" podStartSLOduration=196.312473399 podStartE2EDuration="3m16.312473399s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:40.301662243 +0000 UTC m=+266.876407083" watchObservedRunningTime="2026-03-18 09:06:40.312473399 +0000 UTC m=+266.887218239" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.330409 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-ckp9s" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.346507 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.377435 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" podStartSLOduration=3.377412639 podStartE2EDuration="3.377412639s" podCreationTimestamp="2026-03-18 09:06:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:40.351555717 +0000 UTC m=+266.926300557" watchObservedRunningTime="2026-03-18 09:06:40.377412639 +0000 UTC m=+266.952157479" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.479161 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6kvnk" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.515278 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.516553 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.521001 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.538501 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.538854 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.574301 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t74gc"] Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.575707 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.588743 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t74gc"] Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.591976 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.592859 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.614182 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.614713 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.616240 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.641971 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-catalog-content\") pod \"redhat-operators-t74gc\" (UID: \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\") " pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.642056 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-utilities\") pod \"redhat-operators-t74gc\" (UID: \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\") " pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.642083 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7a7245b-196d-4cea-916b-858e30dcc936-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a7a7245b-196d-4cea-916b-858e30dcc936\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.647801 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-dlv42\" (UniqueName: \"kubernetes.io/projected/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-kube-api-access-dlv42\") pod \"redhat-operators-t74gc\" (UID: \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\") " pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.648093 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7a7245b-196d-4cea-916b-858e30dcc936-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a7a7245b-196d-4cea-916b-858e30dcc936\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.750019 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60b8330d-375e-49a4-948b-0aaad227e09e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"60b8330d-375e-49a4-948b-0aaad227e09e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.750499 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7a7245b-196d-4cea-916b-858e30dcc936-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a7a7245b-196d-4cea-916b-858e30dcc936\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.750532 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60b8330d-375e-49a4-948b-0aaad227e09e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"60b8330d-375e-49a4-948b-0aaad227e09e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.750567 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-catalog-content\") pod \"redhat-operators-t74gc\" (UID: \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\") " pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.750624 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-utilities\") pod \"redhat-operators-t74gc\" (UID: \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\") " pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.750664 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7a7245b-196d-4cea-916b-858e30dcc936-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a7a7245b-196d-4cea-916b-858e30dcc936\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.750722 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlv42\" (UniqueName: \"kubernetes.io/projected/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-kube-api-access-dlv42\") pod \"redhat-operators-t74gc\" (UID: \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\") " pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.751068 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7a7245b-196d-4cea-916b-858e30dcc936-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a7a7245b-196d-4cea-916b-858e30dcc936\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.751899 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-catalog-content\") pod \"redhat-operators-t74gc\" (UID: \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\") " pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.751988 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-utilities\") pod \"redhat-operators-t74gc\" (UID: \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\") " pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.777848 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7a7245b-196d-4cea-916b-858e30dcc936-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a7a7245b-196d-4cea-916b-858e30dcc936\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.779292 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlv42\" (UniqueName: \"kubernetes.io/projected/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-kube-api-access-dlv42\") pod \"redhat-operators-t74gc\" (UID: \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\") " pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.853624 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60b8330d-375e-49a4-948b-0aaad227e09e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"60b8330d-375e-49a4-948b-0aaad227e09e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.853686 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/60b8330d-375e-49a4-948b-0aaad227e09e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"60b8330d-375e-49a4-948b-0aaad227e09e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.854013 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60b8330d-375e-49a4-948b-0aaad227e09e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"60b8330d-375e-49a4-948b-0aaad227e09e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.908841 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60b8330d-375e-49a4-948b-0aaad227e09e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"60b8330d-375e-49a4-948b-0aaad227e09e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.922392 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 09:06:40 crc kubenswrapper[4778]: I0318 09:06:40.974556 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t74gc" Mar 18 09:06:41 crc kubenswrapper[4778]: I0318 09:06:41.025468 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 09:06:41 crc kubenswrapper[4778]: I0318 09:06:41.063575 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6kvnk"] Mar 18 09:06:41 crc kubenswrapper[4778]: W0318 09:06:41.211626 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod938982a6_57b0_4870_abed_a98c42196ae6.slice/crio-40f7240695ce3bda0e1e4c2a6151b9a381bd8173f92f88c530036fbcfc0a002f WatchSource:0}: Error finding container 40f7240695ce3bda0e1e4c2a6151b9a381bd8173f92f88c530036fbcfc0a002f: Status 404 returned error can't find the container with id 40f7240695ce3bda0e1e4c2a6151b9a381bd8173f92f88c530036fbcfc0a002f Mar 18 09:06:41 crc kubenswrapper[4778]: I0318 09:06:41.263031 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:41 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:41 crc kubenswrapper[4778]: [+]process-running ok Mar 18 09:06:41 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:41 crc kubenswrapper[4778]: I0318 09:06:41.263102 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:41 crc kubenswrapper[4778]: I0318 09:06:41.324007 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kvnk" event={"ID":"938982a6-57b0-4870-abed-a98c42196ae6","Type":"ContainerStarted","Data":"40f7240695ce3bda0e1e4c2a6151b9a381bd8173f92f88c530036fbcfc0a002f"} Mar 18 09:06:41 crc kubenswrapper[4778]: I0318 
09:06:41.345048 4778 generic.go:334] "Generic (PLEG): container finished" podID="3aedbf59-d23d-409e-9742-09824ed6ef2a" containerID="1220cfd301077b03f3786adecfe3bf904ed326e9e0abd8d5eb040fe67007ec58" exitCode=0 Mar 18 09:06:41 crc kubenswrapper[4778]: I0318 09:06:41.346701 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rscg9" event={"ID":"3aedbf59-d23d-409e-9742-09824ed6ef2a","Type":"ContainerDied","Data":"1220cfd301077b03f3786adecfe3bf904ed326e9e0abd8d5eb040fe67007ec58"} Mar 18 09:06:41 crc kubenswrapper[4778]: I0318 09:06:41.346749 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rscg9" event={"ID":"3aedbf59-d23d-409e-9742-09824ed6ef2a","Type":"ContainerStarted","Data":"535b754da1c090611910ad3734bf58ec9a371b36592f999639d434308dca9ac6"} Mar 18 09:06:41 crc kubenswrapper[4778]: I0318 09:06:41.570429 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 09:06:41 crc kubenswrapper[4778]: W0318 09:06:41.688932 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda7a7245b_196d_4cea_916b_858e30dcc936.slice/crio-7ea6f7a6e8b0a8ff5b75d5e6b844aba55abc48251e304d4a97f397a66a9b6cb3 WatchSource:0}: Error finding container 7ea6f7a6e8b0a8ff5b75d5e6b844aba55abc48251e304d4a97f397a66a9b6cb3: Status 404 returned error can't find the container with id 7ea6f7a6e8b0a8ff5b75d5e6b844aba55abc48251e304d4a97f397a66a9b6cb3 Mar 18 09:06:41 crc kubenswrapper[4778]: I0318 09:06:41.726689 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t74gc"] Mar 18 09:06:41 crc kubenswrapper[4778]: W0318 09:06:41.764165 4778 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01828fdf_ef1b_44e3_905b_aec0c6aaa44f.slice/crio-3806b4b94100c7a5155593ac6313aa0ecd333d565872aab11d373225ceac8468 WatchSource:0}: Error finding container 3806b4b94100c7a5155593ac6313aa0ecd333d565872aab11d373225ceac8468: Status 404 returned error can't find the container with id 3806b4b94100c7a5155593ac6313aa0ecd333d565872aab11d373225ceac8468 Mar 18 09:06:41 crc kubenswrapper[4778]: I0318 09:06:41.831587 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 09:06:42 crc kubenswrapper[4778]: I0318 09:06:42.266175 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:42 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:42 crc kubenswrapper[4778]: [+]process-running ok Mar 18 09:06:42 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:42 crc kubenswrapper[4778]: I0318 09:06:42.266560 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:42 crc kubenswrapper[4778]: I0318 09:06:42.383027 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"60b8330d-375e-49a4-948b-0aaad227e09e","Type":"ContainerStarted","Data":"5a32dc2a00f35187349fa55da1e46dd7d020a4f5a521c2ae089869b21c0783e4"} Mar 18 09:06:42 crc kubenswrapper[4778]: I0318 09:06:42.426755 4778 generic.go:334] "Generic (PLEG): container finished" podID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" containerID="041c9f6647a0a4b33225ec4f168858c1a6b398621642bc3399fdfc7c98ca3118" exitCode=0 Mar 18 
09:06:42 crc kubenswrapper[4778]: I0318 09:06:42.426894 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t74gc" event={"ID":"01828fdf-ef1b-44e3-905b-aec0c6aaa44f","Type":"ContainerDied","Data":"041c9f6647a0a4b33225ec4f168858c1a6b398621642bc3399fdfc7c98ca3118"} Mar 18 09:06:42 crc kubenswrapper[4778]: I0318 09:06:42.426926 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t74gc" event={"ID":"01828fdf-ef1b-44e3-905b-aec0c6aaa44f","Type":"ContainerStarted","Data":"3806b4b94100c7a5155593ac6313aa0ecd333d565872aab11d373225ceac8468"} Mar 18 09:06:42 crc kubenswrapper[4778]: I0318 09:06:42.481179 4778 generic.go:334] "Generic (PLEG): container finished" podID="938982a6-57b0-4870-abed-a98c42196ae6" containerID="8977456d128ab832e4d2b65a1ebbe275173e48c92b3849c579d8a9cc853d0ce8" exitCode=0 Mar 18 09:06:42 crc kubenswrapper[4778]: I0318 09:06:42.481871 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kvnk" event={"ID":"938982a6-57b0-4870-abed-a98c42196ae6","Type":"ContainerDied","Data":"8977456d128ab832e4d2b65a1ebbe275173e48c92b3849c579d8a9cc853d0ce8"} Mar 18 09:06:42 crc kubenswrapper[4778]: I0318 09:06:42.494781 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a7a7245b-196d-4cea-916b-858e30dcc936","Type":"ContainerStarted","Data":"7ea6f7a6e8b0a8ff5b75d5e6b844aba55abc48251e304d4a97f397a66a9b6cb3"} Mar 18 09:06:43 crc kubenswrapper[4778]: I0318 09:06:43.271179 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:43 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:43 crc kubenswrapper[4778]: [+]process-running ok Mar 18 
09:06:43 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:43 crc kubenswrapper[4778]: I0318 09:06:43.271248 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:43 crc kubenswrapper[4778]: I0318 09:06:43.515025 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a7a7245b-196d-4cea-916b-858e30dcc936","Type":"ContainerStarted","Data":"92309743aa161b0e0c5404c8814af46d06b54c2a29edec78dc603167720d6d87"} Mar 18 09:06:43 crc kubenswrapper[4778]: I0318 09:06:43.520563 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"60b8330d-375e-49a4-948b-0aaad227e09e","Type":"ContainerStarted","Data":"5dc6403b3a8eff2a47e98ebf150f40bbe6b37ddfccac40d625f7e4f878779c41"} Mar 18 09:06:43 crc kubenswrapper[4778]: I0318 09:06:43.555453 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.555433888 podStartE2EDuration="3.555433888s" podCreationTimestamp="2026-03-18 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:43.538090495 +0000 UTC m=+270.112835355" watchObservedRunningTime="2026-03-18 09:06:43.555433888 +0000 UTC m=+270.130178728" Mar 18 09:06:43 crc kubenswrapper[4778]: I0318 09:06:43.556467 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.556459636 podStartE2EDuration="3.556459636s" podCreationTimestamp="2026-03-18 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:06:43.55342159 +0000 UTC m=+270.128166430" watchObservedRunningTime="2026-03-18 09:06:43.556459636 +0000 UTC m=+270.131204466" Mar 18 09:06:44 crc kubenswrapper[4778]: I0318 09:06:44.258150 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:44 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:44 crc kubenswrapper[4778]: [+]process-running ok Mar 18 09:06:44 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:44 crc kubenswrapper[4778]: I0318 09:06:44.258772 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:44 crc kubenswrapper[4778]: I0318 09:06:44.531097 4778 ???:1] "http: TLS handshake error from 192.168.126.11:54362: no serving certificate available for the kubelet" Mar 18 09:06:44 crc kubenswrapper[4778]: I0318 09:06:44.545067 4778 generic.go:334] "Generic (PLEG): container finished" podID="60b8330d-375e-49a4-948b-0aaad227e09e" containerID="5dc6403b3a8eff2a47e98ebf150f40bbe6b37ddfccac40d625f7e4f878779c41" exitCode=0 Mar 18 09:06:44 crc kubenswrapper[4778]: I0318 09:06:44.545153 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"60b8330d-375e-49a4-948b-0aaad227e09e","Type":"ContainerDied","Data":"5dc6403b3a8eff2a47e98ebf150f40bbe6b37ddfccac40d625f7e4f878779c41"} Mar 18 09:06:44 crc kubenswrapper[4778]: I0318 09:06:44.558963 4778 generic.go:334] "Generic (PLEG): container finished" podID="a7a7245b-196d-4cea-916b-858e30dcc936" 
containerID="92309743aa161b0e0c5404c8814af46d06b54c2a29edec78dc603167720d6d87" exitCode=0 Mar 18 09:06:44 crc kubenswrapper[4778]: I0318 09:06:44.559054 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a7a7245b-196d-4cea-916b-858e30dcc936","Type":"ContainerDied","Data":"92309743aa161b0e0c5404c8814af46d06b54c2a29edec78dc603167720d6d87"} Mar 18 09:06:45 crc kubenswrapper[4778]: I0318 09:06:45.256102 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:45 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:45 crc kubenswrapper[4778]: [+]process-running ok Mar 18 09:06:45 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:45 crc kubenswrapper[4778]: I0318 09:06:45.256169 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:45 crc kubenswrapper[4778]: I0318 09:06:45.860218 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wgbcp" Mar 18 09:06:46 crc kubenswrapper[4778]: I0318 09:06:46.256122 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:46 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:46 crc kubenswrapper[4778]: [+]process-running ok Mar 18 09:06:46 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:46 crc kubenswrapper[4778]: I0318 
09:06:46.256181 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:47 crc kubenswrapper[4778]: I0318 09:06:47.257097 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:47 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:47 crc kubenswrapper[4778]: [+]process-running ok Mar 18 09:06:47 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:47 crc kubenswrapper[4778]: I0318 09:06:47.257151 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:47 crc kubenswrapper[4778]: I0318 09:06:47.451243 4778 ???:1] "http: TLS handshake error from 192.168.126.11:54372: no serving certificate available for the kubelet" Mar 18 09:06:48 crc kubenswrapper[4778]: I0318 09:06:48.255890 4778 patch_prober.go:28] interesting pod/router-default-5444994796-nnfvg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 09:06:48 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Mar 18 09:06:48 crc kubenswrapper[4778]: [+]process-running ok Mar 18 09:06:48 crc kubenswrapper[4778]: healthz check failed Mar 18 09:06:48 crc kubenswrapper[4778]: I0318 09:06:48.256187 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nnfvg" 
podUID="9b31b04d-28d1-4397-88b3-b26a4bb6ede9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 09:06:49 crc kubenswrapper[4778]: I0318 09:06:49.256390 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:49 crc kubenswrapper[4778]: I0318 09:06:49.260909 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-nnfvg" Mar 18 09:06:49 crc kubenswrapper[4778]: I0318 09:06:49.778497 4778 patch_prober.go:28] interesting pod/console-f9d7485db-pgsqh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.5:8443/health\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 18 09:06:49 crc kubenswrapper[4778]: I0318 09:06:49.778587 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-pgsqh" podUID="5f875d21-ddf2-4d41-8be3-819c8836824a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.5:8443/health\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 18 09:06:50 crc kubenswrapper[4778]: I0318 09:06:50.103037 4778 patch_prober.go:28] interesting pod/downloads-7954f5f757-tnw27 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 18 09:06:50 crc kubenswrapper[4778]: I0318 09:06:50.103607 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tnw27" podUID="08b964cf-bfc5-4b90-83a3-0b358c3ffbc9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 18 09:06:50 crc kubenswrapper[4778]: I0318 09:06:50.103470 4778 patch_prober.go:28] 
interesting pod/downloads-7954f5f757-tnw27 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body=
Mar 18 09:06:50 crc kubenswrapper[4778]: I0318 09:06:50.103726    4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-tnw27" podUID="08b964cf-bfc5-4b90-83a3-0b358c3ffbc9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused"
Mar 18 09:06:54 crc kubenswrapper[4778]: I0318 09:06:54.796846    4778 ???:1] "http: TLS handshake error from 192.168.126.11:58340: no serving certificate available for the kubelet"
Mar 18 09:06:55 crc kubenswrapper[4778]: I0318 09:06:55.655992    4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8bc989ddd-wh99s"]
Mar 18 09:06:55 crc kubenswrapper[4778]: I0318 09:06:55.656256    4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" podUID="c2110837-0c54-448f-8b94-68bdea470d14" containerName="controller-manager" containerID="cri-o://f7325e2f133bd8a9b84c58690841dede1fe1ff86d10cd0f0298df6312d40064a" gracePeriod=30
Mar 18 09:06:55 crc kubenswrapper[4778]: I0318 09:06:55.671040    4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx"]
Mar 18 09:06:55 crc kubenswrapper[4778]: I0318 09:06:55.671355    4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" podUID="b379a820-627d-403c-b50b-b6fbea94aa65" containerName="route-controller-manager" containerID="cri-o://c52ef847baaab8c966dfa357e5150a1446e9ecf64dd0cf070d104b1c9769de57" gracePeriod=30
Mar 18 09:06:56 crc kubenswrapper[4778]: I0318 09:06:56.766809    4778 generic.go:334] "Generic (PLEG): container finished" podID="c2110837-0c54-448f-8b94-68bdea470d14" containerID="f7325e2f133bd8a9b84c58690841dede1fe1ff86d10cd0f0298df6312d40064a" exitCode=0
Mar 18 09:06:56 crc kubenswrapper[4778]: I0318 09:06:56.766962    4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" event={"ID":"c2110837-0c54-448f-8b94-68bdea470d14","Type":"ContainerDied","Data":"f7325e2f133bd8a9b84c58690841dede1fe1ff86d10cd0f0298df6312d40064a"}
Mar 18 09:06:58 crc kubenswrapper[4778]: I0318 09:06:58.150269    4778 patch_prober.go:28] interesting pod/route-controller-manager-67677f775c-zxrmx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" start-of-body=
Mar 18 09:06:58 crc kubenswrapper[4778]: I0318 09:06:58.150385    4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" podUID="b379a820-627d-403c-b50b-b6fbea94aa65" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused"
Mar 18 09:06:58 crc kubenswrapper[4778]: I0318 09:06:58.889431    4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4"
Mar 18 09:06:59 crc kubenswrapper[4778]: I0318 09:06:59.163727    4778 patch_prober.go:28] interesting pod/controller-manager-8bc989ddd-wh99s container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body=
Mar 18 09:06:59 crc kubenswrapper[4778]: I0318 09:06:59.163796    4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" podUID="c2110837-0c54-448f-8b94-68bdea470d14" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused"
Mar 18 09:06:59 crc kubenswrapper[4778]: I0318 09:06:59.790038    4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-pgsqh"
Mar 18 09:06:59 crc kubenswrapper[4778]: I0318 09:06:59.799184    4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-pgsqh"
Mar 18 09:07:00 crc kubenswrapper[4778]: I0318 09:07:00.118018    4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-tnw27"
Mar 18 09:07:00 crc kubenswrapper[4778]: I0318 09:07:00.147798    4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 09:07:00 crc kubenswrapper[4778]: I0318 09:07:00.147946    4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 09:07:01 crc kubenswrapper[4778]: I0318 09:07:01.044382    4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 18 09:07:01 crc kubenswrapper[4778]: I0318 09:07:01.095524    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7a7245b-196d-4cea-916b-858e30dcc936-kubelet-dir\") pod \"a7a7245b-196d-4cea-916b-858e30dcc936\" (UID: \"a7a7245b-196d-4cea-916b-858e30dcc936\") "
Mar 18 09:07:01 crc kubenswrapper[4778]: I0318 09:07:01.095653    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7a7245b-196d-4cea-916b-858e30dcc936-kube-api-access\") pod \"a7a7245b-196d-4cea-916b-858e30dcc936\" (UID: \"a7a7245b-196d-4cea-916b-858e30dcc936\") "
Mar 18 09:07:01 crc kubenswrapper[4778]: I0318 09:07:01.095773    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7a7245b-196d-4cea-916b-858e30dcc936-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a7a7245b-196d-4cea-916b-858e30dcc936" (UID: "a7a7245b-196d-4cea-916b-858e30dcc936"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 09:07:01 crc kubenswrapper[4778]: I0318 09:07:01.096245    4778 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7a7245b-196d-4cea-916b-858e30dcc936-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 18 09:07:01 crc kubenswrapper[4778]: I0318 09:07:01.103948    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7a7245b-196d-4cea-916b-858e30dcc936-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a7a7245b-196d-4cea-916b-858e30dcc936" (UID: "a7a7245b-196d-4cea-916b-858e30dcc936"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:07:01 crc kubenswrapper[4778]: I0318 09:07:01.197266    4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7a7245b-196d-4cea-916b-858e30dcc936-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 18 09:07:01 crc kubenswrapper[4778]: I0318 09:07:01.802331    4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 18 09:07:01 crc kubenswrapper[4778]: I0318 09:07:01.802376    4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a7a7245b-196d-4cea-916b-858e30dcc936","Type":"ContainerDied","Data":"7ea6f7a6e8b0a8ff5b75d5e6b844aba55abc48251e304d4a97f397a66a9b6cb3"}
Mar 18 09:07:01 crc kubenswrapper[4778]: I0318 09:07:01.802832    4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ea6f7a6e8b0a8ff5b75d5e6b844aba55abc48251e304d4a97f397a66a9b6cb3"
Mar 18 09:07:01 crc kubenswrapper[4778]: I0318 09:07:01.805980    4778 generic.go:334] "Generic (PLEG): container finished" podID="b379a820-627d-403c-b50b-b6fbea94aa65" containerID="c52ef847baaab8c966dfa357e5150a1446e9ecf64dd0cf070d104b1c9769de57" exitCode=0
Mar 18 09:07:01 crc kubenswrapper[4778]: I0318 09:07:01.806028    4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" event={"ID":"b379a820-627d-403c-b50b-b6fbea94aa65","Type":"ContainerDied","Data":"c52ef847baaab8c966dfa357e5150a1446e9ecf64dd0cf070d104b1c9769de57"}
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.112608    4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.239093    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60b8330d-375e-49a4-948b-0aaad227e09e-kubelet-dir\") pod \"60b8330d-375e-49a4-948b-0aaad227e09e\" (UID: \"60b8330d-375e-49a4-948b-0aaad227e09e\") "
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.239237    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60b8330d-375e-49a4-948b-0aaad227e09e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "60b8330d-375e-49a4-948b-0aaad227e09e" (UID: "60b8330d-375e-49a4-948b-0aaad227e09e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.239249    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60b8330d-375e-49a4-948b-0aaad227e09e-kube-api-access\") pod \"60b8330d-375e-49a4-948b-0aaad227e09e\" (UID: \"60b8330d-375e-49a4-948b-0aaad227e09e\") "
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.239743    4778 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60b8330d-375e-49a4-948b-0aaad227e09e-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.251536    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60b8330d-375e-49a4-948b-0aaad227e09e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "60b8330d-375e-49a4-948b-0aaad227e09e" (UID: "60b8330d-375e-49a4-948b-0aaad227e09e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.341816    4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60b8330d-375e-49a4-948b-0aaad227e09e-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.497450    4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx"
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.506725    4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s"
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.544038    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb5z6\" (UniqueName: \"kubernetes.io/projected/c2110837-0c54-448f-8b94-68bdea470d14-kube-api-access-xb5z6\") pod \"c2110837-0c54-448f-8b94-68bdea470d14\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") "
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.544101    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d5kq\" (UniqueName: \"kubernetes.io/projected/b379a820-627d-403c-b50b-b6fbea94aa65-kube-api-access-6d5kq\") pod \"b379a820-627d-403c-b50b-b6fbea94aa65\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") "
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.544315    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b379a820-627d-403c-b50b-b6fbea94aa65-serving-cert\") pod \"b379a820-627d-403c-b50b-b6fbea94aa65\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") "
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.544357    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b379a820-627d-403c-b50b-b6fbea94aa65-client-ca\") pod \"b379a820-627d-403c-b50b-b6fbea94aa65\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") "
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.544402    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2110837-0c54-448f-8b94-68bdea470d14-serving-cert\") pod \"c2110837-0c54-448f-8b94-68bdea470d14\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") "
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.544428    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-proxy-ca-bundles\") pod \"c2110837-0c54-448f-8b94-68bdea470d14\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") "
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.544589    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-config\") pod \"c2110837-0c54-448f-8b94-68bdea470d14\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") "
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.544620    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b379a820-627d-403c-b50b-b6fbea94aa65-config\") pod \"b379a820-627d-403c-b50b-b6fbea94aa65\" (UID: \"b379a820-627d-403c-b50b-b6fbea94aa65\") "
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.544650    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-client-ca\") pod \"c2110837-0c54-448f-8b94-68bdea470d14\" (UID: \"c2110837-0c54-448f-8b94-68bdea470d14\") "
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.546948    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-client-ca" (OuterVolumeSpecName: "client-ca") pod "c2110837-0c54-448f-8b94-68bdea470d14" (UID: "c2110837-0c54-448f-8b94-68bdea470d14"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.547188    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b379a820-627d-403c-b50b-b6fbea94aa65-config" (OuterVolumeSpecName: "config") pod "b379a820-627d-403c-b50b-b6fbea94aa65" (UID: "b379a820-627d-403c-b50b-b6fbea94aa65"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.547675    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c2110837-0c54-448f-8b94-68bdea470d14" (UID: "c2110837-0c54-448f-8b94-68bdea470d14"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.547700    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-config" (OuterVolumeSpecName: "config") pod "c2110837-0c54-448f-8b94-68bdea470d14" (UID: "c2110837-0c54-448f-8b94-68bdea470d14"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.547705    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b379a820-627d-403c-b50b-b6fbea94aa65-client-ca" (OuterVolumeSpecName: "client-ca") pod "b379a820-627d-403c-b50b-b6fbea94aa65" (UID: "b379a820-627d-403c-b50b-b6fbea94aa65"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.549622    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b379a820-627d-403c-b50b-b6fbea94aa65-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b379a820-627d-403c-b50b-b6fbea94aa65" (UID: "b379a820-627d-403c-b50b-b6fbea94aa65"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.549658    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2110837-0c54-448f-8b94-68bdea470d14-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c2110837-0c54-448f-8b94-68bdea470d14" (UID: "c2110837-0c54-448f-8b94-68bdea470d14"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.550788    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b379a820-627d-403c-b50b-b6fbea94aa65-kube-api-access-6d5kq" (OuterVolumeSpecName: "kube-api-access-6d5kq") pod "b379a820-627d-403c-b50b-b6fbea94aa65" (UID: "b379a820-627d-403c-b50b-b6fbea94aa65"). InnerVolumeSpecName "kube-api-access-6d5kq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.552365    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2110837-0c54-448f-8b94-68bdea470d14-kube-api-access-xb5z6" (OuterVolumeSpecName: "kube-api-access-xb5z6") pod "c2110837-0c54-448f-8b94-68bdea470d14" (UID: "c2110837-0c54-448f-8b94-68bdea470d14"). InnerVolumeSpecName "kube-api-access-xb5z6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.646914    4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb5z6\" (UniqueName: \"kubernetes.io/projected/c2110837-0c54-448f-8b94-68bdea470d14-kube-api-access-xb5z6\") on node \"crc\" DevicePath \"\""
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.646963    4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d5kq\" (UniqueName: \"kubernetes.io/projected/b379a820-627d-403c-b50b-b6fbea94aa65-kube-api-access-6d5kq\") on node \"crc\" DevicePath \"\""
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.646978    4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b379a820-627d-403c-b50b-b6fbea94aa65-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.646992    4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b379a820-627d-403c-b50b-b6fbea94aa65-client-ca\") on node \"crc\" DevicePath \"\""
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.647008    4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2110837-0c54-448f-8b94-68bdea470d14-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.647024    4778 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.647037    4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-config\") on node \"crc\" DevicePath \"\""
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.647050    4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b379a820-627d-403c-b50b-b6fbea94aa65-config\") on node \"crc\" DevicePath \"\""
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.647061    4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2110837-0c54-448f-8b94-68bdea470d14-client-ca\") on node \"crc\" DevicePath \"\""
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.826015    4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s" event={"ID":"c2110837-0c54-448f-8b94-68bdea470d14","Type":"ContainerDied","Data":"1f4f69aa812ccf6e175392ab8a7bd2a35678566f0ef2ecd7701be29d6dc1d5ce"}
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.826073    4778 scope.go:117] "RemoveContainer" containerID="f7325e2f133bd8a9b84c58690841dede1fe1ff86d10cd0f0298df6312d40064a"
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.826073    4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8bc989ddd-wh99s"
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.827539    4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx"
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.827531    4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx" event={"ID":"b379a820-627d-403c-b50b-b6fbea94aa65","Type":"ContainerDied","Data":"6e0b29785a4e6e0a0788e6fe250c71b8bb1b8d6f56dea6368383dd453f7f7456"}
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.829069    4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"60b8330d-375e-49a4-948b-0aaad227e09e","Type":"ContainerDied","Data":"5a32dc2a00f35187349fa55da1e46dd7d020a4f5a521c2ae089869b21c0783e4"}
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.829099    4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a32dc2a00f35187349fa55da1e46dd7d020a4f5a521c2ae089869b21c0783e4"
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.829150    4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.864359    4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx"]
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.868396    4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67677f775c-zxrmx"]
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.883224    4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8bc989ddd-wh99s"]
Mar 18 09:07:04 crc kubenswrapper[4778]: I0318 09:07:04.886471    4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-8bc989ddd-wh99s"]
Mar 18 09:07:05 crc kubenswrapper[4778]: E0318 09:07:05.563955    4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest"
Mar 18 09:07:05 crc kubenswrapper[4778]: E0318 09:07:05.564488    4778 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 18 09:07:05 crc kubenswrapper[4778]: 	container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve
Mar 18 09:07:05 crc kubenswrapper[4778]: 	],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-55w8z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29563746-b66f7_openshift-infra(c3be356e-94af-47db-a182-dd8a57024619): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled
Mar 18 09:07:05 crc kubenswrapper[4778]: 	> logger="UnhandledError"
Mar 18 09:07:05 crc kubenswrapper[4778]: E0318 09:07:05.565631    4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29563746-b66f7" podUID="c3be356e-94af-47db-a182-dd8a57024619"
Mar 18 09:07:05 crc kubenswrapper[4778]: E0318 09:07:05.835707    4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29563746-b66f7" podUID="c3be356e-94af-47db-a182-dd8a57024619"
Mar 18 09:07:06 crc kubenswrapper[4778]: I0318 09:07:06.195515    4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b379a820-627d-403c-b50b-b6fbea94aa65" path="/var/lib/kubelet/pods/b379a820-627d-403c-b50b-b6fbea94aa65/volumes"
Mar 18 09:07:06 crc kubenswrapper[4778]: I0318 09:07:06.196150    4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2110837-0c54-448f-8b94-68bdea470d14" path="/var/lib/kubelet/pods/c2110837-0c54-448f-8b94-68bdea470d14/volumes"
Mar 18 09:07:08 crc kubenswrapper[4778]: E0318 09:07:08.372161    4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest"
Mar 18 09:07:08 crc kubenswrapper[4778]: E0318 09:07:08.373011    4778 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 18 09:07:08 crc kubenswrapper[4778]: 	container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve
Mar 18 09:07:08 crc kubenswrapper[4778]: 	],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zjwps,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29563744-btdt7_openshift-infra(54961f10-93b0-433f-8a7d-b30d69178e9a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled
Mar 18 09:07:08 crc kubenswrapper[4778]: 	> logger="UnhandledError"
Mar 18 09:07:08 crc kubenswrapper[4778]: E0318 09:07:08.374134    4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29563744-btdt7" podUID="54961f10-93b0-433f-8a7d-b30d69178e9a"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.801337    4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-56dc87ff66-nlfnr"]
Mar 18 09:07:08 crc kubenswrapper[4778]: E0318 09:07:08.801813    4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b379a820-627d-403c-b50b-b6fbea94aa65" containerName="route-controller-manager"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.801825    4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b379a820-627d-403c-b50b-b6fbea94aa65" containerName="route-controller-manager"
Mar 18 09:07:08 crc kubenswrapper[4778]: E0318 09:07:08.801847    4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b8330d-375e-49a4-948b-0aaad227e09e" containerName="pruner"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.801853    4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b8330d-375e-49a4-948b-0aaad227e09e" containerName="pruner"
Mar 18 09:07:08 crc kubenswrapper[4778]: E0318 09:07:08.801865    4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2110837-0c54-448f-8b94-68bdea470d14" containerName="controller-manager"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.801872    4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2110837-0c54-448f-8b94-68bdea470d14" containerName="controller-manager"
Mar 18 09:07:08 crc kubenswrapper[4778]: E0318 09:07:08.801880    4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7a7245b-196d-4cea-916b-858e30dcc936" containerName="pruner"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.801885    4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7a7245b-196d-4cea-916b-858e30dcc936" containerName="pruner"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.801982    4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7a7245b-196d-4cea-916b-858e30dcc936" containerName="pruner"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.801994    4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b379a820-627d-403c-b50b-b6fbea94aa65" containerName="route-controller-manager"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.802001    4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="60b8330d-375e-49a4-948b-0aaad227e09e" containerName="pruner"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.802008    4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2110837-0c54-448f-8b94-68bdea470d14" containerName="controller-manager"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.802451    4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.810469    4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.810685    4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.810670    4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w"]
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.810879    4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.812362    4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.816544    4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-client-ca\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.816602    4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6d27b9b-6d87-4aa8-abee-5d0323e96304-serving-cert\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.816632    4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv4jr\" (UniqueName: \"kubernetes.io/projected/a4345a66-5037-444e-a1e8-c16f21fbdaca-kube-api-access-fv4jr\") pod \"route-controller-manager-659fff47c9-56v2w\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.816653    4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4345a66-5037-444e-a1e8-c16f21fbdaca-serving-cert\") pod \"route-controller-manager-659fff47c9-56v2w\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.816692    4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4345a66-5037-444e-a1e8-c16f21fbdaca-config\") pod \"route-controller-manager-659fff47c9-56v2w\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.816718    4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rrfs\" (UniqueName: \"kubernetes.io/projected/b6d27b9b-6d87-4aa8-abee-5d0323e96304-kube-api-access-6rrfs\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.817653    4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-config\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.817699    4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4345a66-5037-444e-a1e8-c16f21fbdaca-client-ca\") pod \"route-controller-manager-659fff47c9-56v2w\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.817775    4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-proxy-ca-bundles\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.818266    4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.818434    4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.818435    4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.818458    4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.821861    4778 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.822081 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.822327 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.822566 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.823561 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.824366 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.825817 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56dc87ff66-nlfnr"] Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.828596 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w"] Mar 18 09:07:08 crc kubenswrapper[4778]: E0318 09:07:08.856630 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29563744-btdt7" podUID="54961f10-93b0-433f-8a7d-b30d69178e9a" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.918975 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-proxy-ca-bundles\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.919395 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-client-ca\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.919535 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6d27b9b-6d87-4aa8-abee-5d0323e96304-serving-cert\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.919650 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv4jr\" (UniqueName: \"kubernetes.io/projected/a4345a66-5037-444e-a1e8-c16f21fbdaca-kube-api-access-fv4jr\") pod \"route-controller-manager-659fff47c9-56v2w\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.919738 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4345a66-5037-444e-a1e8-c16f21fbdaca-serving-cert\") pod \"route-controller-manager-659fff47c9-56v2w\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:08 crc 
kubenswrapper[4778]: I0318 09:07:08.919857 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4345a66-5037-444e-a1e8-c16f21fbdaca-config\") pod \"route-controller-manager-659fff47c9-56v2w\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.919942 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rrfs\" (UniqueName: \"kubernetes.io/projected/b6d27b9b-6d87-4aa8-abee-5d0323e96304-kube-api-access-6rrfs\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.920031 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-config\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.920122 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4345a66-5037-444e-a1e8-c16f21fbdaca-client-ca\") pod \"route-controller-manager-659fff47c9-56v2w\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.921176 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-client-ca\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: 
\"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.921595 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-config\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.922577 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4345a66-5037-444e-a1e8-c16f21fbdaca-config\") pod \"route-controller-manager-659fff47c9-56v2w\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.922580 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4345a66-5037-444e-a1e8-c16f21fbdaca-client-ca\") pod \"route-controller-manager-659fff47c9-56v2w\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.929718 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-proxy-ca-bundles\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.930250 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a4345a66-5037-444e-a1e8-c16f21fbdaca-serving-cert\") pod \"route-controller-manager-659fff47c9-56v2w\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.934701 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6d27b9b-6d87-4aa8-abee-5d0323e96304-serving-cert\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.935826 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rrfs\" (UniqueName: \"kubernetes.io/projected/b6d27b9b-6d87-4aa8-abee-5d0323e96304-kube-api-access-6rrfs\") pod \"controller-manager-56dc87ff66-nlfnr\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:08 crc kubenswrapper[4778]: I0318 09:07:08.936291 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv4jr\" (UniqueName: \"kubernetes.io/projected/a4345a66-5037-444e-a1e8-c16f21fbdaca-kube-api-access-fv4jr\") pod \"route-controller-manager-659fff47c9-56v2w\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:09 crc kubenswrapper[4778]: I0318 09:07:09.130108 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:09 crc kubenswrapper[4778]: I0318 09:07:09.144219 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:10 crc kubenswrapper[4778]: I0318 09:07:10.517283 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vgscx" Mar 18 09:07:15 crc kubenswrapper[4778]: I0318 09:07:15.633988 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-56dc87ff66-nlfnr"] Mar 18 09:07:15 crc kubenswrapper[4778]: I0318 09:07:15.725836 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w"] Mar 18 09:07:15 crc kubenswrapper[4778]: I0318 09:07:15.851279 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 09:07:15 crc kubenswrapper[4778]: I0318 09:07:15.852639 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 09:07:15 crc kubenswrapper[4778]: I0318 09:07:15.856091 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 09:07:15 crc kubenswrapper[4778]: I0318 09:07:15.856279 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 09:07:15 crc kubenswrapper[4778]: I0318 09:07:15.860534 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 09:07:15 crc kubenswrapper[4778]: I0318 09:07:15.937792 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4c5110c4-3fed-4837-b17c-6578b2034f13-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4c5110c4-3fed-4837-b17c-6578b2034f13\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 09:07:15 crc kubenswrapper[4778]: I0318 09:07:15.937838 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c5110c4-3fed-4837-b17c-6578b2034f13-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4c5110c4-3fed-4837-b17c-6578b2034f13\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 09:07:16 crc kubenswrapper[4778]: I0318 09:07:16.039219 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4c5110c4-3fed-4837-b17c-6578b2034f13-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4c5110c4-3fed-4837-b17c-6578b2034f13\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 09:07:16 crc kubenswrapper[4778]: I0318 09:07:16.039276 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/4c5110c4-3fed-4837-b17c-6578b2034f13-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4c5110c4-3fed-4837-b17c-6578b2034f13\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 09:07:16 crc kubenswrapper[4778]: I0318 09:07:16.039510 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4c5110c4-3fed-4837-b17c-6578b2034f13-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4c5110c4-3fed-4837-b17c-6578b2034f13\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 09:07:16 crc kubenswrapper[4778]: I0318 09:07:16.066254 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c5110c4-3fed-4837-b17c-6578b2034f13-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4c5110c4-3fed-4837-b17c-6578b2034f13\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 09:07:16 crc kubenswrapper[4778]: I0318 09:07:16.171407 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 09:07:17 crc kubenswrapper[4778]: E0318 09:07:17.228943 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 18 09:07:17 crc kubenswrapper[4778]: E0318 09:07:17.229127 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fp6w4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Containe
rResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-6kvnk_openshift-marketplace(938982a6-57b0-4870-abed-a98c42196ae6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 09:07:17 crc kubenswrapper[4778]: E0318 09:07:17.231334 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-6kvnk" podUID="938982a6-57b0-4870-abed-a98c42196ae6" Mar 18 09:07:18 crc kubenswrapper[4778]: E0318 09:07:18.944649 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-6kvnk" podUID="938982a6-57b0-4870-abed-a98c42196ae6" Mar 18 09:07:18 crc kubenswrapper[4778]: I0318 09:07:18.952282 4778 scope.go:117] "RemoveContainer" containerID="c52ef847baaab8c966dfa357e5150a1446e9ecf64dd0cf070d104b1c9769de57" Mar 18 09:07:19 crc kubenswrapper[4778]: E0318 09:07:19.045908 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 18 09:07:19 crc kubenswrapper[4778]: E0318 09:07:19.046103 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gnfhp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-rscg9_openshift-marketplace(3aedbf59-d23d-409e-9742-09824ed6ef2a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 09:07:19 crc kubenswrapper[4778]: E0318 09:07:19.047426 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-rscg9" podUID="3aedbf59-d23d-409e-9742-09824ed6ef2a" Mar 18 09:07:19 crc 
kubenswrapper[4778]: E0318 09:07:19.085620 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 18 09:07:19 crc kubenswrapper[4778]: E0318 09:07:19.085775 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qf8tc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-6qgm2_openshift-marketplace(57f9c6f6-c20e-4e28-aec4-f0104ddb2b47): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 09:07:19 crc kubenswrapper[4778]: E0318 09:07:19.087472 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-6qgm2" podUID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" Mar 18 09:07:19 crc kubenswrapper[4778]: E0318 09:07:19.160014 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 18 09:07:19 crc kubenswrapper[4778]: E0318 09:07:19.160255 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dlv42,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-t74gc_openshift-marketplace(01828fdf-ef1b-44e3-905b-aec0c6aaa44f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 09:07:19 crc kubenswrapper[4778]: E0318 09:07:19.162370 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-t74gc" podUID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" Mar 18 09:07:19 crc 
kubenswrapper[4778]: I0318 09:07:19.550180 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-56dc87ff66-nlfnr"] Mar 18 09:07:19 crc kubenswrapper[4778]: I0318 09:07:19.609449 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 09:07:19 crc kubenswrapper[4778]: W0318 09:07:19.617264 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4c5110c4_3fed_4837_b17c_6578b2034f13.slice/crio-e4ffa2d08ea1bfd2b7fb4d426ddda63210ee2a45b84bf36efb4b50a85751c1c4 WatchSource:0}: Error finding container e4ffa2d08ea1bfd2b7fb4d426ddda63210ee2a45b84bf36efb4b50a85751c1c4: Status 404 returned error can't find the container with id e4ffa2d08ea1bfd2b7fb4d426ddda63210ee2a45b84bf36efb4b50a85751c1c4 Mar 18 09:07:19 crc kubenswrapper[4778]: I0318 09:07:19.626553 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w"] Mar 18 09:07:19 crc kubenswrapper[4778]: W0318 09:07:19.630628 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4345a66_5037_444e_a1e8_c16f21fbdaca.slice/crio-7757ee38844688929de1c12404b0d4e708b8a04f3d6ee943e8d293f97184928f WatchSource:0}: Error finding container 7757ee38844688929de1c12404b0d4e708b8a04f3d6ee943e8d293f97184928f: Status 404 returned error can't find the container with id 7757ee38844688929de1c12404b0d4e708b8a04f3d6ee943e8d293f97184928f Mar 18 09:07:19 crc kubenswrapper[4778]: I0318 09:07:19.919886 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzrtd" event={"ID":"3618fc0f-e8b2-4476-a24d-662165a04ecc","Type":"ContainerStarted","Data":"db7b16ab120c184db5ebadc2caf608fa9242b9a332050072ad2cae2fba3722b7"} Mar 18 09:07:19 crc kubenswrapper[4778]: I0318 09:07:19.922051 4778 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zwknx" event={"ID":"23b7607d-fa16-45c1-a0cb-c5ec39a288fb","Type":"ContainerStarted","Data":"8b9319e52264a71946e18a681c32dbd6ffc04e6afcc03a59b8bfa719c7422b7f"} Mar 18 09:07:19 crc kubenswrapper[4778]: I0318 09:07:19.923337 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4c5110c4-3fed-4837-b17c-6578b2034f13","Type":"ContainerStarted","Data":"e4ffa2d08ea1bfd2b7fb4d426ddda63210ee2a45b84bf36efb4b50a85751c1c4"} Mar 18 09:07:19 crc kubenswrapper[4778]: I0318 09:07:19.927879 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qvn4w" event={"ID":"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc","Type":"ContainerStarted","Data":"206c2187bf0a136c6ff49b69bb1bb6cc918dc7e2a3d9bd6a2d8bac6ce3a51e5f"} Mar 18 09:07:19 crc kubenswrapper[4778]: I0318 09:07:19.929209 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" event={"ID":"a4345a66-5037-444e-a1e8-c16f21fbdaca","Type":"ContainerStarted","Data":"7757ee38844688929de1c12404b0d4e708b8a04f3d6ee943e8d293f97184928f"} Mar 18 09:07:19 crc kubenswrapper[4778]: I0318 09:07:19.931911 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbbtb" event={"ID":"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa","Type":"ContainerStarted","Data":"080f512015bd3cc96010f06c24c5ff7c172a932d2f9c91b8b4e05d0e6fdb8776"} Mar 18 09:07:19 crc kubenswrapper[4778]: I0318 09:07:19.932999 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" event={"ID":"b6d27b9b-6d87-4aa8-abee-5d0323e96304","Type":"ContainerStarted","Data":"a122f5ee8353a3e56d947f8d425d42ac2f3e6f348d5fc80375524cfbc8e649c9"} Mar 18 09:07:19 crc kubenswrapper[4778]: E0318 09:07:19.934397 4778 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6qgm2" podUID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" Mar 18 09:07:19 crc kubenswrapper[4778]: E0318 09:07:19.934730 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-t74gc" podUID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" Mar 18 09:07:19 crc kubenswrapper[4778]: E0318 09:07:19.935055 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-rscg9" podUID="3aedbf59-d23d-409e-9742-09824ed6ef2a" Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.942578 4778 generic.go:334] "Generic (PLEG): container finished" podID="4c5110c4-3fed-4837-b17c-6578b2034f13" containerID="59e97b61a0f05736ea90d2008ef8d588313cdc9a879cf789dff3d41032af56db" exitCode=0 Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.942677 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4c5110c4-3fed-4837-b17c-6578b2034f13","Type":"ContainerDied","Data":"59e97b61a0f05736ea90d2008ef8d588313cdc9a879cf789dff3d41032af56db"} Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.946546 4778 generic.go:334] "Generic (PLEG): container finished" podID="b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" containerID="080f512015bd3cc96010f06c24c5ff7c172a932d2f9c91b8b4e05d0e6fdb8776" exitCode=0 Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.946605 4778 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbbtb" event={"ID":"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa","Type":"ContainerDied","Data":"080f512015bd3cc96010f06c24c5ff7c172a932d2f9c91b8b4e05d0e6fdb8776"} Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.948480 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" event={"ID":"b6d27b9b-6d87-4aa8-abee-5d0323e96304","Type":"ContainerStarted","Data":"9dc0270e3833a9e7f231ec92032b5254e8de6e37228c467052e88b73ed8dd1ef"} Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.948752 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" podUID="b6d27b9b-6d87-4aa8-abee-5d0323e96304" containerName="controller-manager" containerID="cri-o://9dc0270e3833a9e7f231ec92032b5254e8de6e37228c467052e88b73ed8dd1ef" gracePeriod=30 Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.948953 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.951037 4778 generic.go:334] "Generic (PLEG): container finished" podID="ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" containerID="206c2187bf0a136c6ff49b69bb1bb6cc918dc7e2a3d9bd6a2d8bac6ce3a51e5f" exitCode=0 Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.951087 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qvn4w" event={"ID":"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc","Type":"ContainerDied","Data":"206c2187bf0a136c6ff49b69bb1bb6cc918dc7e2a3d9bd6a2d8bac6ce3a51e5f"} Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.954477 4778 generic.go:334] "Generic (PLEG): container finished" podID="3618fc0f-e8b2-4476-a24d-662165a04ecc" containerID="db7b16ab120c184db5ebadc2caf608fa9242b9a332050072ad2cae2fba3722b7" 
exitCode=0 Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.954544 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzrtd" event={"ID":"3618fc0f-e8b2-4476-a24d-662165a04ecc","Type":"ContainerDied","Data":"db7b16ab120c184db5ebadc2caf608fa9242b9a332050072ad2cae2fba3722b7"} Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.957462 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.960059 4778 generic.go:334] "Generic (PLEG): container finished" podID="23b7607d-fa16-45c1-a0cb-c5ec39a288fb" containerID="8b9319e52264a71946e18a681c32dbd6ffc04e6afcc03a59b8bfa719c7422b7f" exitCode=0 Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.960142 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zwknx" event={"ID":"23b7607d-fa16-45c1-a0cb-c5ec39a288fb","Type":"ContainerDied","Data":"8b9319e52264a71946e18a681c32dbd6ffc04e6afcc03a59b8bfa719c7422b7f"} Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.966495 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" event={"ID":"a4345a66-5037-444e-a1e8-c16f21fbdaca","Type":"ContainerStarted","Data":"80caeba679f5d418af7acb193febae34c30e2113a5b3eda92aafc367f3b5c972"} Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.966677 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" podUID="a4345a66-5037-444e-a1e8-c16f21fbdaca" containerName="route-controller-manager" containerID="cri-o://80caeba679f5d418af7acb193febae34c30e2113a5b3eda92aafc367f3b5c972" gracePeriod=30 Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.966923 4778 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:20 crc kubenswrapper[4778]: I0318 09:07:20.975232 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:21 crc kubenswrapper[4778]: I0318 09:07:21.070957 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" podStartSLOduration=26.070942226 podStartE2EDuration="26.070942226s" podCreationTimestamp="2026-03-18 09:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:07:21.067958391 +0000 UTC m=+307.642703231" watchObservedRunningTime="2026-03-18 09:07:21.070942226 +0000 UTC m=+307.645687066" Mar 18 09:07:21 crc kubenswrapper[4778]: I0318 09:07:21.088821 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" podStartSLOduration=26.088804542 podStartE2EDuration="26.088804542s" podCreationTimestamp="2026-03-18 09:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:07:21.086086635 +0000 UTC m=+307.660831485" watchObservedRunningTime="2026-03-18 09:07:21.088804542 +0000 UTC m=+307.663549382" Mar 18 09:07:21 crc kubenswrapper[4778]: I0318 09:07:21.974415 4778 generic.go:334] "Generic (PLEG): container finished" podID="b6d27b9b-6d87-4aa8-abee-5d0323e96304" containerID="9dc0270e3833a9e7f231ec92032b5254e8de6e37228c467052e88b73ed8dd1ef" exitCode=0 Mar 18 09:07:21 crc kubenswrapper[4778]: I0318 09:07:21.974527 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" 
event={"ID":"b6d27b9b-6d87-4aa8-abee-5d0323e96304","Type":"ContainerDied","Data":"9dc0270e3833a9e7f231ec92032b5254e8de6e37228c467052e88b73ed8dd1ef"} Mar 18 09:07:21 crc kubenswrapper[4778]: I0318 09:07:21.976905 4778 generic.go:334] "Generic (PLEG): container finished" podID="a4345a66-5037-444e-a1e8-c16f21fbdaca" containerID="80caeba679f5d418af7acb193febae34c30e2113a5b3eda92aafc367f3b5c972" exitCode=0 Mar 18 09:07:21 crc kubenswrapper[4778]: I0318 09:07:21.977021 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" event={"ID":"a4345a66-5037-444e-a1e8-c16f21fbdaca","Type":"ContainerDied","Data":"80caeba679f5d418af7acb193febae34c30e2113a5b3eda92aafc367f3b5c972"} Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.240789 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.333928 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4c5110c4-3fed-4837-b17c-6578b2034f13-kubelet-dir\") pod \"4c5110c4-3fed-4837-b17c-6578b2034f13\" (UID: \"4c5110c4-3fed-4837-b17c-6578b2034f13\") " Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.334448 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c5110c4-3fed-4837-b17c-6578b2034f13-kube-api-access\") pod \"4c5110c4-3fed-4837-b17c-6578b2034f13\" (UID: \"4c5110c4-3fed-4837-b17c-6578b2034f13\") " Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.334048 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c5110c4-3fed-4837-b17c-6578b2034f13-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4c5110c4-3fed-4837-b17c-6578b2034f13" (UID: "4c5110c4-3fed-4837-b17c-6578b2034f13"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.334765 4778 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4c5110c4-3fed-4837-b17c-6578b2034f13-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.342282 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c5110c4-3fed-4837-b17c-6578b2034f13-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4c5110c4-3fed-4837-b17c-6578b2034f13" (UID: "4c5110c4-3fed-4837-b17c-6578b2034f13"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.435850 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c5110c4-3fed-4837-b17c-6578b2034f13-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.484147 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.537248 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4345a66-5037-444e-a1e8-c16f21fbdaca-config\") pod \"a4345a66-5037-444e-a1e8-c16f21fbdaca\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.537369 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4345a66-5037-444e-a1e8-c16f21fbdaca-client-ca\") pod \"a4345a66-5037-444e-a1e8-c16f21fbdaca\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.537417 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4345a66-5037-444e-a1e8-c16f21fbdaca-serving-cert\") pod \"a4345a66-5037-444e-a1e8-c16f21fbdaca\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.537485 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv4jr\" (UniqueName: \"kubernetes.io/projected/a4345a66-5037-444e-a1e8-c16f21fbdaca-kube-api-access-fv4jr\") pod \"a4345a66-5037-444e-a1e8-c16f21fbdaca\" (UID: \"a4345a66-5037-444e-a1e8-c16f21fbdaca\") " Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.539884 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4345a66-5037-444e-a1e8-c16f21fbdaca-client-ca" (OuterVolumeSpecName: "client-ca") pod "a4345a66-5037-444e-a1e8-c16f21fbdaca" (UID: "a4345a66-5037-444e-a1e8-c16f21fbdaca"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.539921 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4345a66-5037-444e-a1e8-c16f21fbdaca-config" (OuterVolumeSpecName: "config") pod "a4345a66-5037-444e-a1e8-c16f21fbdaca" (UID: "a4345a66-5037-444e-a1e8-c16f21fbdaca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.542557 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4345a66-5037-444e-a1e8-c16f21fbdaca-kube-api-access-fv4jr" (OuterVolumeSpecName: "kube-api-access-fv4jr") pod "a4345a66-5037-444e-a1e8-c16f21fbdaca" (UID: "a4345a66-5037-444e-a1e8-c16f21fbdaca"). InnerVolumeSpecName "kube-api-access-fv4jr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.548535 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4345a66-5037-444e-a1e8-c16f21fbdaca-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a4345a66-5037-444e-a1e8-c16f21fbdaca" (UID: "a4345a66-5037-444e-a1e8-c16f21fbdaca"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.578276 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.638847 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-client-ca\") pod \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.638977 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rrfs\" (UniqueName: \"kubernetes.io/projected/b6d27b9b-6d87-4aa8-abee-5d0323e96304-kube-api-access-6rrfs\") pod \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.639018 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-config\") pod \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.639091 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6d27b9b-6d87-4aa8-abee-5d0323e96304-serving-cert\") pod \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.639138 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-proxy-ca-bundles\") pod \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\" (UID: \"b6d27b9b-6d87-4aa8-abee-5d0323e96304\") " Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.639469 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv4jr\" 
(UniqueName: \"kubernetes.io/projected/a4345a66-5037-444e-a1e8-c16f21fbdaca-kube-api-access-fv4jr\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.639493 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4345a66-5037-444e-a1e8-c16f21fbdaca-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.639503 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4345a66-5037-444e-a1e8-c16f21fbdaca-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.639512 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4345a66-5037-444e-a1e8-c16f21fbdaca-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.640615 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b6d27b9b-6d87-4aa8-abee-5d0323e96304" (UID: "b6d27b9b-6d87-4aa8-abee-5d0323e96304"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.641037 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-client-ca" (OuterVolumeSpecName: "client-ca") pod "b6d27b9b-6d87-4aa8-abee-5d0323e96304" (UID: "b6d27b9b-6d87-4aa8-abee-5d0323e96304"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.642450 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-config" (OuterVolumeSpecName: "config") pod "b6d27b9b-6d87-4aa8-abee-5d0323e96304" (UID: "b6d27b9b-6d87-4aa8-abee-5d0323e96304"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.646335 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6d27b9b-6d87-4aa8-abee-5d0323e96304-kube-api-access-6rrfs" (OuterVolumeSpecName: "kube-api-access-6rrfs") pod "b6d27b9b-6d87-4aa8-abee-5d0323e96304" (UID: "b6d27b9b-6d87-4aa8-abee-5d0323e96304"). InnerVolumeSpecName "kube-api-access-6rrfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.661495 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6d27b9b-6d87-4aa8-abee-5d0323e96304-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b6d27b9b-6d87-4aa8-abee-5d0323e96304" (UID: "b6d27b9b-6d87-4aa8-abee-5d0323e96304"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.740819 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6d27b9b-6d87-4aa8-abee-5d0323e96304-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.740908 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.740921 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.740930 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rrfs\" (UniqueName: \"kubernetes.io/projected/b6d27b9b-6d87-4aa8-abee-5d0323e96304-kube-api-access-6rrfs\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.740965 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6d27b9b-6d87-4aa8-abee-5d0323e96304-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.988781 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbbtb" event={"ID":"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa","Type":"ContainerStarted","Data":"6d51e3ef30717618dbfd60d2700ddf309051ceaf03fe8bc0561397808fdc4760"} Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.993643 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563744-btdt7" 
event={"ID":"54961f10-93b0-433f-8a7d-b30d69178e9a","Type":"ContainerStarted","Data":"2aaa498f349b23cf7a4f0fb9da41ba553f76ed88636548c40f4f1cf1a8220b22"} Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.996798 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" event={"ID":"b6d27b9b-6d87-4aa8-abee-5d0323e96304","Type":"ContainerDied","Data":"a122f5ee8353a3e56d947f8d425d42ac2f3e6f348d5fc80375524cfbc8e649c9"} Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.996827 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56dc87ff66-nlfnr" Mar 18 09:07:22 crc kubenswrapper[4778]: I0318 09:07:22.996870 4778 scope.go:117] "RemoveContainer" containerID="9dc0270e3833a9e7f231ec92032b5254e8de6e37228c467052e88b73ed8dd1ef" Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.002353 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzrtd" event={"ID":"3618fc0f-e8b2-4476-a24d-662165a04ecc","Type":"ContainerStarted","Data":"afc4a86783668e2ef48e83935f1494b68e42a6641cb083187f72530915bb720f"} Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.007499 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563746-b66f7" event={"ID":"c3be356e-94af-47db-a182-dd8a57024619","Type":"ContainerStarted","Data":"cf4a9ddbe48af9c3f976ba168fa13253c79814734a6e5e0e3ef5fa348e79df80"} Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.025501 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zwknx" event={"ID":"23b7607d-fa16-45c1-a0cb-c5ec39a288fb","Type":"ContainerStarted","Data":"de8929b794bb5d2a7228965e1493c7e14a2590362f85b344f68eb15fc61bd4bf"} Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.031591 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" event={"ID":"a4345a66-5037-444e-a1e8-c16f21fbdaca","Type":"ContainerDied","Data":"7757ee38844688929de1c12404b0d4e708b8a04f3d6ee943e8d293f97184928f"} Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.031734 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w" Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.035643 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4c5110c4-3fed-4837-b17c-6578b2034f13","Type":"ContainerDied","Data":"e4ffa2d08ea1bfd2b7fb4d426ddda63210ee2a45b84bf36efb4b50a85751c1c4"} Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.035671 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4ffa2d08ea1bfd2b7fb4d426ddda63210ee2a45b84bf36efb4b50a85751c1c4" Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.035716 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.037183 4778 scope.go:117] "RemoveContainer" containerID="80caeba679f5d418af7acb193febae34c30e2113a5b3eda92aafc367f3b5c972" Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.041517 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tbbtb" podStartSLOduration=3.451476477 podStartE2EDuration="47.041489973s" podCreationTimestamp="2026-03-18 09:06:36 +0000 UTC" firstStartedPulling="2026-03-18 09:06:39.112018937 +0000 UTC m=+265.686763767" lastFinishedPulling="2026-03-18 09:07:22.702032423 +0000 UTC m=+309.276777263" observedRunningTime="2026-03-18 09:07:23.019634734 +0000 UTC m=+309.594379574" watchObservedRunningTime="2026-03-18 09:07:23.041489973 +0000 UTC m=+309.616234843" Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.042429 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563744-btdt7" podStartSLOduration=152.665405137 podStartE2EDuration="3m23.042419119s" podCreationTimestamp="2026-03-18 09:04:00 +0000 UTC" firstStartedPulling="2026-03-18 09:06:32.037264405 +0000 UTC m=+258.612009245" lastFinishedPulling="2026-03-18 09:07:22.414278387 +0000 UTC m=+308.989023227" observedRunningTime="2026-03-18 09:07:23.037476139 +0000 UTC m=+309.612220999" watchObservedRunningTime="2026-03-18 09:07:23.042419119 +0000 UTC m=+309.617163959" Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.061378 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563746-b66f7" podStartSLOduration=32.830250131 podStartE2EDuration="1m23.061358116s" podCreationTimestamp="2026-03-18 09:06:00 +0000 UTC" firstStartedPulling="2026-03-18 09:06:32.036186506 +0000 UTC m=+258.610931346" lastFinishedPulling="2026-03-18 09:07:22.267294491 +0000 UTC m=+308.842039331" 
observedRunningTime="2026-03-18 09:07:23.053358589 +0000 UTC m=+309.628103419" watchObservedRunningTime="2026-03-18 09:07:23.061358116 +0000 UTC m=+309.636102956" Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.075696 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lzrtd" podStartSLOduration=2.429990659 podStartE2EDuration="46.075673392s" podCreationTimestamp="2026-03-18 09:06:37 +0000 UTC" firstStartedPulling="2026-03-18 09:06:39.144840117 +0000 UTC m=+265.719584967" lastFinishedPulling="2026-03-18 09:07:22.79052286 +0000 UTC m=+309.365267700" observedRunningTime="2026-03-18 09:07:23.075055225 +0000 UTC m=+309.649800065" watchObservedRunningTime="2026-03-18 09:07:23.075673392 +0000 UTC m=+309.650418232" Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.093410 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w"] Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.096942 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659fff47c9-56v2w"] Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.167257 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zwknx" podStartSLOduration=2.6196814760000002 podStartE2EDuration="46.167228067s" podCreationTimestamp="2026-03-18 09:06:37 +0000 UTC" firstStartedPulling="2026-03-18 09:06:39.153072221 +0000 UTC m=+265.727817071" lastFinishedPulling="2026-03-18 09:07:22.700618822 +0000 UTC m=+309.275363662" observedRunningTime="2026-03-18 09:07:23.15749652 +0000 UTC m=+309.732241380" watchObservedRunningTime="2026-03-18 09:07:23.167228067 +0000 UTC m=+309.741972917" Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.172101 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-56dc87ff66-nlfnr"] Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.178390 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-56dc87ff66-nlfnr"] Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.221689 4778 csr.go:261] certificate signing request csr-zckbj is approved, waiting to be issued Mar 18 09:07:23 crc kubenswrapper[4778]: I0318 09:07:23.229628 4778 csr.go:257] certificate signing request csr-zckbj is issued Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.047923 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qvn4w" event={"ID":"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc","Type":"ContainerStarted","Data":"70cbb8df67b66047dc2936fa97099c75389be5914cb70774619b80a3c1ca3b41"} Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.050003 4778 generic.go:334] "Generic (PLEG): container finished" podID="c3be356e-94af-47db-a182-dd8a57024619" containerID="cf4a9ddbe48af9c3f976ba168fa13253c79814734a6e5e0e3ef5fa348e79df80" exitCode=0 Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.050122 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563746-b66f7" event={"ID":"c3be356e-94af-47db-a182-dd8a57024619","Type":"ContainerDied","Data":"cf4a9ddbe48af9c3f976ba168fa13253c79814734a6e5e0e3ef5fa348e79df80"} Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.053516 4778 generic.go:334] "Generic (PLEG): container finished" podID="54961f10-93b0-433f-8a7d-b30d69178e9a" containerID="2aaa498f349b23cf7a4f0fb9da41ba553f76ed88636548c40f4f1cf1a8220b22" exitCode=0 Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.053584 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563744-btdt7" 
event={"ID":"54961f10-93b0-433f-8a7d-b30d69178e9a","Type":"ContainerDied","Data":"2aaa498f349b23cf7a4f0fb9da41ba553f76ed88636548c40f4f1cf1a8220b22"} Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.068805 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qvn4w" podStartSLOduration=3.253544468 podStartE2EDuration="47.068787667s" podCreationTimestamp="2026-03-18 09:06:37 +0000 UTC" firstStartedPulling="2026-03-18 09:06:39.123403949 +0000 UTC m=+265.698148789" lastFinishedPulling="2026-03-18 09:07:22.938647138 +0000 UTC m=+309.513391988" observedRunningTime="2026-03-18 09:07:24.067943333 +0000 UTC m=+310.642688173" watchObservedRunningTime="2026-03-18 09:07:24.068787667 +0000 UTC m=+310.643532507" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.194347 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4345a66-5037-444e-a1e8-c16f21fbdaca" path="/var/lib/kubelet/pods/a4345a66-5037-444e-a1e8-c16f21fbdaca/volumes" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.195010 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6d27b9b-6d87-4aa8-abee-5d0323e96304" path="/var/lib/kubelet/pods/b6d27b9b-6d87-4aa8-abee-5d0323e96304/volumes" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.231995 4778 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-11 15:03:05.394962459 +0000 UTC Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.232090 4778 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7181h55m41.162875563s for next certificate rotation Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.448391 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 09:07:24 crc kubenswrapper[4778]: E0318 09:07:24.448653 4778 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="4c5110c4-3fed-4837-b17c-6578b2034f13" containerName="pruner" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.448668 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c5110c4-3fed-4837-b17c-6578b2034f13" containerName="pruner" Mar 18 09:07:24 crc kubenswrapper[4778]: E0318 09:07:24.448687 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6d27b9b-6d87-4aa8-abee-5d0323e96304" containerName="controller-manager" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.448696 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6d27b9b-6d87-4aa8-abee-5d0323e96304" containerName="controller-manager" Mar 18 09:07:24 crc kubenswrapper[4778]: E0318 09:07:24.448716 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4345a66-5037-444e-a1e8-c16f21fbdaca" containerName="route-controller-manager" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.448724 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4345a66-5037-444e-a1e8-c16f21fbdaca" containerName="route-controller-manager" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.448850 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c5110c4-3fed-4837-b17c-6578b2034f13" containerName="pruner" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.448862 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6d27b9b-6d87-4aa8-abee-5d0323e96304" containerName="controller-manager" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.448870 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4345a66-5037-444e-a1e8-c16f21fbdaca" containerName="route-controller-manager" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.449307 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.451844 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.454385 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.464402 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.470171 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93723e0d-2243-4390-a667-8a080325205f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"93723e0d-2243-4390-a667-8a080325205f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.470508 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/93723e0d-2243-4390-a667-8a080325205f-var-lock\") pod \"installer-9-crc\" (UID: \"93723e0d-2243-4390-a667-8a080325205f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.470589 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93723e0d-2243-4390-a667-8a080325205f-kube-api-access\") pod \"installer-9-crc\" (UID: \"93723e0d-2243-4390-a667-8a080325205f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.571370 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/93723e0d-2243-4390-a667-8a080325205f-var-lock\") pod \"installer-9-crc\" (UID: \"93723e0d-2243-4390-a667-8a080325205f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.571436 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93723e0d-2243-4390-a667-8a080325205f-kube-api-access\") pod \"installer-9-crc\" (UID: \"93723e0d-2243-4390-a667-8a080325205f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.571487 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93723e0d-2243-4390-a667-8a080325205f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"93723e0d-2243-4390-a667-8a080325205f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.571570 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93723e0d-2243-4390-a667-8a080325205f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"93723e0d-2243-4390-a667-8a080325205f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.571623 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/93723e0d-2243-4390-a667-8a080325205f-var-lock\") pod \"installer-9-crc\" (UID: \"93723e0d-2243-4390-a667-8a080325205f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.598980 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93723e0d-2243-4390-a667-8a080325205f-kube-api-access\") pod \"installer-9-crc\" (UID: \"93723e0d-2243-4390-a667-8a080325205f\") " 
pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.764110 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.811761 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24"] Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.812655 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.820159 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-785666b9f5-xt96c"] Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.822011 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.834564 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.837143 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.837613 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.837805 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.840969 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 
09:07:24.841208 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.841359 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.841713 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24"] Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.843548 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.843611 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.844186 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.844399 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.852277 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.862257 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-785666b9f5-xt96c"] Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.862776 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.980307 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/180e4f84-52ed-4db3-aec6-c724becfadf1-client-ca\") pod \"route-controller-manager-65bcdc9c9c-kwh24\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.980874 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/180e4f84-52ed-4db3-aec6-c724becfadf1-serving-cert\") pod \"route-controller-manager-65bcdc9c9c-kwh24\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.980949 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-config\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.981076 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txfl6\" (UniqueName: \"kubernetes.io/projected/2cd169c1-e595-4497-856c-3dd27c1cf551-kube-api-access-txfl6\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.981105 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-proxy-ca-bundles\") pod 
\"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.981258 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cd169c1-e595-4497-856c-3dd27c1cf551-serving-cert\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.981329 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-client-ca\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.981360 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/180e4f84-52ed-4db3-aec6-c724becfadf1-config\") pod \"route-controller-manager-65bcdc9c9c-kwh24\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:24 crc kubenswrapper[4778]: I0318 09:07:24.982314 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8mw9\" (UniqueName: \"kubernetes.io/projected/180e4f84-52ed-4db3-aec6-c724becfadf1-kube-api-access-c8mw9\") pod \"route-controller-manager-65bcdc9c9c-kwh24\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 
09:07:25.085090 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txfl6\" (UniqueName: \"kubernetes.io/projected/2cd169c1-e595-4497-856c-3dd27c1cf551-kube-api-access-txfl6\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.085145 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-proxy-ca-bundles\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.085220 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cd169c1-e595-4497-856c-3dd27c1cf551-serving-cert\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.085244 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-client-ca\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.085302 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/180e4f84-52ed-4db3-aec6-c724becfadf1-config\") pod \"route-controller-manager-65bcdc9c9c-kwh24\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " 
pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.085328 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8mw9\" (UniqueName: \"kubernetes.io/projected/180e4f84-52ed-4db3-aec6-c724becfadf1-kube-api-access-c8mw9\") pod \"route-controller-manager-65bcdc9c9c-kwh24\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.085376 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/180e4f84-52ed-4db3-aec6-c724becfadf1-client-ca\") pod \"route-controller-manager-65bcdc9c9c-kwh24\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.085444 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-config\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.085460 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/180e4f84-52ed-4db3-aec6-c724becfadf1-serving-cert\") pod \"route-controller-manager-65bcdc9c9c-kwh24\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.086746 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/180e4f84-52ed-4db3-aec6-c724becfadf1-config\") pod \"route-controller-manager-65bcdc9c9c-kwh24\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.087956 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-proxy-ca-bundles\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.090032 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/180e4f84-52ed-4db3-aec6-c724becfadf1-serving-cert\") pod \"route-controller-manager-65bcdc9c9c-kwh24\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.091834 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-client-ca\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.093351 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-config\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.094228 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/180e4f84-52ed-4db3-aec6-c724becfadf1-client-ca\") pod \"route-controller-manager-65bcdc9c9c-kwh24\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.095028 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cd169c1-e595-4497-856c-3dd27c1cf551-serving-cert\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.107482 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txfl6\" (UniqueName: \"kubernetes.io/projected/2cd169c1-e595-4497-856c-3dd27c1cf551-kube-api-access-txfl6\") pod \"controller-manager-785666b9f5-xt96c\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.110903 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8mw9\" (UniqueName: \"kubernetes.io/projected/180e4f84-52ed-4db3-aec6-c724becfadf1-kube-api-access-c8mw9\") pod \"route-controller-manager-65bcdc9c9c-kwh24\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.173483 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.182152 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.237351 4778 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-20 07:23:31.301105393 +0000 UTC Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.237408 4778 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7390h16m6.063700246s for next certificate rotation Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.309075 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.353740 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563744-btdt7" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.441243 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563746-b66f7" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.491418 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjwps\" (UniqueName: \"kubernetes.io/projected/54961f10-93b0-433f-8a7d-b30d69178e9a-kube-api-access-zjwps\") pod \"54961f10-93b0-433f-8a7d-b30d69178e9a\" (UID: \"54961f10-93b0-433f-8a7d-b30d69178e9a\") " Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.491537 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55w8z\" (UniqueName: \"kubernetes.io/projected/c3be356e-94af-47db-a182-dd8a57024619-kube-api-access-55w8z\") pod \"c3be356e-94af-47db-a182-dd8a57024619\" (UID: \"c3be356e-94af-47db-a182-dd8a57024619\") " Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.503798 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3be356e-94af-47db-a182-dd8a57024619-kube-api-access-55w8z" (OuterVolumeSpecName: "kube-api-access-55w8z") pod "c3be356e-94af-47db-a182-dd8a57024619" (UID: "c3be356e-94af-47db-a182-dd8a57024619"). InnerVolumeSpecName "kube-api-access-55w8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.507627 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54961f10-93b0-433f-8a7d-b30d69178e9a-kube-api-access-zjwps" (OuterVolumeSpecName: "kube-api-access-zjwps") pod "54961f10-93b0-433f-8a7d-b30d69178e9a" (UID: "54961f10-93b0-433f-8a7d-b30d69178e9a"). InnerVolumeSpecName "kube-api-access-zjwps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.592373 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjwps\" (UniqueName: \"kubernetes.io/projected/54961f10-93b0-433f-8a7d-b30d69178e9a-kube-api-access-zjwps\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.592412 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55w8z\" (UniqueName: \"kubernetes.io/projected/c3be356e-94af-47db-a182-dd8a57024619-kube-api-access-55w8z\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.595159 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-785666b9f5-xt96c"] Mar 18 09:07:25 crc kubenswrapper[4778]: I0318 09:07:25.735319 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24"] Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.073041 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563746-b66f7" Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.073030 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563746-b66f7" event={"ID":"c3be356e-94af-47db-a182-dd8a57024619","Type":"ContainerDied","Data":"4d957b42c20ebb120c0681574b93e0b852f5977f6c96c78d95883a927b1e8844"} Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.073229 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d957b42c20ebb120c0681574b93e0b852f5977f6c96c78d95883a927b1e8844" Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.085261 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"93723e0d-2243-4390-a667-8a080325205f","Type":"ContainerStarted","Data":"e2b50472335d82d0cbf47f1b30666ae7b50519462f89aadf15ef33cf94804236"} Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.085300 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"93723e0d-2243-4390-a667-8a080325205f","Type":"ContainerStarted","Data":"e6f7bb7ae014c0c468ab5d5f0830ec452ff1c1962710492d6af9f507d40acc53"} Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.089550 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563744-btdt7" event={"ID":"54961f10-93b0-433f-8a7d-b30d69178e9a","Type":"ContainerDied","Data":"df77c4671fb6dc8dc3716ac3d7733190f2f2696ab30319657174e00cec76ec77"} Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.089624 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df77c4671fb6dc8dc3716ac3d7733190f2f2696ab30319657174e00cec76ec77" Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.089570 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563744-btdt7" Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.090967 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" event={"ID":"180e4f84-52ed-4db3-aec6-c724becfadf1","Type":"ContainerStarted","Data":"6182c0f78e6f784c954f3d716e1ee189ef539115eb9b295b92f5ba494516c3c6"} Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.091042 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" event={"ID":"180e4f84-52ed-4db3-aec6-c724becfadf1","Type":"ContainerStarted","Data":"9451f090d87a54fcdbbfb38d7aba157bfa9ee730f5dff506b43c409b10178b80"} Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.091744 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.093035 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" event={"ID":"2cd169c1-e595-4497-856c-3dd27c1cf551","Type":"ContainerStarted","Data":"218eebc0b353408962cb4ccc7a6dbcb7e94147122bce8e36e5a54e046e5b06f8"} Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.093063 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" event={"ID":"2cd169c1-e595-4497-856c-3dd27c1cf551","Type":"ContainerStarted","Data":"fd1c635644f48a9fb2e1288c201f66a5f81d58f274081aff739ae76932d747eb"} Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.093637 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.112425 4778 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.121528 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.121513343 podStartE2EDuration="2.121513343s" podCreationTimestamp="2026-03-18 09:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:07:26.118095557 +0000 UTC m=+312.692840427" watchObservedRunningTime="2026-03-18 09:07:26.121513343 +0000 UTC m=+312.696258183" Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.161250 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" podStartSLOduration=11.161220468 podStartE2EDuration="11.161220468s" podCreationTimestamp="2026-03-18 09:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:07:26.158430519 +0000 UTC m=+312.733175379" watchObservedRunningTime="2026-03-18 09:07:26.161220468 +0000 UTC m=+312.735965318" Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.184562 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" podStartSLOduration=11.18454217 podStartE2EDuration="11.18454217s" podCreationTimestamp="2026-03-18 09:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:07:26.179987211 +0000 UTC m=+312.754732061" watchObservedRunningTime="2026-03-18 09:07:26.18454217 +0000 UTC m=+312.759287010" Mar 18 09:07:26 crc kubenswrapper[4778]: I0318 09:07:26.280387 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:27 crc kubenswrapper[4778]: I0318 09:07:27.309356 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tbbtb" Mar 18 09:07:27 crc kubenswrapper[4778]: I0318 09:07:27.309413 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tbbtb" Mar 18 09:07:27 crc kubenswrapper[4778]: I0318 09:07:27.541700 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qvn4w" Mar 18 09:07:27 crc kubenswrapper[4778]: I0318 09:07:27.542794 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qvn4w" Mar 18 09:07:27 crc kubenswrapper[4778]: I0318 09:07:27.675281 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:07:27 crc kubenswrapper[4778]: I0318 09:07:27.675427 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:07:27 crc kubenswrapper[4778]: I0318 09:07:27.707900 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tbbtb" Mar 18 09:07:27 crc kubenswrapper[4778]: I0318 09:07:27.708392 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qvn4w" Mar 18 09:07:27 crc kubenswrapper[4778]: I0318 09:07:27.722526 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:07:28 crc kubenswrapper[4778]: I0318 09:07:28.117672 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:07:28 crc kubenswrapper[4778]: I0318 09:07:28.117716 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:07:28 crc kubenswrapper[4778]: I0318 09:07:28.163379 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:07:28 crc kubenswrapper[4778]: I0318 09:07:28.183717 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tbbtb" Mar 18 09:07:28 crc kubenswrapper[4778]: I0318 09:07:28.185544 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qvn4w" Mar 18 09:07:28 crc kubenswrapper[4778]: I0318 09:07:28.202704 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:07:28 crc kubenswrapper[4778]: I0318 09:07:28.394652 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-x5dpv"] Mar 18 09:07:29 crc kubenswrapper[4778]: I0318 09:07:29.157839 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:07:30 crc kubenswrapper[4778]: I0318 09:07:30.147617 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:07:30 crc kubenswrapper[4778]: I0318 09:07:30.147669 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:07:30 crc kubenswrapper[4778]: I0318 09:07:30.147717 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:07:30 crc kubenswrapper[4778]: I0318 09:07:30.148242 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cba57e9ecb9eb5e6f738793a924b86a332ecf3e9aa5748de2ff7d1c6195662bc"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:07:30 crc kubenswrapper[4778]: I0318 09:07:30.148313 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://cba57e9ecb9eb5e6f738793a924b86a332ecf3e9aa5748de2ff7d1c6195662bc" gracePeriod=600 Mar 18 09:07:30 crc kubenswrapper[4778]: I0318 09:07:30.161972 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zwknx"] Mar 18 09:07:30 crc kubenswrapper[4778]: I0318 09:07:30.367600 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lzrtd"] Mar 18 09:07:30 crc kubenswrapper[4778]: I0318 09:07:30.367935 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lzrtd" podUID="3618fc0f-e8b2-4476-a24d-662165a04ecc" containerName="registry-server" containerID="cri-o://afc4a86783668e2ef48e83935f1494b68e42a6641cb083187f72530915bb720f" gracePeriod=2 Mar 18 09:07:31 crc kubenswrapper[4778]: I0318 09:07:31.128293 4778 generic.go:334] 
"Generic (PLEG): container finished" podID="3618fc0f-e8b2-4476-a24d-662165a04ecc" containerID="afc4a86783668e2ef48e83935f1494b68e42a6641cb083187f72530915bb720f" exitCode=0 Mar 18 09:07:31 crc kubenswrapper[4778]: I0318 09:07:31.128388 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzrtd" event={"ID":"3618fc0f-e8b2-4476-a24d-662165a04ecc","Type":"ContainerDied","Data":"afc4a86783668e2ef48e83935f1494b68e42a6641cb083187f72530915bb720f"} Mar 18 09:07:31 crc kubenswrapper[4778]: I0318 09:07:31.131048 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="cba57e9ecb9eb5e6f738793a924b86a332ecf3e9aa5748de2ff7d1c6195662bc" exitCode=0 Mar 18 09:07:31 crc kubenswrapper[4778]: I0318 09:07:31.131421 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zwknx" podUID="23b7607d-fa16-45c1-a0cb-c5ec39a288fb" containerName="registry-server" containerID="cri-o://de8929b794bb5d2a7228965e1493c7e14a2590362f85b344f68eb15fc61bd4bf" gracePeriod=2 Mar 18 09:07:31 crc kubenswrapper[4778]: I0318 09:07:31.131185 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"cba57e9ecb9eb5e6f738793a924b86a332ecf3e9aa5748de2ff7d1c6195662bc"} Mar 18 09:07:31 crc kubenswrapper[4778]: I0318 09:07:31.941315 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.018355 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfhg5\" (UniqueName: \"kubernetes.io/projected/3618fc0f-e8b2-4476-a24d-662165a04ecc-kube-api-access-nfhg5\") pod \"3618fc0f-e8b2-4476-a24d-662165a04ecc\" (UID: \"3618fc0f-e8b2-4476-a24d-662165a04ecc\") " Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.018738 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3618fc0f-e8b2-4476-a24d-662165a04ecc-utilities\") pod \"3618fc0f-e8b2-4476-a24d-662165a04ecc\" (UID: \"3618fc0f-e8b2-4476-a24d-662165a04ecc\") " Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.018821 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3618fc0f-e8b2-4476-a24d-662165a04ecc-catalog-content\") pod \"3618fc0f-e8b2-4476-a24d-662165a04ecc\" (UID: \"3618fc0f-e8b2-4476-a24d-662165a04ecc\") " Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.019692 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3618fc0f-e8b2-4476-a24d-662165a04ecc-utilities" (OuterVolumeSpecName: "utilities") pod "3618fc0f-e8b2-4476-a24d-662165a04ecc" (UID: "3618fc0f-e8b2-4476-a24d-662165a04ecc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.024544 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3618fc0f-e8b2-4476-a24d-662165a04ecc-kube-api-access-nfhg5" (OuterVolumeSpecName: "kube-api-access-nfhg5") pod "3618fc0f-e8b2-4476-a24d-662165a04ecc" (UID: "3618fc0f-e8b2-4476-a24d-662165a04ecc"). InnerVolumeSpecName "kube-api-access-nfhg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.093082 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3618fc0f-e8b2-4476-a24d-662165a04ecc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3618fc0f-e8b2-4476-a24d-662165a04ecc" (UID: "3618fc0f-e8b2-4476-a24d-662165a04ecc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.121054 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfhg5\" (UniqueName: \"kubernetes.io/projected/3618fc0f-e8b2-4476-a24d-662165a04ecc-kube-api-access-nfhg5\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.121098 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3618fc0f-e8b2-4476-a24d-662165a04ecc-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.121112 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3618fc0f-e8b2-4476-a24d-662165a04ecc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.142307 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lzrtd" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.142292 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzrtd" event={"ID":"3618fc0f-e8b2-4476-a24d-662165a04ecc","Type":"ContainerDied","Data":"f49ddb65c41adfb45d65de1198b0f82574583119a6d4929d1ffa55ce9f770dcc"} Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.142473 4778 scope.go:117] "RemoveContainer" containerID="afc4a86783668e2ef48e83935f1494b68e42a6641cb083187f72530915bb720f" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.145128 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"34445970a34bfac9ea81483ad63bd182faceb7cf5f4517903b0d318af28a7e07"} Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.150141 4778 generic.go:334] "Generic (PLEG): container finished" podID="23b7607d-fa16-45c1-a0cb-c5ec39a288fb" containerID="de8929b794bb5d2a7228965e1493c7e14a2590362f85b344f68eb15fc61bd4bf" exitCode=0 Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.150207 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zwknx" event={"ID":"23b7607d-fa16-45c1-a0cb-c5ec39a288fb","Type":"ContainerDied","Data":"de8929b794bb5d2a7228965e1493c7e14a2590362f85b344f68eb15fc61bd4bf"} Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.150229 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zwknx" event={"ID":"23b7607d-fa16-45c1-a0cb-c5ec39a288fb","Type":"ContainerDied","Data":"fd97a190415e4d1219ea6675fa83baba068c5459e6a2c0b39450f9d62ba9f267"} Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.150241 4778 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="fd97a190415e4d1219ea6675fa83baba068c5459e6a2c0b39450f9d62ba9f267" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.163972 4778 scope.go:117] "RemoveContainer" containerID="db7b16ab120c184db5ebadc2caf608fa9242b9a332050072ad2cae2fba3722b7" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.169840 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.207860 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lzrtd"] Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.207904 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lzrtd"] Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.210792 4778 scope.go:117] "RemoveContainer" containerID="33f31c9dc67a1137fd25116299f097d08a1ad1b4a3924bc1eb5ff8d0db0c9727" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.323357 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mff6k\" (UniqueName: \"kubernetes.io/projected/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-kube-api-access-mff6k\") pod \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\" (UID: \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\") " Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.323409 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-catalog-content\") pod \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\" (UID: \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\") " Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.323437 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-utilities\") pod \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\" 
(UID: \"23b7607d-fa16-45c1-a0cb-c5ec39a288fb\") " Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.324524 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-utilities" (OuterVolumeSpecName: "utilities") pod "23b7607d-fa16-45c1-a0cb-c5ec39a288fb" (UID: "23b7607d-fa16-45c1-a0cb-c5ec39a288fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.326251 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-kube-api-access-mff6k" (OuterVolumeSpecName: "kube-api-access-mff6k") pod "23b7607d-fa16-45c1-a0cb-c5ec39a288fb" (UID: "23b7607d-fa16-45c1-a0cb-c5ec39a288fb"). InnerVolumeSpecName "kube-api-access-mff6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.374932 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23b7607d-fa16-45c1-a0cb-c5ec39a288fb" (UID: "23b7607d-fa16-45c1-a0cb-c5ec39a288fb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.425043 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mff6k\" (UniqueName: \"kubernetes.io/projected/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-kube-api-access-mff6k\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.425103 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:32 crc kubenswrapper[4778]: I0318 09:07:32.425122 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23b7607d-fa16-45c1-a0cb-c5ec39a288fb-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:33 crc kubenswrapper[4778]: I0318 09:07:33.164781 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zwknx" Mar 18 09:07:33 crc kubenswrapper[4778]: I0318 09:07:33.209965 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zwknx"] Mar 18 09:07:33 crc kubenswrapper[4778]: I0318 09:07:33.214644 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zwknx"] Mar 18 09:07:34 crc kubenswrapper[4778]: I0318 09:07:34.172963 4778 generic.go:334] "Generic (PLEG): container finished" podID="3aedbf59-d23d-409e-9742-09824ed6ef2a" containerID="dc4667a19e4c6f3862b650883697e1dc99e9e198f85259e0ecc62f74a4fc5fea" exitCode=0 Mar 18 09:07:34 crc kubenswrapper[4778]: I0318 09:07:34.173384 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rscg9" event={"ID":"3aedbf59-d23d-409e-9742-09824ed6ef2a","Type":"ContainerDied","Data":"dc4667a19e4c6f3862b650883697e1dc99e9e198f85259e0ecc62f74a4fc5fea"} Mar 18 
09:07:34 crc kubenswrapper[4778]: I0318 09:07:34.179276 4778 generic.go:334] "Generic (PLEG): container finished" podID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" containerID="f49cf5ea04db3604b7012853be48f57eabfbbf2919ff145d883ab1c07e04a460" exitCode=0 Mar 18 09:07:34 crc kubenswrapper[4778]: I0318 09:07:34.179384 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qgm2" event={"ID":"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47","Type":"ContainerDied","Data":"f49cf5ea04db3604b7012853be48f57eabfbbf2919ff145d883ab1c07e04a460"} Mar 18 09:07:34 crc kubenswrapper[4778]: I0318 09:07:34.183544 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t74gc" event={"ID":"01828fdf-ef1b-44e3-905b-aec0c6aaa44f","Type":"ContainerStarted","Data":"fdba8bf349a2a319ef44195469161c05ecbd9b8ac42cb9bcaedd36d2027ac731"} Mar 18 09:07:34 crc kubenswrapper[4778]: I0318 09:07:34.197566 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23b7607d-fa16-45c1-a0cb-c5ec39a288fb" path="/var/lib/kubelet/pods/23b7607d-fa16-45c1-a0cb-c5ec39a288fb/volumes" Mar 18 09:07:34 crc kubenswrapper[4778]: I0318 09:07:34.199446 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3618fc0f-e8b2-4476-a24d-662165a04ecc" path="/var/lib/kubelet/pods/3618fc0f-e8b2-4476-a24d-662165a04ecc/volumes" Mar 18 09:07:34 crc kubenswrapper[4778]: I0318 09:07:34.200323 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kvnk" event={"ID":"938982a6-57b0-4870-abed-a98c42196ae6","Type":"ContainerStarted","Data":"d3a1ccac938944a3f089f8c92f9aebbf40c8042f08e26dece0839acbb161f095"} Mar 18 09:07:35 crc kubenswrapper[4778]: I0318 09:07:35.217554 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rscg9" 
event={"ID":"3aedbf59-d23d-409e-9742-09824ed6ef2a","Type":"ContainerStarted","Data":"5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be"} Mar 18 09:07:35 crc kubenswrapper[4778]: I0318 09:07:35.220917 4778 generic.go:334] "Generic (PLEG): container finished" podID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" containerID="fdba8bf349a2a319ef44195469161c05ecbd9b8ac42cb9bcaedd36d2027ac731" exitCode=0 Mar 18 09:07:35 crc kubenswrapper[4778]: I0318 09:07:35.221022 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t74gc" event={"ID":"01828fdf-ef1b-44e3-905b-aec0c6aaa44f","Type":"ContainerDied","Data":"fdba8bf349a2a319ef44195469161c05ecbd9b8ac42cb9bcaedd36d2027ac731"} Mar 18 09:07:35 crc kubenswrapper[4778]: I0318 09:07:35.225225 4778 generic.go:334] "Generic (PLEG): container finished" podID="938982a6-57b0-4870-abed-a98c42196ae6" containerID="d3a1ccac938944a3f089f8c92f9aebbf40c8042f08e26dece0839acbb161f095" exitCode=0 Mar 18 09:07:35 crc kubenswrapper[4778]: I0318 09:07:35.225356 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kvnk" event={"ID":"938982a6-57b0-4870-abed-a98c42196ae6","Type":"ContainerDied","Data":"d3a1ccac938944a3f089f8c92f9aebbf40c8042f08e26dece0839acbb161f095"} Mar 18 09:07:35 crc kubenswrapper[4778]: I0318 09:07:35.283843 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rscg9" podStartSLOduration=3.024096878 podStartE2EDuration="56.28381566s" podCreationTimestamp="2026-03-18 09:06:39 +0000 UTC" firstStartedPulling="2026-03-18 09:06:41.358542745 +0000 UTC m=+267.933287585" lastFinishedPulling="2026-03-18 09:07:34.618261527 +0000 UTC m=+321.193006367" observedRunningTime="2026-03-18 09:07:35.251520034 +0000 UTC m=+321.826264894" watchObservedRunningTime="2026-03-18 09:07:35.28381566 +0000 UTC m=+321.858560540" Mar 18 09:07:35 crc kubenswrapper[4778]: I0318 09:07:35.681288 4778 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-785666b9f5-xt96c"] Mar 18 09:07:35 crc kubenswrapper[4778]: I0318 09:07:35.682020 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" podUID="2cd169c1-e595-4497-856c-3dd27c1cf551" containerName="controller-manager" containerID="cri-o://218eebc0b353408962cb4ccc7a6dbcb7e94147122bce8e36e5a54e046e5b06f8" gracePeriod=30 Mar 18 09:07:35 crc kubenswrapper[4778]: I0318 09:07:35.718215 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24"] Mar 18 09:07:35 crc kubenswrapper[4778]: I0318 09:07:35.718453 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" podUID="180e4f84-52ed-4db3-aec6-c724becfadf1" containerName="route-controller-manager" containerID="cri-o://6182c0f78e6f784c954f3d716e1ee189ef539115eb9b295b92f5ba494516c3c6" gracePeriod=30 Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.233650 4778 generic.go:334] "Generic (PLEG): container finished" podID="180e4f84-52ed-4db3-aec6-c724becfadf1" containerID="6182c0f78e6f784c954f3d716e1ee189ef539115eb9b295b92f5ba494516c3c6" exitCode=0 Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.233734 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" event={"ID":"180e4f84-52ed-4db3-aec6-c724becfadf1","Type":"ContainerDied","Data":"6182c0f78e6f784c954f3d716e1ee189ef539115eb9b295b92f5ba494516c3c6"} Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.235735 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" 
event={"ID":"180e4f84-52ed-4db3-aec6-c724becfadf1","Type":"ContainerDied","Data":"9451f090d87a54fcdbbfb38d7aba157bfa9ee730f5dff506b43c409b10178b80"} Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.235823 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9451f090d87a54fcdbbfb38d7aba157bfa9ee730f5dff506b43c409b10178b80" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.237352 4778 generic.go:334] "Generic (PLEG): container finished" podID="2cd169c1-e595-4497-856c-3dd27c1cf551" containerID="218eebc0b353408962cb4ccc7a6dbcb7e94147122bce8e36e5a54e046e5b06f8" exitCode=0 Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.237429 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" event={"ID":"2cd169c1-e595-4497-856c-3dd27c1cf551","Type":"ContainerDied","Data":"218eebc0b353408962cb4ccc7a6dbcb7e94147122bce8e36e5a54e046e5b06f8"} Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.239248 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qgm2" event={"ID":"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47","Type":"ContainerStarted","Data":"35627094815d8aab35b2d89ade00f4db12d2982f44d18065d3e2b7a8cff620a7"} Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.240882 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kvnk" event={"ID":"938982a6-57b0-4870-abed-a98c42196ae6","Type":"ContainerStarted","Data":"079c4231aac10f3612c674a07d9729d7a0fd2d54be2669f250d2d9ca937b5439"} Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.263793 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6qgm2" podStartSLOduration=2.338288062 podStartE2EDuration="57.263773043s" podCreationTimestamp="2026-03-18 09:06:39 +0000 UTC" firstStartedPulling="2026-03-18 09:06:40.190029079 +0000 UTC m=+266.764773919" 
lastFinishedPulling="2026-03-18 09:07:35.11551405 +0000 UTC m=+321.690258900" observedRunningTime="2026-03-18 09:07:36.259746038 +0000 UTC m=+322.834490878" watchObservedRunningTime="2026-03-18 09:07:36.263773043 +0000 UTC m=+322.838517883" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.269292 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.286502 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6kvnk" podStartSLOduration=3.048662096 podStartE2EDuration="56.286465426s" podCreationTimestamp="2026-03-18 09:06:40 +0000 UTC" firstStartedPulling="2026-03-18 09:06:42.488354656 +0000 UTC m=+269.063099496" lastFinishedPulling="2026-03-18 09:07:35.726157986 +0000 UTC m=+322.300902826" observedRunningTime="2026-03-18 09:07:36.284417478 +0000 UTC m=+322.859162318" watchObservedRunningTime="2026-03-18 09:07:36.286465426 +0000 UTC m=+322.861210266" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.413102 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/180e4f84-52ed-4db3-aec6-c724becfadf1-serving-cert\") pod \"180e4f84-52ed-4db3-aec6-c724becfadf1\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.413185 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/180e4f84-52ed-4db3-aec6-c724becfadf1-client-ca\") pod \"180e4f84-52ed-4db3-aec6-c724becfadf1\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.413358 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8mw9\" (UniqueName: 
\"kubernetes.io/projected/180e4f84-52ed-4db3-aec6-c724becfadf1-kube-api-access-c8mw9\") pod \"180e4f84-52ed-4db3-aec6-c724becfadf1\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.413393 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/180e4f84-52ed-4db3-aec6-c724becfadf1-config\") pod \"180e4f84-52ed-4db3-aec6-c724becfadf1\" (UID: \"180e4f84-52ed-4db3-aec6-c724becfadf1\") " Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.414334 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/180e4f84-52ed-4db3-aec6-c724becfadf1-client-ca" (OuterVolumeSpecName: "client-ca") pod "180e4f84-52ed-4db3-aec6-c724becfadf1" (UID: "180e4f84-52ed-4db3-aec6-c724becfadf1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.414374 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/180e4f84-52ed-4db3-aec6-c724becfadf1-config" (OuterVolumeSpecName: "config") pod "180e4f84-52ed-4db3-aec6-c724becfadf1" (UID: "180e4f84-52ed-4db3-aec6-c724becfadf1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.414699 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/180e4f84-52ed-4db3-aec6-c724becfadf1-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.414746 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/180e4f84-52ed-4db3-aec6-c724becfadf1-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.420662 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180e4f84-52ed-4db3-aec6-c724becfadf1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "180e4f84-52ed-4db3-aec6-c724becfadf1" (UID: "180e4f84-52ed-4db3-aec6-c724becfadf1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.420847 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/180e4f84-52ed-4db3-aec6-c724becfadf1-kube-api-access-c8mw9" (OuterVolumeSpecName: "kube-api-access-c8mw9") pod "180e4f84-52ed-4db3-aec6-c724becfadf1" (UID: "180e4f84-52ed-4db3-aec6-c724becfadf1"). InnerVolumeSpecName "kube-api-access-c8mw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.423866 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.515949 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/180e4f84-52ed-4db3-aec6-c724becfadf1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.515993 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8mw9\" (UniqueName: \"kubernetes.io/projected/180e4f84-52ed-4db3-aec6-c724becfadf1-kube-api-access-c8mw9\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.616730 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txfl6\" (UniqueName: \"kubernetes.io/projected/2cd169c1-e595-4497-856c-3dd27c1cf551-kube-api-access-txfl6\") pod \"2cd169c1-e595-4497-856c-3dd27c1cf551\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.616826 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-client-ca\") pod \"2cd169c1-e595-4497-856c-3dd27c1cf551\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.616859 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-proxy-ca-bundles\") pod \"2cd169c1-e595-4497-856c-3dd27c1cf551\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.616896 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cd169c1-e595-4497-856c-3dd27c1cf551-serving-cert\") pod 
\"2cd169c1-e595-4497-856c-3dd27c1cf551\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.617010 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-config\") pod \"2cd169c1-e595-4497-856c-3dd27c1cf551\" (UID: \"2cd169c1-e595-4497-856c-3dd27c1cf551\") " Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.618141 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-client-ca" (OuterVolumeSpecName: "client-ca") pod "2cd169c1-e595-4497-856c-3dd27c1cf551" (UID: "2cd169c1-e595-4497-856c-3dd27c1cf551"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.618159 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2cd169c1-e595-4497-856c-3dd27c1cf551" (UID: "2cd169c1-e595-4497-856c-3dd27c1cf551"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.618236 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-config" (OuterVolumeSpecName: "config") pod "2cd169c1-e595-4497-856c-3dd27c1cf551" (UID: "2cd169c1-e595-4497-856c-3dd27c1cf551"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.624446 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd169c1-e595-4497-856c-3dd27c1cf551-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2cd169c1-e595-4497-856c-3dd27c1cf551" (UID: "2cd169c1-e595-4497-856c-3dd27c1cf551"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.624584 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd169c1-e595-4497-856c-3dd27c1cf551-kube-api-access-txfl6" (OuterVolumeSpecName: "kube-api-access-txfl6") pod "2cd169c1-e595-4497-856c-3dd27c1cf551" (UID: "2cd169c1-e595-4497-856c-3dd27c1cf551"). InnerVolumeSpecName "kube-api-access-txfl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.718447 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.718495 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txfl6\" (UniqueName: \"kubernetes.io/projected/2cd169c1-e595-4497-856c-3dd27c1cf551-kube-api-access-txfl6\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.718513 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.718530 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2cd169c1-e595-4497-856c-3dd27c1cf551-proxy-ca-bundles\") on node \"crc\" 
DevicePath \"\"" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.718546 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cd169c1-e595-4497-856c-3dd27c1cf551-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.821246 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-66b77f46fb-47b5r"] Mar 18 09:07:36 crc kubenswrapper[4778]: E0318 09:07:36.821734 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd169c1-e595-4497-856c-3dd27c1cf551" containerName="controller-manager" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.821814 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd169c1-e595-4497-856c-3dd27c1cf551" containerName="controller-manager" Mar 18 09:07:36 crc kubenswrapper[4778]: E0318 09:07:36.821880 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b7607d-fa16-45c1-a0cb-c5ec39a288fb" containerName="registry-server" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.821937 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b7607d-fa16-45c1-a0cb-c5ec39a288fb" containerName="registry-server" Mar 18 09:07:36 crc kubenswrapper[4778]: E0318 09:07:36.821992 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b7607d-fa16-45c1-a0cb-c5ec39a288fb" containerName="extract-utilities" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.822047 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b7607d-fa16-45c1-a0cb-c5ec39a288fb" containerName="extract-utilities" Mar 18 09:07:36 crc kubenswrapper[4778]: E0318 09:07:36.822099 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="180e4f84-52ed-4db3-aec6-c724becfadf1" containerName="route-controller-manager" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.822148 4778 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="180e4f84-52ed-4db3-aec6-c724becfadf1" containerName="route-controller-manager" Mar 18 09:07:36 crc kubenswrapper[4778]: E0318 09:07:36.822221 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54961f10-93b0-433f-8a7d-b30d69178e9a" containerName="oc" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.822275 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="54961f10-93b0-433f-8a7d-b30d69178e9a" containerName="oc" Mar 18 09:07:36 crc kubenswrapper[4778]: E0318 09:07:36.822343 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3618fc0f-e8b2-4476-a24d-662165a04ecc" containerName="extract-content" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.822398 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3618fc0f-e8b2-4476-a24d-662165a04ecc" containerName="extract-content" Mar 18 09:07:36 crc kubenswrapper[4778]: E0318 09:07:36.822458 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3618fc0f-e8b2-4476-a24d-662165a04ecc" containerName="extract-utilities" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.822514 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3618fc0f-e8b2-4476-a24d-662165a04ecc" containerName="extract-utilities" Mar 18 09:07:36 crc kubenswrapper[4778]: E0318 09:07:36.822647 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3618fc0f-e8b2-4476-a24d-662165a04ecc" containerName="registry-server" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.822707 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3618fc0f-e8b2-4476-a24d-662165a04ecc" containerName="registry-server" Mar 18 09:07:36 crc kubenswrapper[4778]: E0318 09:07:36.822769 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3be356e-94af-47db-a182-dd8a57024619" containerName="oc" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.822826 4778 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c3be356e-94af-47db-a182-dd8a57024619" containerName="oc" Mar 18 09:07:36 crc kubenswrapper[4778]: E0318 09:07:36.822886 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b7607d-fa16-45c1-a0cb-c5ec39a288fb" containerName="extract-content" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.822943 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b7607d-fa16-45c1-a0cb-c5ec39a288fb" containerName="extract-content" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.823097 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd169c1-e595-4497-856c-3dd27c1cf551" containerName="controller-manager" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.823165 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3618fc0f-e8b2-4476-a24d-662165a04ecc" containerName="registry-server" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.823268 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="23b7607d-fa16-45c1-a0cb-c5ec39a288fb" containerName="registry-server" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.823324 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="54961f10-93b0-433f-8a7d-b30d69178e9a" containerName="oc" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.823384 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3be356e-94af-47db-a182-dd8a57024619" containerName="oc" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.823439 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="180e4f84-52ed-4db3-aec6-c724becfadf1" containerName="route-controller-manager" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.823917 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.828785 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w"] Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.837996 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.850428 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66b77f46fb-47b5r"] Mar 18 09:07:36 crc kubenswrapper[4778]: I0318 09:07:36.862027 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w"] Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.022910 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/532bf41b-51fb-4815-ab26-8fb2d12526d2-serving-cert\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.023422 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26k95\" (UniqueName: \"kubernetes.io/projected/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-kube-api-access-26k95\") pod \"route-controller-manager-69f98775dd-rk74w\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.023667 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-config\") pod \"route-controller-manager-69f98775dd-rk74w\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.023795 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-client-ca\") pod \"route-controller-manager-69f98775dd-rk74w\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.023932 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-serving-cert\") pod \"route-controller-manager-69f98775dd-rk74w\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.024048 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsn88\" (UniqueName: \"kubernetes.io/projected/532bf41b-51fb-4815-ab26-8fb2d12526d2-kube-api-access-xsn88\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.024254 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-config\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " 
pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.024313 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-proxy-ca-bundles\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.024346 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-client-ca\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.125750 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-config\") pod \"route-controller-manager-69f98775dd-rk74w\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.125823 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-client-ca\") pod \"route-controller-manager-69f98775dd-rk74w\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.125854 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-serving-cert\") pod \"route-controller-manager-69f98775dd-rk74w\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.125878 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsn88\" (UniqueName: \"kubernetes.io/projected/532bf41b-51fb-4815-ab26-8fb2d12526d2-kube-api-access-xsn88\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.125905 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-config\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.125922 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-proxy-ca-bundles\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.125939 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-client-ca\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 
09:07:37.125983 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/532bf41b-51fb-4815-ab26-8fb2d12526d2-serving-cert\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.126001 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26k95\" (UniqueName: \"kubernetes.io/projected/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-kube-api-access-26k95\") pod \"route-controller-manager-69f98775dd-rk74w\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.127019 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-client-ca\") pod \"route-controller-manager-69f98775dd-rk74w\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.127069 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-client-ca\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.127253 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-proxy-ca-bundles\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " 
pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.127599 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-config\") pod \"route-controller-manager-69f98775dd-rk74w\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.128502 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-config\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.129941 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/532bf41b-51fb-4815-ab26-8fb2d12526d2-serving-cert\") pod \"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.130631 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-serving-cert\") pod \"route-controller-manager-69f98775dd-rk74w\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.148329 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsn88\" (UniqueName: \"kubernetes.io/projected/532bf41b-51fb-4815-ab26-8fb2d12526d2-kube-api-access-xsn88\") pod 
\"controller-manager-66b77f46fb-47b5r\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.150673 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26k95\" (UniqueName: \"kubernetes.io/projected/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-kube-api-access-26k95\") pod \"route-controller-manager-69f98775dd-rk74w\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.167512 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.175994 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.249038 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t74gc" event={"ID":"01828fdf-ef1b-44e3-905b-aec0c6aaa44f","Type":"ContainerStarted","Data":"5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568"} Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.252971 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.253418 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.253411 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-785666b9f5-xt96c" event={"ID":"2cd169c1-e595-4497-856c-3dd27c1cf551","Type":"ContainerDied","Data":"fd1c635644f48a9fb2e1288c201f66a5f81d58f274081aff739ae76932d747eb"} Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.253633 4778 scope.go:117] "RemoveContainer" containerID="218eebc0b353408962cb4ccc7a6dbcb7e94147122bce8e36e5a54e046e5b06f8" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.290946 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t74gc" podStartSLOduration=3.641752263 podStartE2EDuration="57.290911592s" podCreationTimestamp="2026-03-18 09:06:40 +0000 UTC" firstStartedPulling="2026-03-18 09:06:42.441171998 +0000 UTC m=+269.015916838" lastFinishedPulling="2026-03-18 09:07:36.090331327 +0000 UTC m=+322.665076167" observedRunningTime="2026-03-18 09:07:37.288664148 +0000 UTC m=+323.863408998" watchObservedRunningTime="2026-03-18 09:07:37.290911592 +0000 UTC m=+323.865656432" Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.369351 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24"] Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.381301 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65bcdc9c9c-kwh24"] Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.392224 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-785666b9f5-xt96c"] Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.396567 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-785666b9f5-xt96c"] Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.680024 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w"] Mar 18 09:07:37 crc kubenswrapper[4778]: W0318 09:07:37.691675 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aae8a16_f704_4764_bfd7_7a0cfed2eee3.slice/crio-a7918f18849ba2e79d4ef72f4ad9306fd3cd739ca9880924f05fcd02b77c8d3d WatchSource:0}: Error finding container a7918f18849ba2e79d4ef72f4ad9306fd3cd739ca9880924f05fcd02b77c8d3d: Status 404 returned error can't find the container with id a7918f18849ba2e79d4ef72f4ad9306fd3cd739ca9880924f05fcd02b77c8d3d Mar 18 09:07:37 crc kubenswrapper[4778]: I0318 09:07:37.749594 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66b77f46fb-47b5r"] Mar 18 09:07:38 crc kubenswrapper[4778]: I0318 09:07:38.195383 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="180e4f84-52ed-4db3-aec6-c724becfadf1" path="/var/lib/kubelet/pods/180e4f84-52ed-4db3-aec6-c724becfadf1/volumes" Mar 18 09:07:38 crc kubenswrapper[4778]: I0318 09:07:38.196183 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cd169c1-e595-4497-856c-3dd27c1cf551" path="/var/lib/kubelet/pods/2cd169c1-e595-4497-856c-3dd27c1cf551/volumes" Mar 18 09:07:38 crc kubenswrapper[4778]: I0318 09:07:38.259044 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" event={"ID":"532bf41b-51fb-4815-ab26-8fb2d12526d2","Type":"ContainerStarted","Data":"8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064"} Mar 18 09:07:38 crc kubenswrapper[4778]: I0318 09:07:38.259105 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" event={"ID":"532bf41b-51fb-4815-ab26-8fb2d12526d2","Type":"ContainerStarted","Data":"2f81410394a4a7c6f801aac32f2e683ca0e93e74a4322b2f6a48fc440a4c2e61"} Mar 18 09:07:38 crc kubenswrapper[4778]: I0318 09:07:38.259262 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:38 crc kubenswrapper[4778]: I0318 09:07:38.261766 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" event={"ID":"4aae8a16-f704-4764-bfd7-7a0cfed2eee3","Type":"ContainerStarted","Data":"505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9"} Mar 18 09:07:38 crc kubenswrapper[4778]: I0318 09:07:38.261804 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" event={"ID":"4aae8a16-f704-4764-bfd7-7a0cfed2eee3","Type":"ContainerStarted","Data":"a7918f18849ba2e79d4ef72f4ad9306fd3cd739ca9880924f05fcd02b77c8d3d"} Mar 18 09:07:38 crc kubenswrapper[4778]: I0318 09:07:38.265436 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:38 crc kubenswrapper[4778]: I0318 09:07:38.276274 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" podStartSLOduration=3.276256608 podStartE2EDuration="3.276256608s" podCreationTimestamp="2026-03-18 09:07:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:07:38.276250988 +0000 UTC m=+324.850995838" watchObservedRunningTime="2026-03-18 09:07:38.276256608 +0000 UTC m=+324.851001448" Mar 18 09:07:38 crc kubenswrapper[4778]: I0318 09:07:38.314589 
4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" podStartSLOduration=3.314568244 podStartE2EDuration="3.314568244s" podCreationTimestamp="2026-03-18 09:07:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:07:38.312168036 +0000 UTC m=+324.886912886" watchObservedRunningTime="2026-03-18 09:07:38.314568244 +0000 UTC m=+324.889313084" Mar 18 09:07:39 crc kubenswrapper[4778]: I0318 09:07:39.268634 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:39 crc kubenswrapper[4778]: I0318 09:07:39.274944 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:39 crc kubenswrapper[4778]: I0318 09:07:39.402891 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:07:39 crc kubenswrapper[4778]: I0318 09:07:39.402959 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:07:39 crc kubenswrapper[4778]: I0318 09:07:39.445428 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6qgm2" Mar 18 09:07:39 crc kubenswrapper[4778]: I0318 09:07:39.820379 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:07:39 crc kubenswrapper[4778]: I0318 09:07:39.820781 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rscg9" Mar 18 09:07:39 crc kubenswrapper[4778]: I0318 09:07:39.872903 4778 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rscg9"
Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.283809 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.284078 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.284123 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.284516 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.286612 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.286784 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.287366 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.298474 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.298901 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.311735 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.314757 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.316267 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.320437 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6qgm2"
Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.320859 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.320895 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rscg9"
Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.334663 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.344972 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.483079 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6kvnk"
Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.483123 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6kvnk"
Mar 18 09:07:40 crc kubenswrapper[4778]: W0318 09:07:40.608032 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-2a7beaa4634563e06d6bda6369bf6d700ba94f2f182d3fa40d2c50908dc98c55 WatchSource:0}: Error finding container 2a7beaa4634563e06d6bda6369bf6d700ba94f2f182d3fa40d2c50908dc98c55: Status 404 returned error can't find the container with id 2a7beaa4634563e06d6bda6369bf6d700ba94f2f182d3fa40d2c50908dc98c55
Mar 18 09:07:40 crc kubenswrapper[4778]: W0318 09:07:40.850866 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-8c22cc8478d5b3d363a57fe9254a569605c448ef5cd0effed66a11a35b944d72 WatchSource:0}: Error finding container 8c22cc8478d5b3d363a57fe9254a569605c448ef5cd0effed66a11a35b944d72: Status 404 returned error can't find the container with id 8c22cc8478d5b3d363a57fe9254a569605c448ef5cd0effed66a11a35b944d72
Mar 18 09:07:40 crc kubenswrapper[4778]: W0318 09:07:40.873804 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-14d592f01bdff8b8724bfdc1105b2472de2fc9986350a53a853426fe619c6b66 WatchSource:0}: Error finding container 14d592f01bdff8b8724bfdc1105b2472de2fc9986350a53a853426fe619c6b66: Status 404 returned error can't find the container with id 14d592f01bdff8b8724bfdc1105b2472de2fc9986350a53a853426fe619c6b66
Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.974901 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t74gc"
Mar 18 09:07:40 crc kubenswrapper[4778]: I0318 09:07:40.974957 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t74gc"
Mar 18 09:07:41 crc kubenswrapper[4778]: I0318 09:07:41.282080 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4fb785520bd9b67b63e2803cd0bbc614c5ddc34e3b3cb7ad83141c2fe7b8fa68"}
Mar 18 09:07:41 crc kubenswrapper[4778]: I0318 09:07:41.282137 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"14d592f01bdff8b8724bfdc1105b2472de2fc9986350a53a853426fe619c6b66"}
Mar 18 09:07:41 crc kubenswrapper[4778]: I0318 09:07:41.283500 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"788b786676596d76f803e3247912acf20e74bd978280cc6d03e54295404de15b"}
Mar 18 09:07:41 crc kubenswrapper[4778]: I0318 09:07:41.283528 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8c22cc8478d5b3d363a57fe9254a569605c448ef5cd0effed66a11a35b944d72"}
Mar 18 09:07:41 crc kubenswrapper[4778]: I0318 09:07:41.283711 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 09:07:41 crc kubenswrapper[4778]: I0318 09:07:41.284856 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2a6479e7a6f9ce91fa00c203b7c22ba9cca43f7b7469d035d135fd1874b5d511"}
Mar 18 09:07:41 crc kubenswrapper[4778]: I0318 09:07:41.284885 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2a7beaa4634563e06d6bda6369bf6d700ba94f2f182d3fa40d2c50908dc98c55"}
Mar 18 09:07:41 crc kubenswrapper[4778]: I0318 09:07:41.552030 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6kvnk" podUID="938982a6-57b0-4870-abed-a98c42196ae6" containerName="registry-server" probeResult="failure" output=<
Mar 18 09:07:41 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s
Mar 18 09:07:41 crc kubenswrapper[4778]: >
Mar 18 09:07:42 crc kubenswrapper[4778]: I0318 09:07:42.019047 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t74gc" podUID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" containerName="registry-server" probeResult="failure" output=<
Mar 18 09:07:42 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s
Mar 18 09:07:42 crc kubenswrapper[4778]: >
Mar 18 09:07:42 crc kubenswrapper[4778]: I0318 09:07:42.962135 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rscg9"]
Mar 18 09:07:43 crc kubenswrapper[4778]: I0318 09:07:43.296004 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rscg9" podUID="3aedbf59-d23d-409e-9742-09824ed6ef2a" containerName="registry-server" containerID="cri-o://5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be" gracePeriod=2
Mar 18 09:07:43 crc kubenswrapper[4778]: I0318 09:07:43.761274 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rscg9"
Mar 18 09:07:43 crc kubenswrapper[4778]: I0318 09:07:43.938536 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnfhp\" (UniqueName: \"kubernetes.io/projected/3aedbf59-d23d-409e-9742-09824ed6ef2a-kube-api-access-gnfhp\") pod \"3aedbf59-d23d-409e-9742-09824ed6ef2a\" (UID: \"3aedbf59-d23d-409e-9742-09824ed6ef2a\") "
Mar 18 09:07:43 crc kubenswrapper[4778]: I0318 09:07:43.938632 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aedbf59-d23d-409e-9742-09824ed6ef2a-utilities\") pod \"3aedbf59-d23d-409e-9742-09824ed6ef2a\" (UID: \"3aedbf59-d23d-409e-9742-09824ed6ef2a\") "
Mar 18 09:07:43 crc kubenswrapper[4778]: I0318 09:07:43.938730 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aedbf59-d23d-409e-9742-09824ed6ef2a-catalog-content\") pod \"3aedbf59-d23d-409e-9742-09824ed6ef2a\" (UID: \"3aedbf59-d23d-409e-9742-09824ed6ef2a\") "
Mar 18 09:07:43 crc kubenswrapper[4778]: I0318 09:07:43.939738 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aedbf59-d23d-409e-9742-09824ed6ef2a-utilities" (OuterVolumeSpecName: "utilities") pod "3aedbf59-d23d-409e-9742-09824ed6ef2a" (UID: "3aedbf59-d23d-409e-9742-09824ed6ef2a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:07:43 crc kubenswrapper[4778]: I0318 09:07:43.946510 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aedbf59-d23d-409e-9742-09824ed6ef2a-kube-api-access-gnfhp" (OuterVolumeSpecName: "kube-api-access-gnfhp") pod "3aedbf59-d23d-409e-9742-09824ed6ef2a" (UID: "3aedbf59-d23d-409e-9742-09824ed6ef2a"). InnerVolumeSpecName "kube-api-access-gnfhp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:07:43 crc kubenswrapper[4778]: I0318 09:07:43.954086 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aedbf59-d23d-409e-9742-09824ed6ef2a-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 09:07:43 crc kubenswrapper[4778]: I0318 09:07:43.954138 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnfhp\" (UniqueName: \"kubernetes.io/projected/3aedbf59-d23d-409e-9742-09824ed6ef2a-kube-api-access-gnfhp\") on node \"crc\" DevicePath \"\""
Mar 18 09:07:43 crc kubenswrapper[4778]: I0318 09:07:43.997998 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aedbf59-d23d-409e-9742-09824ed6ef2a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3aedbf59-d23d-409e-9742-09824ed6ef2a" (UID: "3aedbf59-d23d-409e-9742-09824ed6ef2a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.054843 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aedbf59-d23d-409e-9742-09824ed6ef2a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.308146 4778 generic.go:334] "Generic (PLEG): container finished" podID="3aedbf59-d23d-409e-9742-09824ed6ef2a" containerID="5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be" exitCode=0
Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.308223 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rscg9" event={"ID":"3aedbf59-d23d-409e-9742-09824ed6ef2a","Type":"ContainerDied","Data":"5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be"}
Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.308258 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rscg9" event={"ID":"3aedbf59-d23d-409e-9742-09824ed6ef2a","Type":"ContainerDied","Data":"535b754da1c090611910ad3734bf58ec9a371b36592f999639d434308dca9ac6"}
Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.308279 4778 scope.go:117] "RemoveContainer" containerID="5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be"
Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.308347 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rscg9"
Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.338006 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rscg9"]
Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.343511 4778 scope.go:117] "RemoveContainer" containerID="dc4667a19e4c6f3862b650883697e1dc99e9e198f85259e0ecc62f74a4fc5fea"
Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.347431 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rscg9"]
Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.364063 4778 scope.go:117] "RemoveContainer" containerID="1220cfd301077b03f3786adecfe3bf904ed326e9e0abd8d5eb040fe67007ec58"
Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.397066 4778 scope.go:117] "RemoveContainer" containerID="5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be"
Mar 18 09:07:44 crc kubenswrapper[4778]: E0318 09:07:44.397963 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be\": container with ID starting with 5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be not found: ID does not exist" containerID="5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be"
Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.398319 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be"} err="failed to get container status \"5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be\": rpc error: code = NotFound desc = could not find container \"5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be\": container with ID starting with 5c63c49c60643b6494f00d61d3ef4ccb776fd2d6203e2a28b808f6ef108c94be not found: ID does not exist"
Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.398345 4778 scope.go:117] "RemoveContainer" containerID="dc4667a19e4c6f3862b650883697e1dc99e9e198f85259e0ecc62f74a4fc5fea"
Mar 18 09:07:44 crc kubenswrapper[4778]: E0318 09:07:44.398742 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc4667a19e4c6f3862b650883697e1dc99e9e198f85259e0ecc62f74a4fc5fea\": container with ID starting with dc4667a19e4c6f3862b650883697e1dc99e9e198f85259e0ecc62f74a4fc5fea not found: ID does not exist" containerID="dc4667a19e4c6f3862b650883697e1dc99e9e198f85259e0ecc62f74a4fc5fea"
Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.398794 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc4667a19e4c6f3862b650883697e1dc99e9e198f85259e0ecc62f74a4fc5fea"} err="failed to get container status \"dc4667a19e4c6f3862b650883697e1dc99e9e198f85259e0ecc62f74a4fc5fea\": rpc error: code = NotFound desc = could not find container \"dc4667a19e4c6f3862b650883697e1dc99e9e198f85259e0ecc62f74a4fc5fea\": container with ID starting with dc4667a19e4c6f3862b650883697e1dc99e9e198f85259e0ecc62f74a4fc5fea not found: ID does not exist"
Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.398826 4778 scope.go:117] "RemoveContainer" containerID="1220cfd301077b03f3786adecfe3bf904ed326e9e0abd8d5eb040fe67007ec58"
Mar 18 09:07:44 crc kubenswrapper[4778]: E0318 09:07:44.399189 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1220cfd301077b03f3786adecfe3bf904ed326e9e0abd8d5eb040fe67007ec58\": container with ID starting with 1220cfd301077b03f3786adecfe3bf904ed326e9e0abd8d5eb040fe67007ec58 not found: ID does not exist" containerID="1220cfd301077b03f3786adecfe3bf904ed326e9e0abd8d5eb040fe67007ec58"
Mar 18 09:07:44 crc kubenswrapper[4778]: I0318 09:07:44.399287 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1220cfd301077b03f3786adecfe3bf904ed326e9e0abd8d5eb040fe67007ec58"} err="failed to get container status \"1220cfd301077b03f3786adecfe3bf904ed326e9e0abd8d5eb040fe67007ec58\": rpc error: code = NotFound desc = could not find container \"1220cfd301077b03f3786adecfe3bf904ed326e9e0abd8d5eb040fe67007ec58\": container with ID starting with 1220cfd301077b03f3786adecfe3bf904ed326e9e0abd8d5eb040fe67007ec58 not found: ID does not exist"
Mar 18 09:07:46 crc kubenswrapper[4778]: I0318 09:07:46.199475 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aedbf59-d23d-409e-9742-09824ed6ef2a" path="/var/lib/kubelet/pods/3aedbf59-d23d-409e-9742-09824ed6ef2a/volumes"
Mar 18 09:07:50 crc kubenswrapper[4778]: I0318 09:07:50.529241 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6kvnk"
Mar 18 09:07:50 crc kubenswrapper[4778]: I0318 09:07:50.585529 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6kvnk"
Mar 18 09:07:51 crc kubenswrapper[4778]: I0318 09:07:51.046622 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t74gc"
Mar 18 09:07:51 crc kubenswrapper[4778]: I0318 09:07:51.095830 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t74gc"
Mar 18 09:07:51 crc kubenswrapper[4778]: I0318 09:07:51.776989 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t74gc"]
Mar 18 09:07:52 crc kubenswrapper[4778]: I0318 09:07:52.366856 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t74gc" podUID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" containerName="registry-server" containerID="cri-o://5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568" gracePeriod=2
Mar 18 09:07:52 crc kubenswrapper[4778]: I0318 09:07:52.902158 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t74gc"
Mar 18 09:07:52 crc kubenswrapper[4778]: I0318 09:07:52.999174 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-catalog-content\") pod \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\" (UID: \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\") "
Mar 18 09:07:52 crc kubenswrapper[4778]: I0318 09:07:52.999404 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlv42\" (UniqueName: \"kubernetes.io/projected/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-kube-api-access-dlv42\") pod \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\" (UID: \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\") "
Mar 18 09:07:52 crc kubenswrapper[4778]: I0318 09:07:52.999538 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-utilities\") pod \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\" (UID: \"01828fdf-ef1b-44e3-905b-aec0c6aaa44f\") "
Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.000541 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-utilities" (OuterVolumeSpecName: "utilities") pod "01828fdf-ef1b-44e3-905b-aec0c6aaa44f" (UID: "01828fdf-ef1b-44e3-905b-aec0c6aaa44f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.009003 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-kube-api-access-dlv42" (OuterVolumeSpecName: "kube-api-access-dlv42") pod "01828fdf-ef1b-44e3-905b-aec0c6aaa44f" (UID: "01828fdf-ef1b-44e3-905b-aec0c6aaa44f"). InnerVolumeSpecName "kube-api-access-dlv42". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.101415 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlv42\" (UniqueName: \"kubernetes.io/projected/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-kube-api-access-dlv42\") on node \"crc\" DevicePath \"\""
Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.101458 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.175405 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01828fdf-ef1b-44e3-905b-aec0c6aaa44f" (UID: "01828fdf-ef1b-44e3-905b-aec0c6aaa44f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.203164 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01828fdf-ef1b-44e3-905b-aec0c6aaa44f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.383958 4778 generic.go:334] "Generic (PLEG): container finished" podID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" containerID="5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568" exitCode=0
Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.384051 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t74gc" event={"ID":"01828fdf-ef1b-44e3-905b-aec0c6aaa44f","Type":"ContainerDied","Data":"5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568"}
Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.384115 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t74gc" event={"ID":"01828fdf-ef1b-44e3-905b-aec0c6aaa44f","Type":"ContainerDied","Data":"3806b4b94100c7a5155593ac6313aa0ecd333d565872aab11d373225ceac8468"}
Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.384162 4778 scope.go:117] "RemoveContainer" containerID="5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568"
Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.384165 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t74gc"
Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.421272 4778 scope.go:117] "RemoveContainer" containerID="fdba8bf349a2a319ef44195469161c05ecbd9b8ac42cb9bcaedd36d2027ac731"
Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.431064 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" podUID="db81860d-bcb7-4a56-a935-544dbc4be29b" containerName="oauth-openshift" containerID="cri-o://fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8" gracePeriod=15
Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.445121 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t74gc"]
Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.459782 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t74gc"]
Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.468690 4778 scope.go:117] "RemoveContainer" containerID="041c9f6647a0a4b33225ec4f168858c1a6b398621642bc3399fdfc7c98ca3118"
Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.497056 4778 scope.go:117] "RemoveContainer" containerID="5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568"
Mar 18 09:07:53 crc kubenswrapper[4778]: E0318 09:07:53.498289 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568\": container with ID starting with 5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568 not found: ID does not exist" containerID="5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568"
Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.498362 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568"} err="failed to get container status \"5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568\": rpc error: code = NotFound desc = could not find container \"5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568\": container with ID starting with 5541236d7fcac9ef11952b1a34e65d52d1d872cf16bc39fa16a1d79e9c524568 not found: ID does not exist"
Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.498410 4778 scope.go:117] "RemoveContainer" containerID="fdba8bf349a2a319ef44195469161c05ecbd9b8ac42cb9bcaedd36d2027ac731"
Mar 18 09:07:53 crc kubenswrapper[4778]: E0318 09:07:53.499000 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdba8bf349a2a319ef44195469161c05ecbd9b8ac42cb9bcaedd36d2027ac731\": container with ID starting with fdba8bf349a2a319ef44195469161c05ecbd9b8ac42cb9bcaedd36d2027ac731 not found: ID does not exist" containerID="fdba8bf349a2a319ef44195469161c05ecbd9b8ac42cb9bcaedd36d2027ac731"
Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.499101 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdba8bf349a2a319ef44195469161c05ecbd9b8ac42cb9bcaedd36d2027ac731"} err="failed to get container status \"fdba8bf349a2a319ef44195469161c05ecbd9b8ac42cb9bcaedd36d2027ac731\": rpc error: code = NotFound desc = could not find container \"fdba8bf349a2a319ef44195469161c05ecbd9b8ac42cb9bcaedd36d2027ac731\": container with ID starting with fdba8bf349a2a319ef44195469161c05ecbd9b8ac42cb9bcaedd36d2027ac731 not found: ID does not exist"
Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.499236 4778 scope.go:117] "RemoveContainer" containerID="041c9f6647a0a4b33225ec4f168858c1a6b398621642bc3399fdfc7c98ca3118"
Mar 18 09:07:53 crc kubenswrapper[4778]: E0318 09:07:53.500766 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"041c9f6647a0a4b33225ec4f168858c1a6b398621642bc3399fdfc7c98ca3118\": container with ID starting with 041c9f6647a0a4b33225ec4f168858c1a6b398621642bc3399fdfc7c98ca3118 not found: ID does not exist" containerID="041c9f6647a0a4b33225ec4f168858c1a6b398621642bc3399fdfc7c98ca3118"
Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.500871 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"041c9f6647a0a4b33225ec4f168858c1a6b398621642bc3399fdfc7c98ca3118"} err="failed to get container status \"041c9f6647a0a4b33225ec4f168858c1a6b398621642bc3399fdfc7c98ca3118\": rpc error: code = NotFound desc = could not find container \"041c9f6647a0a4b33225ec4f168858c1a6b398621642bc3399fdfc7c98ca3118\": container with ID starting with 041c9f6647a0a4b33225ec4f168858c1a6b398621642bc3399fdfc7c98ca3118 not found: ID does not exist"
Mar 18 09:07:53 crc kubenswrapper[4778]: I0318 09:07:53.950748 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv"
Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.015611 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-session\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") "
Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.015697 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-trusted-ca-bundle\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") "
Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.015727 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-idp-0-file-data\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") "
Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.015749 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-provider-selection\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") "
Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.015790 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-ocp-branding-template\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") "
Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.015869 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-audit-policies\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") "
Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.015894 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-serving-cert\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") "
Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.015928 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-service-ca\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") "
Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.015958 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-error\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") "
Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.015985 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/db81860d-bcb7-4a56-a935-544dbc4be29b-audit-dir\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") "
Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.016004 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-router-certs\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") "
Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.016031 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-login\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") "
Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.016059 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxpv5\" (UniqueName: \"kubernetes.io/projected/db81860d-bcb7-4a56-a935-544dbc4be29b-kube-api-access-jxpv5\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") "
Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.016079 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-cliconfig\") pod \"db81860d-bcb7-4a56-a935-544dbc4be29b\" (UID: \"db81860d-bcb7-4a56-a935-544dbc4be29b\") "
Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.017172 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.017331 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.018766 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.018790 4778 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.020138 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.020233 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db81860d-bcb7-4a56-a935-544dbc4be29b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.021861 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.022334 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.023184 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.023411 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.024980 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.025445 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.025738 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.026223 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). 
InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.026929 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.040112 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db81860d-bcb7-4a56-a935-544dbc4be29b-kube-api-access-jxpv5" (OuterVolumeSpecName: "kube-api-access-jxpv5") pod "db81860d-bcb7-4a56-a935-544dbc4be29b" (UID: "db81860d-bcb7-4a56-a935-544dbc4be29b"). InnerVolumeSpecName "kube-api-access-jxpv5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.120036 4778 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/db81860d-bcb7-4a56-a935-544dbc4be29b-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.120080 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.120302 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.120312 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxpv5\" (UniqueName: \"kubernetes.io/projected/db81860d-bcb7-4a56-a935-544dbc4be29b-kube-api-access-jxpv5\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.120321 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.120330 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.120342 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.120353 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.120364 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.120375 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.120385 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.120393 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/db81860d-bcb7-4a56-a935-544dbc4be29b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.193601 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" path="/var/lib/kubelet/pods/01828fdf-ef1b-44e3-905b-aec0c6aaa44f/volumes" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.398095 4778 
generic.go:334] "Generic (PLEG): container finished" podID="db81860d-bcb7-4a56-a935-544dbc4be29b" containerID="fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8" exitCode=0 Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.398171 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.398171 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" event={"ID":"db81860d-bcb7-4a56-a935-544dbc4be29b","Type":"ContainerDied","Data":"fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8"} Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.398313 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-x5dpv" event={"ID":"db81860d-bcb7-4a56-a935-544dbc4be29b","Type":"ContainerDied","Data":"233e58e62c8d40d87963329725284bd0d629e6646b095fe46a9712b711f0c101"} Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.398355 4778 scope.go:117] "RemoveContainer" containerID="fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.429921 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-x5dpv"] Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.435238 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-x5dpv"] Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.444814 4778 scope.go:117] "RemoveContainer" containerID="fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8" Mar 18 09:07:54 crc kubenswrapper[4778]: E0318 09:07:54.445176 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8\": container with ID starting with fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8 not found: ID does not exist" containerID="fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8" Mar 18 09:07:54 crc kubenswrapper[4778]: I0318 09:07:54.445236 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8"} err="failed to get container status \"fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8\": rpc error: code = NotFound desc = could not find container \"fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8\": container with ID starting with fa0c5744c3ea87fe64d11a272158d4455e346b62ab0a20a422af364f3cebdfd8 not found: ID does not exist" Mar 18 09:07:55 crc kubenswrapper[4778]: I0318 09:07:55.674076 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66b77f46fb-47b5r"] Mar 18 09:07:55 crc kubenswrapper[4778]: I0318 09:07:55.674516 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" podUID="532bf41b-51fb-4815-ab26-8fb2d12526d2" containerName="controller-manager" containerID="cri-o://8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064" gracePeriod=30 Mar 18 09:07:55 crc kubenswrapper[4778]: I0318 09:07:55.744790 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w"] Mar 18 09:07:55 crc kubenswrapper[4778]: I0318 09:07:55.745008 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" podUID="4aae8a16-f704-4764-bfd7-7a0cfed2eee3" containerName="route-controller-manager" 
containerID="cri-o://505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9" gracePeriod=30 Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.195649 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db81860d-bcb7-4a56-a935-544dbc4be29b" path="/var/lib/kubelet/pods/db81860d-bcb7-4a56-a935-544dbc4be29b/volumes" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.243873 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.248228 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.357055 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-proxy-ca-bundles\") pod \"532bf41b-51fb-4815-ab26-8fb2d12526d2\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.357181 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/532bf41b-51fb-4815-ab26-8fb2d12526d2-serving-cert\") pod \"532bf41b-51fb-4815-ab26-8fb2d12526d2\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.357249 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26k95\" (UniqueName: \"kubernetes.io/projected/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-kube-api-access-26k95\") pod \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.357300 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-xsn88\" (UniqueName: \"kubernetes.io/projected/532bf41b-51fb-4815-ab26-8fb2d12526d2-kube-api-access-xsn88\") pod \"532bf41b-51fb-4815-ab26-8fb2d12526d2\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.357341 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-config\") pod \"532bf41b-51fb-4815-ab26-8fb2d12526d2\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.357380 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-config\") pod \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.357429 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-serving-cert\") pod \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.357452 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-client-ca\") pod \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\" (UID: \"4aae8a16-f704-4764-bfd7-7a0cfed2eee3\") " Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.357472 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-client-ca\") pod \"532bf41b-51fb-4815-ab26-8fb2d12526d2\" (UID: \"532bf41b-51fb-4815-ab26-8fb2d12526d2\") " Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 
09:07:56.358598 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "532bf41b-51fb-4815-ab26-8fb2d12526d2" (UID: "532bf41b-51fb-4815-ab26-8fb2d12526d2"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.358674 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-client-ca" (OuterVolumeSpecName: "client-ca") pod "532bf41b-51fb-4815-ab26-8fb2d12526d2" (UID: "532bf41b-51fb-4815-ab26-8fb2d12526d2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.358798 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-config" (OuterVolumeSpecName: "config") pod "532bf41b-51fb-4815-ab26-8fb2d12526d2" (UID: "532bf41b-51fb-4815-ab26-8fb2d12526d2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.358926 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-client-ca" (OuterVolumeSpecName: "client-ca") pod "4aae8a16-f704-4764-bfd7-7a0cfed2eee3" (UID: "4aae8a16-f704-4764-bfd7-7a0cfed2eee3"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.359057 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-config" (OuterVolumeSpecName: "config") pod "4aae8a16-f704-4764-bfd7-7a0cfed2eee3" (UID: "4aae8a16-f704-4764-bfd7-7a0cfed2eee3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.362726 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4aae8a16-f704-4764-bfd7-7a0cfed2eee3" (UID: "4aae8a16-f704-4764-bfd7-7a0cfed2eee3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.363036 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/532bf41b-51fb-4815-ab26-8fb2d12526d2-kube-api-access-xsn88" (OuterVolumeSpecName: "kube-api-access-xsn88") pod "532bf41b-51fb-4815-ab26-8fb2d12526d2" (UID: "532bf41b-51fb-4815-ab26-8fb2d12526d2"). InnerVolumeSpecName "kube-api-access-xsn88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.363169 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/532bf41b-51fb-4815-ab26-8fb2d12526d2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "532bf41b-51fb-4815-ab26-8fb2d12526d2" (UID: "532bf41b-51fb-4815-ab26-8fb2d12526d2"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.364766 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-kube-api-access-26k95" (OuterVolumeSpecName: "kube-api-access-26k95") pod "4aae8a16-f704-4764-bfd7-7a0cfed2eee3" (UID: "4aae8a16-f704-4764-bfd7-7a0cfed2eee3"). InnerVolumeSpecName "kube-api-access-26k95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.424181 4778 generic.go:334] "Generic (PLEG): container finished" podID="4aae8a16-f704-4764-bfd7-7a0cfed2eee3" containerID="505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9" exitCode=0 Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.424264 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" event={"ID":"4aae8a16-f704-4764-bfd7-7a0cfed2eee3","Type":"ContainerDied","Data":"505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9"} Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.424290 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.424336 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w" event={"ID":"4aae8a16-f704-4764-bfd7-7a0cfed2eee3","Type":"ContainerDied","Data":"a7918f18849ba2e79d4ef72f4ad9306fd3cd739ca9880924f05fcd02b77c8d3d"} Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.424369 4778 scope.go:117] "RemoveContainer" containerID="505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.426420 4778 generic.go:334] "Generic (PLEG): container finished" podID="532bf41b-51fb-4815-ab26-8fb2d12526d2" containerID="8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064" exitCode=0 Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.426458 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" event={"ID":"532bf41b-51fb-4815-ab26-8fb2d12526d2","Type":"ContainerDied","Data":"8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064"} Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.426497 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" event={"ID":"532bf41b-51fb-4815-ab26-8fb2d12526d2","Type":"ContainerDied","Data":"2f81410394a4a7c6f801aac32f2e683ca0e93e74a4322b2f6a48fc440a4c2e61"} Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.426572 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66b77f46fb-47b5r" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.450603 4778 scope.go:117] "RemoveContainer" containerID="505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9" Mar 18 09:07:56 crc kubenswrapper[4778]: E0318 09:07:56.451356 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9\": container with ID starting with 505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9 not found: ID does not exist" containerID="505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.451473 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9"} err="failed to get container status \"505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9\": rpc error: code = NotFound desc = could not find container \"505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9\": container with ID starting with 505bf76bcadcc4092c083c7c6ee8d0f286528746689d8b6d08ee4c5b23a610e9 not found: ID does not exist" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.451579 4778 scope.go:117] "RemoveContainer" containerID="8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.458853 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.458899 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/532bf41b-51fb-4815-ab26-8fb2d12526d2-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.458918 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26k95\" (UniqueName: \"kubernetes.io/projected/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-kube-api-access-26k95\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.458939 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsn88\" (UniqueName: \"kubernetes.io/projected/532bf41b-51fb-4815-ab26-8fb2d12526d2-kube-api-access-xsn88\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.458958 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.458974 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.458989 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.459005 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4aae8a16-f704-4764-bfd7-7a0cfed2eee3-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.459022 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/532bf41b-51fb-4815-ab26-8fb2d12526d2-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:07:56 crc kubenswrapper[4778]: 
I0318 09:07:56.460352 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w"]
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.462949 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69f98775dd-rk74w"]
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.474892 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66b77f46fb-47b5r"]
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.478650 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-66b77f46fb-47b5r"]
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.485024 4778 scope.go:117] "RemoveContainer" containerID="8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064"
Mar 18 09:07:56 crc kubenswrapper[4778]: E0318 09:07:56.485652 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064\": container with ID starting with 8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064 not found: ID does not exist" containerID="8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.485783 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064"} err="failed to get container status \"8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064\": rpc error: code = NotFound desc = could not find container \"8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064\": container with ID starting with 8457ad72c75f00b0eaed06b38f1c93b1e9d33c3b1d788391f81dc9f60c06c064 not found: ID does not exist"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.835311 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z"]
Mar 18 09:07:56 crc kubenswrapper[4778]: E0318 09:07:56.836643 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532bf41b-51fb-4815-ab26-8fb2d12526d2" containerName="controller-manager"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.836714 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="532bf41b-51fb-4815-ab26-8fb2d12526d2" containerName="controller-manager"
Mar 18 09:07:56 crc kubenswrapper[4778]: E0318 09:07:56.836772 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aedbf59-d23d-409e-9742-09824ed6ef2a" containerName="extract-content"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.836825 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aedbf59-d23d-409e-9742-09824ed6ef2a" containerName="extract-content"
Mar 18 09:07:56 crc kubenswrapper[4778]: E0318 09:07:56.836891 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" containerName="extract-utilities"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.836949 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" containerName="extract-utilities"
Mar 18 09:07:56 crc kubenswrapper[4778]: E0318 09:07:56.837016 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" containerName="extract-content"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.837074 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" containerName="extract-content"
Mar 18 09:07:56 crc kubenswrapper[4778]: E0318 09:07:56.837126 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db81860d-bcb7-4a56-a935-544dbc4be29b" containerName="oauth-openshift"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.837176 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="db81860d-bcb7-4a56-a935-544dbc4be29b" containerName="oauth-openshift"
Mar 18 09:07:56 crc kubenswrapper[4778]: E0318 09:07:56.837254 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" containerName="registry-server"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.837331 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" containerName="registry-server"
Mar 18 09:07:56 crc kubenswrapper[4778]: E0318 09:07:56.837407 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aedbf59-d23d-409e-9742-09824ed6ef2a" containerName="extract-utilities"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.837461 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aedbf59-d23d-409e-9742-09824ed6ef2a" containerName="extract-utilities"
Mar 18 09:07:56 crc kubenswrapper[4778]: E0318 09:07:56.837514 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aae8a16-f704-4764-bfd7-7a0cfed2eee3" containerName="route-controller-manager"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.837564 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aae8a16-f704-4764-bfd7-7a0cfed2eee3" containerName="route-controller-manager"
Mar 18 09:07:56 crc kubenswrapper[4778]: E0318 09:07:56.837618 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aedbf59-d23d-409e-9742-09824ed6ef2a" containerName="registry-server"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.837668 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aedbf59-d23d-409e-9742-09824ed6ef2a" containerName="registry-server"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.837819 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aedbf59-d23d-409e-9742-09824ed6ef2a" containerName="registry-server"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.837880 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="db81860d-bcb7-4a56-a935-544dbc4be29b" containerName="oauth-openshift"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.837937 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="532bf41b-51fb-4815-ab26-8fb2d12526d2" containerName="controller-manager"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.837996 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="01828fdf-ef1b-44e3-905b-aec0c6aaa44f" containerName="registry-server"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.838053 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aae8a16-f704-4764-bfd7-7a0cfed2eee3" containerName="route-controller-manager"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.838400 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp"]
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.838535 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.838860 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"]
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.839132 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.839542 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.843215 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.843327 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.843385 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.843474 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.843484 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.843544 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.844068 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.844329 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.844438 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.844135 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.844139 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.844184 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.845989 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.846008 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.846221 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.846311 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.846413 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.846770 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.847561 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.847815 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.848730 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.849262 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.850284 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.852655 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.858792 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z"]
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.861987 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.868129 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"]
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.880184 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.891633 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.898266 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.903446 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp"]
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.965635 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-user-template-error\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.965705 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-session\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.965734 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb3a9066-971e-467b-bb54-8ba8b720781e-proxy-ca-bundles\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.965757 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-serving-cert\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.965813 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.965834 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-audit-dir\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.965912 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-service-ca\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.965973 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966013 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966049 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a393524-f81f-4ff5-b836-29188770f717-config\") pod \"route-controller-manager-659f498476-jmh9z\" (UID: \"6a393524-f81f-4ff5-b836-29188770f717\") " pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966165 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq69g\" (UniqueName: \"kubernetes.io/projected/bb3a9066-971e-467b-bb54-8ba8b720781e-kube-api-access-mq69g\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966186 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-cliconfig\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966277 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb3a9066-971e-467b-bb54-8ba8b720781e-client-ca\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966317 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvdq9\" (UniqueName: \"kubernetes.io/projected/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-kube-api-access-pvdq9\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966342 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a393524-f81f-4ff5-b836-29188770f717-serving-cert\") pod \"route-controller-manager-659f498476-jmh9z\" (UID: \"6a393524-f81f-4ff5-b836-29188770f717\") " pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966372 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-user-template-login\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966402 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltjkk\" (UniqueName: \"kubernetes.io/projected/6a393524-f81f-4ff5-b836-29188770f717-kube-api-access-ltjkk\") pod \"route-controller-manager-659f498476-jmh9z\" (UID: \"6a393524-f81f-4ff5-b836-29188770f717\") " pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966432 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a393524-f81f-4ff5-b836-29188770f717-client-ca\") pod \"route-controller-manager-659f498476-jmh9z\" (UID: \"6a393524-f81f-4ff5-b836-29188770f717\") " pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966509 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb3a9066-971e-467b-bb54-8ba8b720781e-serving-cert\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966588 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb3a9066-971e-467b-bb54-8ba8b720781e-config\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966690 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966752 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-router-certs\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:56 crc kubenswrapper[4778]: I0318 09:07:56.966784 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-audit-policies\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068456 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb3a9066-971e-467b-bb54-8ba8b720781e-client-ca\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068514 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvdq9\" (UniqueName: \"kubernetes.io/projected/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-kube-api-access-pvdq9\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068536 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a393524-f81f-4ff5-b836-29188770f717-serving-cert\") pod \"route-controller-manager-659f498476-jmh9z\" (UID: \"6a393524-f81f-4ff5-b836-29188770f717\") " pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068557 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-user-template-login\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068576 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltjkk\" (UniqueName: \"kubernetes.io/projected/6a393524-f81f-4ff5-b836-29188770f717-kube-api-access-ltjkk\") pod \"route-controller-manager-659f498476-jmh9z\" (UID: \"6a393524-f81f-4ff5-b836-29188770f717\") " pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068594 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a393524-f81f-4ff5-b836-29188770f717-client-ca\") pod \"route-controller-manager-659f498476-jmh9z\" (UID: \"6a393524-f81f-4ff5-b836-29188770f717\") " pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068613 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb3a9066-971e-467b-bb54-8ba8b720781e-serving-cert\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068630 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb3a9066-971e-467b-bb54-8ba8b720781e-config\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068651 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068672 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-router-certs\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068689 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-audit-policies\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068707 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-user-template-error\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068730 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-session\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068747 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb3a9066-971e-467b-bb54-8ba8b720781e-proxy-ca-bundles\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068767 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-serving-cert\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068786 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068807 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-audit-dir\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068823 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-service-ca\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068841 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068862 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068879 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a393524-f81f-4ff5-b836-29188770f717-config\") pod \"route-controller-manager-659f498476-jmh9z\" (UID: \"6a393524-f81f-4ff5-b836-29188770f717\") " pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068917 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq69g\" (UniqueName: \"kubernetes.io/projected/bb3a9066-971e-467b-bb54-8ba8b720781e-kube-api-access-mq69g\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.068935 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-cliconfig\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.069673 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-cliconfig\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.070276 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb3a9066-971e-467b-bb54-8ba8b720781e-client-ca\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.071560 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-service-ca\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.072317 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb3a9066-971e-467b-bb54-8ba8b720781e-config\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.072704 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.072733 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-audit-dir\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.073906 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb3a9066-971e-467b-bb54-8ba8b720781e-proxy-ca-bundles\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.074353 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-serving-cert\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.074628 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.074775 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a393524-f81f-4ff5-b836-29188770f717-serving-cert\") pod \"route-controller-manager-659f498476-jmh9z\" (UID: \"6a393524-f81f-4ff5-b836-29188770f717\") " pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.075047 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a393524-f81f-4ff5-b836-29188770f717-client-ca\") pod \"route-controller-manager-659f498476-jmh9z\" (UID: \"6a393524-f81f-4ff5-b836-29188770f717\") " pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.075125 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-session\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.075683 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-audit-policies\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.076490 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-user-template-login\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.076552 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a393524-f81f-4ff5-b836-29188770f717-config\") pod \"route-controller-manager-659f498476-jmh9z\" (UID: \"6a393524-f81f-4ff5-b836-29188770f717\") " pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.077220 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-system-router-certs\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.079303 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"
Mar
18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.080554 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-user-template-error\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.086549 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb3a9066-971e-467b-bb54-8ba8b720781e-serving-cert\") pod \"controller-manager-7585d5b4f9-5ltpp\" (UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.089817 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.094254 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvdq9\" (UniqueName: \"kubernetes.io/projected/06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec-kube-api-access-pvdq9\") pod \"oauth-openshift-67556b9b9b-7qmr6\" (UID: \"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec\") " pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.101574 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq69g\" (UniqueName: \"kubernetes.io/projected/bb3a9066-971e-467b-bb54-8ba8b720781e-kube-api-access-mq69g\") pod \"controller-manager-7585d5b4f9-5ltpp\" 
(UID: \"bb3a9066-971e-467b-bb54-8ba8b720781e\") " pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.101577 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltjkk\" (UniqueName: \"kubernetes.io/projected/6a393524-f81f-4ff5-b836-29188770f717-kube-api-access-ltjkk\") pod \"route-controller-manager-659f498476-jmh9z\" (UID: \"6a393524-f81f-4ff5-b836-29188770f717\") " pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.158878 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.186814 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.199779 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.424413 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z"] Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.684317 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-67556b9b9b-7qmr6"] Mar 18 09:07:57 crc kubenswrapper[4778]: I0318 09:07:57.733272 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp"] Mar 18 09:07:57 crc kubenswrapper[4778]: W0318 09:07:57.748064 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb3a9066_971e_467b_bb54_8ba8b720781e.slice/crio-1b681372cb607effc0785cfbf4328cc38b50f46b450f3104cbd0a9baf4f13f20 WatchSource:0}: Error finding container 1b681372cb607effc0785cfbf4328cc38b50f46b450f3104cbd0a9baf4f13f20: Status 404 returned error can't find the container with id 1b681372cb607effc0785cfbf4328cc38b50f46b450f3104cbd0a9baf4f13f20 Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.196086 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aae8a16-f704-4764-bfd7-7a0cfed2eee3" path="/var/lib/kubelet/pods/4aae8a16-f704-4764-bfd7-7a0cfed2eee3/volumes" Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.197418 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="532bf41b-51fb-4815-ab26-8fb2d12526d2" path="/var/lib/kubelet/pods/532bf41b-51fb-4815-ab26-8fb2d12526d2/volumes" Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.453323 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" 
event={"ID":"6a393524-f81f-4ff5-b836-29188770f717","Type":"ContainerStarted","Data":"4f421a15a28224b1ea94119a9d5502c90bb61897e2bd67989de460b3e188a852"} Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.453388 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" event={"ID":"6a393524-f81f-4ff5-b836-29188770f717","Type":"ContainerStarted","Data":"770bb3bdcd9b98c73620ddf0e1e1681db38f82724a883f7e2649d58a06005318"} Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.453696 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.455858 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" event={"ID":"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec","Type":"ContainerStarted","Data":"c81a3d973fd0941efd92be1e169e4358139cbc96cb0f800d5731165eeeddacb6"} Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.455906 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" event={"ID":"06f26f4e-d29e-4cb7-9c6f-990cc8cd14ec","Type":"ContainerStarted","Data":"c7d26979f02bd6139800747cd734de010dde8aa3b6ff6027b691b8bd7e548820"} Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.456082 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.457800 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" event={"ID":"bb3a9066-971e-467b-bb54-8ba8b720781e","Type":"ContainerStarted","Data":"8d81186147e1ac52f0e6fd8ba5b0f75252b9187d8d5cdf7266a08aeaa2bd6933"} Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.457865 4778 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" event={"ID":"bb3a9066-971e-467b-bb54-8ba8b720781e","Type":"ContainerStarted","Data":"1b681372cb607effc0785cfbf4328cc38b50f46b450f3104cbd0a9baf4f13f20"} Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.458078 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.461817 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.465536 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.482284 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-659f498476-jmh9z" podStartSLOduration=3.482263398 podStartE2EDuration="3.482263398s" podCreationTimestamp="2026-03-18 09:07:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:07:58.480016987 +0000 UTC m=+345.054761837" watchObservedRunningTime="2026-03-18 09:07:58.482263398 +0000 UTC m=+345.057008238" Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.549573 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" podStartSLOduration=30.549558374 podStartE2EDuration="30.549558374s" podCreationTimestamp="2026-03-18 09:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:07:58.524664719 +0000 UTC 
m=+345.099409579" watchObservedRunningTime="2026-03-18 09:07:58.549558374 +0000 UTC m=+345.124303214" Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.550441 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7585d5b4f9-5ltpp" podStartSLOduration=3.550436698 podStartE2EDuration="3.550436698s" podCreationTimestamp="2026-03-18 09:07:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:07:58.547375015 +0000 UTC m=+345.122119845" watchObservedRunningTime="2026-03-18 09:07:58.550436698 +0000 UTC m=+345.125181538" Mar 18 09:07:58 crc kubenswrapper[4778]: I0318 09:07:58.845832 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-67556b9b9b-7qmr6" Mar 18 09:08:00 crc kubenswrapper[4778]: I0318 09:08:00.144327 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563748-8q2hs"] Mar 18 09:08:00 crc kubenswrapper[4778]: I0318 09:08:00.148156 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" Mar 18 09:08:00 crc kubenswrapper[4778]: I0318 09:08:00.150887 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563748-8q2hs"] Mar 18 09:08:00 crc kubenswrapper[4778]: I0318 09:08:00.156589 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:08:00 crc kubenswrapper[4778]: I0318 09:08:00.156885 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:08:00 crc kubenswrapper[4778]: I0318 09:08:00.157151 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:08:00 crc kubenswrapper[4778]: I0318 09:08:00.234518 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx4kt\" (UniqueName: \"kubernetes.io/projected/5d4f1c72-13f2-47ff-94aa-9b4e91f2e126-kube-api-access-sx4kt\") pod \"auto-csr-approver-29563748-8q2hs\" (UID: \"5d4f1c72-13f2-47ff-94aa-9b4e91f2e126\") " pod="openshift-infra/auto-csr-approver-29563748-8q2hs" Mar 18 09:08:00 crc kubenswrapper[4778]: I0318 09:08:00.337149 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx4kt\" (UniqueName: \"kubernetes.io/projected/5d4f1c72-13f2-47ff-94aa-9b4e91f2e126-kube-api-access-sx4kt\") pod \"auto-csr-approver-29563748-8q2hs\" (UID: \"5d4f1c72-13f2-47ff-94aa-9b4e91f2e126\") " pod="openshift-infra/auto-csr-approver-29563748-8q2hs" Mar 18 09:08:00 crc kubenswrapper[4778]: I0318 09:08:00.369403 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx4kt\" (UniqueName: \"kubernetes.io/projected/5d4f1c72-13f2-47ff-94aa-9b4e91f2e126-kube-api-access-sx4kt\") pod \"auto-csr-approver-29563748-8q2hs\" (UID: \"5d4f1c72-13f2-47ff-94aa-9b4e91f2e126\") " 
pod="openshift-infra/auto-csr-approver-29563748-8q2hs" Mar 18 09:08:00 crc kubenswrapper[4778]: I0318 09:08:00.466271 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" Mar 18 09:08:00 crc kubenswrapper[4778]: I0318 09:08:00.933528 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563748-8q2hs"] Mar 18 09:08:00 crc kubenswrapper[4778]: W0318 09:08:00.939411 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d4f1c72_13f2_47ff_94aa_9b4e91f2e126.slice/crio-85aa2ca3e5023179d42775d87caa49a9a68b58d0ff03b6202e5dd6a676230129 WatchSource:0}: Error finding container 85aa2ca3e5023179d42775d87caa49a9a68b58d0ff03b6202e5dd6a676230129: Status 404 returned error can't find the container with id 85aa2ca3e5023179d42775d87caa49a9a68b58d0ff03b6202e5dd6a676230129 Mar 18 09:08:01 crc kubenswrapper[4778]: I0318 09:08:01.484353 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" event={"ID":"5d4f1c72-13f2-47ff-94aa-9b4e91f2e126","Type":"ContainerStarted","Data":"85aa2ca3e5023179d42775d87caa49a9a68b58d0ff03b6202e5dd6a676230129"} Mar 18 09:08:02 crc kubenswrapper[4778]: I0318 09:08:02.502524 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" event={"ID":"5d4f1c72-13f2-47ff-94aa-9b4e91f2e126","Type":"ContainerStarted","Data":"8bac8cffd4e1a60ba2bf0f1dd076bc9102bdfd1bb1f8e85a389ddde5e582bd3e"} Mar 18 09:08:02 crc kubenswrapper[4778]: I0318 09:08:02.523137 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" podStartSLOduration=1.472313908 podStartE2EDuration="2.5231159s" podCreationTimestamp="2026-03-18 09:08:00 +0000 UTC" firstStartedPulling="2026-03-18 09:08:00.942108182 +0000 UTC 
m=+347.516853022" lastFinishedPulling="2026-03-18 09:08:01.992910144 +0000 UTC m=+348.567655014" observedRunningTime="2026-03-18 09:08:02.521178158 +0000 UTC m=+349.095922998" watchObservedRunningTime="2026-03-18 09:08:02.5231159 +0000 UTC m=+349.097860760" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.523152 4778 generic.go:334] "Generic (PLEG): container finished" podID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" containerID="8bac8cffd4e1a60ba2bf0f1dd076bc9102bdfd1bb1f8e85a389ddde5e582bd3e" exitCode=0 Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.523220 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" event={"ID":"5d4f1c72-13f2-47ff-94aa-9b4e91f2e126","Type":"ContainerDied","Data":"8bac8cffd4e1a60ba2bf0f1dd076bc9102bdfd1bb1f8e85a389ddde5e582bd3e"} Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.590879 4778 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.592312 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.631115 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.689851 4778 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.690151 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf" gracePeriod=15 Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.690321 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8" gracePeriod=15 Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.690369 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5" gracePeriod=15 Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.690442 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b" gracePeriod=15 Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 
09:08:03.690427 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e" gracePeriod=15 Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.693626 4778 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 09:08:03 crc kubenswrapper[4778]: E0318 09:08:03.693995 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694425 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 09:08:03 crc kubenswrapper[4778]: E0318 09:08:03.694448 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694464 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 09:08:03 crc kubenswrapper[4778]: E0318 09:08:03.694478 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694491 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 09:08:03 crc kubenswrapper[4778]: E0318 09:08:03.694511 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694525 4778 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: E0318 09:08:03.694582 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694599 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: E0318 09:08:03.694618 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694632 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: E0318 09:08:03.694649 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694662 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: E0318 09:08:03.694682 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694695 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 09:08:03 crc kubenswrapper[4778]: E0318 09:08:03.694715 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-insecure-readyz" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694728 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694905 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694923 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694942 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694960 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.694982 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.695002 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.695016 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.695110 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 09:08:03 crc kubenswrapper[4778]: 
E0318 09:08:03.695411 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.695433 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.695606 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.700588 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.700642 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.700690 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.700727 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.700750 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.802678 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.802795 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.802845 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.802927 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.802927 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.802997 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.803020 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.803078 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.803372 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.803442 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.803584 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.803834 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.803987 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.904619 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.904692 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.904742 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.904798 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.904826 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.904832 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: I0318 09:08:03.929329 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:03 crc kubenswrapper[4778]: W0318 09:08:03.952623 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-fab2138804c88ae433897923ae9e791bf072578ea0fdaf6820e27a9aeaed9215 WatchSource:0}: Error finding container fab2138804c88ae433897923ae9e791bf072578ea0fdaf6820e27a9aeaed9215: Status 404 returned error can't find the container with id fab2138804c88ae433897923ae9e791bf072578ea0fdaf6820e27a9aeaed9215 Mar 18 09:08:03 crc kubenswrapper[4778]: E0318 09:08:03.956722 4778 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.70:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189de45575f14002 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:08:03.955941378 +0000 UTC m=+350.530686228,LastTimestamp:2026-03-18 09:08:03.955941378 +0000 UTC m=+350.530686228,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" 
Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.193492 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.193991 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.194350 4778 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.548885 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.551084 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.552773 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8" exitCode=0 Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.552804 
4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b" exitCode=0 Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.552817 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e" exitCode=0 Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.552830 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5" exitCode=2 Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.552883 4778 scope.go:117] "RemoveContainer" containerID="721733eeeb61234b38380978380dc5dba12e92039e6d762d1593e72e950e6f0e" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.560188 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c87368359c8add64535225d3bbfde067e9b4f8b557eb6da52467546b92f78032"} Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.560251 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"fab2138804c88ae433897923ae9e791bf072578ea0fdaf6820e27a9aeaed9215"} Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.561263 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 
09:08:04.562022 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.564818 4778 generic.go:334] "Generic (PLEG): container finished" podID="93723e0d-2243-4390-a667-8a080325205f" containerID="e2b50472335d82d0cbf47f1b30666ae7b50519462f89aadf15ef33cf94804236" exitCode=0 Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.565039 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"93723e0d-2243-4390-a667-8a080325205f","Type":"ContainerDied","Data":"e2b50472335d82d0cbf47f1b30666ae7b50519462f89aadf15ef33cf94804236"} Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.565652 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.566405 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.566960 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.923880 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.924576 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.925107 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:04 crc kubenswrapper[4778]: I0318 09:08:04.925372 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:05 crc kubenswrapper[4778]: I0318 09:08:05.020078 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx4kt\" (UniqueName: \"kubernetes.io/projected/5d4f1c72-13f2-47ff-94aa-9b4e91f2e126-kube-api-access-sx4kt\") pod \"5d4f1c72-13f2-47ff-94aa-9b4e91f2e126\" (UID: \"5d4f1c72-13f2-47ff-94aa-9b4e91f2e126\") " Mar 18 09:08:05 crc kubenswrapper[4778]: I0318 09:08:05.029406 
4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d4f1c72-13f2-47ff-94aa-9b4e91f2e126-kube-api-access-sx4kt" (OuterVolumeSpecName: "kube-api-access-sx4kt") pod "5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" (UID: "5d4f1c72-13f2-47ff-94aa-9b4e91f2e126"). InnerVolumeSpecName "kube-api-access-sx4kt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:08:05 crc kubenswrapper[4778]: I0318 09:08:05.123317 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx4kt\" (UniqueName: \"kubernetes.io/projected/5d4f1c72-13f2-47ff-94aa-9b4e91f2e126-kube-api-access-sx4kt\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:05 crc kubenswrapper[4778]: I0318 09:08:05.581759 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 09:08:05 crc kubenswrapper[4778]: I0318 09:08:05.585518 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" event={"ID":"5d4f1c72-13f2-47ff-94aa-9b4e91f2e126","Type":"ContainerDied","Data":"85aa2ca3e5023179d42775d87caa49a9a68b58d0ff03b6202e5dd6a676230129"} Mar 18 09:08:05 crc kubenswrapper[4778]: I0318 09:08:05.585568 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85aa2ca3e5023179d42775d87caa49a9a68b58d0ff03b6202e5dd6a676230129" Mar 18 09:08:05 crc kubenswrapper[4778]: I0318 09:08:05.585565 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" Mar 18 09:08:05 crc kubenswrapper[4778]: I0318 09:08:05.605892 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:05 crc kubenswrapper[4778]: I0318 09:08:05.606568 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:05 crc kubenswrapper[4778]: I0318 09:08:05.607125 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.172472 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.173357 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.173575 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.173796 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.178134 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.178904 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.179225 4778 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.179435 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.179722 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.179954 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.253336 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93723e0d-2243-4390-a667-8a080325205f-kube-api-access\") pod \"93723e0d-2243-4390-a667-8a080325205f\" (UID: 
\"93723e0d-2243-4390-a667-8a080325205f\") " Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.253479 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93723e0d-2243-4390-a667-8a080325205f-kubelet-dir\") pod \"93723e0d-2243-4390-a667-8a080325205f\" (UID: \"93723e0d-2243-4390-a667-8a080325205f\") " Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.253556 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/93723e0d-2243-4390-a667-8a080325205f-var-lock\") pod \"93723e0d-2243-4390-a667-8a080325205f\" (UID: \"93723e0d-2243-4390-a667-8a080325205f\") " Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.253731 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93723e0d-2243-4390-a667-8a080325205f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "93723e0d-2243-4390-a667-8a080325205f" (UID: "93723e0d-2243-4390-a667-8a080325205f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.253815 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93723e0d-2243-4390-a667-8a080325205f-var-lock" (OuterVolumeSpecName: "var-lock") pod "93723e0d-2243-4390-a667-8a080325205f" (UID: "93723e0d-2243-4390-a667-8a080325205f"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.261390 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93723e0d-2243-4390-a667-8a080325205f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "93723e0d-2243-4390-a667-8a080325205f" (UID: "93723e0d-2243-4390-a667-8a080325205f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.355598 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.355739 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.355783 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.355853 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.355854 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.355870 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.356578 4778 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.356608 4778 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93723e0d-2243-4390-a667-8a080325205f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.356628 4778 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/93723e0d-2243-4390-a667-8a080325205f-var-lock\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.356645 4778 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.356663 4778 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.356680 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93723e0d-2243-4390-a667-8a080325205f-kube-api-access\") on node 
\"crc\" DevicePath \"\"" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.593917 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"93723e0d-2243-4390-a667-8a080325205f","Type":"ContainerDied","Data":"e6f7bb7ae014c0c468ab5d5f0830ec452ff1c1962710492d6af9f507d40acc53"} Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.593999 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6f7bb7ae014c0c468ab5d5f0830ec452ff1c1962710492d6af9f507d40acc53" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.593964 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.599178 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.599914 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf" exitCode=0 Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.599984 4778 scope.go:117] "RemoveContainer" containerID="0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.600177 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.600984 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.601295 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.601793 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.602427 4778 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.621522 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.621785 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.622266 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.623243 4778 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.624392 4778 scope.go:117] "RemoveContainer" containerID="72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.631032 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.631908 4778 status_manager.go:851] 
"Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.632739 4778 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.633183 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.642125 4778 scope.go:117] "RemoveContainer" containerID="e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.661729 4778 scope.go:117] "RemoveContainer" containerID="82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.677659 4778 scope.go:117] "RemoveContainer" containerID="3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.699887 4778 scope.go:117] "RemoveContainer" containerID="7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.731306 4778 scope.go:117] "RemoveContainer" 
containerID="0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8" Mar 18 09:08:06 crc kubenswrapper[4778]: E0318 09:08:06.732020 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\": container with ID starting with 0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8 not found: ID does not exist" containerID="0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.732071 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8"} err="failed to get container status \"0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\": rpc error: code = NotFound desc = could not find container \"0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8\": container with ID starting with 0e8cdaeb1cebd7cf820de79712cf1f9788f29c35abe81298d48acacfb7f2fcc8 not found: ID does not exist" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.732108 4778 scope.go:117] "RemoveContainer" containerID="72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b" Mar 18 09:08:06 crc kubenswrapper[4778]: E0318 09:08:06.732842 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\": container with ID starting with 72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b not found: ID does not exist" containerID="72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.732913 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b"} err="failed to get container status \"72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\": rpc error: code = NotFound desc = could not find container \"72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b\": container with ID starting with 72d6b9d4f30291cf1a93ed613e2032a030d9145f2227089e452a254cafdf901b not found: ID does not exist" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.732954 4778 scope.go:117] "RemoveContainer" containerID="e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e" Mar 18 09:08:06 crc kubenswrapper[4778]: E0318 09:08:06.734606 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\": container with ID starting with e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e not found: ID does not exist" containerID="e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.734677 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e"} err="failed to get container status \"e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\": rpc error: code = NotFound desc = could not find container \"e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e\": container with ID starting with e00cb515e3942e87c42b990beb37ee5e5954e269ac6770e464f4ad15bc765b3e not found: ID does not exist" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.734724 4778 scope.go:117] "RemoveContainer" containerID="82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5" Mar 18 09:08:06 crc kubenswrapper[4778]: E0318 09:08:06.735291 4778 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\": container with ID starting with 82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5 not found: ID does not exist" containerID="82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.735342 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5"} err="failed to get container status \"82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\": rpc error: code = NotFound desc = could not find container \"82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5\": container with ID starting with 82e2dbeac6943da95bd955d4ac9501528528d2db99cd6ceeb1a111d9d27aa7e5 not found: ID does not exist" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.735373 4778 scope.go:117] "RemoveContainer" containerID="3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf" Mar 18 09:08:06 crc kubenswrapper[4778]: E0318 09:08:06.736140 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\": container with ID starting with 3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf not found: ID does not exist" containerID="3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.736241 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf"} err="failed to get container status \"3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\": rpc error: code = NotFound desc = could not find container 
\"3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf\": container with ID starting with 3a53b37498ca3d03543f0912eb5d2ca1f5354400509d6a6b254257cafadd1acf not found: ID does not exist" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.736296 4778 scope.go:117] "RemoveContainer" containerID="7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88" Mar 18 09:08:06 crc kubenswrapper[4778]: E0318 09:08:06.736958 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\": container with ID starting with 7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88 not found: ID does not exist" containerID="7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88" Mar 18 09:08:06 crc kubenswrapper[4778]: I0318 09:08:06.737011 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88"} err="failed to get container status \"7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\": rpc error: code = NotFound desc = could not find container \"7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88\": container with ID starting with 7f16dd91f7d9689ca1f8f8d05a2e5fe081d12841dc073d280cb7e118d83d7d88 not found: ID does not exist" Mar 18 09:08:08 crc kubenswrapper[4778]: E0318 09:08:08.036608 4778 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:08 crc kubenswrapper[4778]: E0318 09:08:08.037832 4778 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: 
connect: connection refused" Mar 18 09:08:08 crc kubenswrapper[4778]: E0318 09:08:08.038442 4778 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:08 crc kubenswrapper[4778]: E0318 09:08:08.038849 4778 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:08 crc kubenswrapper[4778]: E0318 09:08:08.039362 4778 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:08 crc kubenswrapper[4778]: I0318 09:08:08.039417 4778 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 18 09:08:08 crc kubenswrapper[4778]: E0318 09:08:08.039807 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" interval="200ms" Mar 18 09:08:08 crc kubenswrapper[4778]: I0318 09:08:08.199360 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 18 09:08:08 crc kubenswrapper[4778]: E0318 09:08:08.241291 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" 
interval="400ms" Mar 18 09:08:08 crc kubenswrapper[4778]: E0318 09:08:08.643043 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" interval="800ms" Mar 18 09:08:09 crc kubenswrapper[4778]: E0318 09:08:09.445371 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" interval="1.6s" Mar 18 09:08:10 crc kubenswrapper[4778]: E0318 09:08:10.022679 4778 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.70:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189de45575f14002 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 09:08:03.955941378 +0000 UTC m=+350.530686228,LastTimestamp:2026-03-18 09:08:03.955941378 +0000 UTC m=+350.530686228,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 09:08:10 crc kubenswrapper[4778]: I0318 09:08:10.354493 4778 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 09:08:10 crc kubenswrapper[4778]: I0318 09:08:10.355292 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:10 crc kubenswrapper[4778]: I0318 09:08:10.355916 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:10 crc kubenswrapper[4778]: I0318 09:08:10.356499 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:10 crc kubenswrapper[4778]: I0318 09:08:10.357176 4778 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:11 crc kubenswrapper[4778]: E0318 09:08:11.047353 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: 
connect: connection refused" interval="3.2s" Mar 18 09:08:12 crc kubenswrapper[4778]: E0318 09:08:12.263562 4778 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.70:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" volumeName="registry-storage" Mar 18 09:08:14 crc kubenswrapper[4778]: I0318 09:08:14.190103 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:14 crc kubenswrapper[4778]: I0318 09:08:14.190801 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:14 crc kubenswrapper[4778]: I0318 09:08:14.191342 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:14 crc kubenswrapper[4778]: I0318 09:08:14.191821 4778 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:14 crc kubenswrapper[4778]: E0318 09:08:14.249239 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" interval="6.4s" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.186894 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.188391 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.189329 4778 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.189750 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 
18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.190161 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.204685 4778 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="38043922-fba8-4439-b469-508c00992f80" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.204714 4778 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="38043922-fba8-4439-b469-508c00992f80" Mar 18 09:08:18 crc kubenswrapper[4778]: E0318 09:08:18.205087 4778 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.205585 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.702786 4778 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="15022c61f8f7848b9d5cb82bc7482c0948ac803924a28c75d30573c347fa2a6e" exitCode=0 Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.702910 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"15022c61f8f7848b9d5cb82bc7482c0948ac803924a28c75d30573c347fa2a6e"} Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.703164 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"724e531da6d62f3549d123165683968c32317a115c58f5945d2904ceae2a7e48"} Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.703444 4778 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="38043922-fba8-4439-b469-508c00992f80" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.703457 4778 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="38043922-fba8-4439-b469-508c00992f80" Mar 18 09:08:18 crc kubenswrapper[4778]: E0318 09:08:18.703953 4778 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.704568 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.705457 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.706039 4778 status_manager.go:851] "Failed to get status for pod" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.706304 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.706742 4778 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.708048 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.708121 4778 generic.go:334] "Generic (PLEG): container 
finished" podID="f614b9022728cf315e60c057852e563e" containerID="c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54" exitCode=1 Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.708162 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54"} Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.708615 4778 scope.go:117] "RemoveContainer" containerID="c954ef6dc37083ef650932e566792aa228b1b2923c21061c319ca14cf3befd54" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.709286 4778 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.710269 4778 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.710891 4778 status_manager.go:851] "Failed to get status for pod" podUID="93723e0d-2243-4390-a667-8a080325205f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.711493 4778 status_manager.go:851] "Failed to get status for pod" 
podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" pod="openshift-infra/auto-csr-approver-29563748-8q2hs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29563748-8q2hs\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.711860 4778 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: E0318 09:08:18.817241 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:08:18Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:08:18Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:08:18Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T09:08:18Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: E0318 
09:08:18.817627 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: E0318 09:08:18.817821 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: E0318 09:08:18.818016 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: E0318 09:08:18.818316 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.70:6443: connect: connection refused" Mar 18 09:08:18 crc kubenswrapper[4778]: E0318 09:08:18.818348 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 09:08:18 crc kubenswrapper[4778]: I0318 09:08:18.956312 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 09:08:19 crc kubenswrapper[4778]: I0318 09:08:19.717845 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 18 09:08:19 crc kubenswrapper[4778]: I0318 09:08:19.719631 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 18 09:08:19 crc kubenswrapper[4778]: I0318 09:08:19.719820 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ae01dca2780474bc416cdfc4c7ca33c6504d1a3a39842df33f21a753ed00995f"}
Mar 18 09:08:19 crc kubenswrapper[4778]: I0318 09:08:19.725829 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0d59ab003f59ea57c623f253dd45a3b041f10ec06eb83b3aca1d3fae860942b9"}
Mar 18 09:08:19 crc kubenswrapper[4778]: I0318 09:08:19.726058 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"44161b3eb080fd781ada3e0dece3563642ea32c5314cf590037a6c343e05fe59"}
Mar 18 09:08:19 crc kubenswrapper[4778]: I0318 09:08:19.726141 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"aecc03e0c76ad5620aa8cc8df78dd74eca541bf7a499fb18e311a3efc8ed4311"}
Mar 18 09:08:19 crc kubenswrapper[4778]: I0318 09:08:19.726226 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2f198513c437443df041b1f482fe066eaa858c0b9ef973208727b6d4f47d9f0d"}
Mar 18 09:08:19 crc kubenswrapper[4778]: I0318 09:08:19.801667 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 09:08:19 crc kubenswrapper[4778]: I0318 09:08:19.802066 4778 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 18 09:08:19 crc kubenswrapper[4778]: I0318 09:08:19.802224 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 18 09:08:20 crc kubenswrapper[4778]: I0318 09:08:20.733658 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e4b8a3499cf6a41fd3046003b4b680c90778652113a61109425c607e0555b37c"}
Mar 18 09:08:20 crc kubenswrapper[4778]: I0318 09:08:20.734111 4778 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="38043922-fba8-4439-b469-508c00992f80"
Mar 18 09:08:20 crc kubenswrapper[4778]: I0318 09:08:20.734137 4778 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="38043922-fba8-4439-b469-508c00992f80"
Mar 18 09:08:22 crc kubenswrapper[4778]: I0318 09:08:22.724060 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s"
Mar 18 09:08:22 crc kubenswrapper[4778]: I0318 09:08:22.727673 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 18 09:08:22 crc kubenswrapper[4778]: I0318 09:08:22.743982 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2d5c312-2314-46d7-8ba2-64b621b0c2c7-metrics-certs\") pod \"network-metrics-daemon-9bc7s\" (UID: \"a2d5c312-2314-46d7-8ba2-64b621b0c2c7\") " pod="openshift-multus/network-metrics-daemon-9bc7s"
Mar 18 09:08:23 crc kubenswrapper[4778]: I0318 09:08:23.010926 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 18 09:08:23 crc kubenswrapper[4778]: I0318 09:08:23.019243 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bc7s"
Mar 18 09:08:23 crc kubenswrapper[4778]: I0318 09:08:23.206378 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 09:08:23 crc kubenswrapper[4778]: I0318 09:08:23.207076 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 09:08:23 crc kubenswrapper[4778]: I0318 09:08:23.212957 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 09:08:23 crc kubenswrapper[4778]: W0318 09:08:23.586528 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2d5c312_2314_46d7_8ba2_64b621b0c2c7.slice/crio-503e2ac35b2cb29f79d200576b5f4c19c9e97c546766f821bb1db339e03c2980 WatchSource:0}: Error finding container 503e2ac35b2cb29f79d200576b5f4c19c9e97c546766f821bb1db339e03c2980: Status 404 returned error can't find the container with id 503e2ac35b2cb29f79d200576b5f4c19c9e97c546766f821bb1db339e03c2980
Mar 18 09:08:23 crc kubenswrapper[4778]: I0318 09:08:23.758849 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" event={"ID":"a2d5c312-2314-46d7-8ba2-64b621b0c2c7","Type":"ContainerStarted","Data":"503e2ac35b2cb29f79d200576b5f4c19c9e97c546766f821bb1db339e03c2980"}
Mar 18 09:08:24 crc kubenswrapper[4778]: I0318 09:08:24.767962 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" event={"ID":"a2d5c312-2314-46d7-8ba2-64b621b0c2c7","Type":"ContainerStarted","Data":"58c1742e2f94b007534f301b9155bdf564198ba49891bcec6ce75b6770dc5c77"}
Mar 18 09:08:24 crc kubenswrapper[4778]: I0318 09:08:24.768653 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9bc7s" event={"ID":"a2d5c312-2314-46d7-8ba2-64b621b0c2c7","Type":"ContainerStarted","Data":"f533ae4bb7169f5404c27fbfc262f2519bafba67043e8afea14c0081c5c0e514"}
Mar 18 09:08:25 crc kubenswrapper[4778]: I0318 09:08:25.762101 4778 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 09:08:25 crc kubenswrapper[4778]: I0318 09:08:25.935730 4778 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="047c238e-63c8-4a7c-9893-575a3a291f40"
Mar 18 09:08:26 crc kubenswrapper[4778]: I0318 09:08:26.778950 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 09:08:26 crc kubenswrapper[4778]: I0318 09:08:26.779083 4778 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="38043922-fba8-4439-b469-508c00992f80"
Mar 18 09:08:26 crc kubenswrapper[4778]: I0318 09:08:26.779115 4778 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="38043922-fba8-4439-b469-508c00992f80"
Mar 18 09:08:26 crc kubenswrapper[4778]: I0318 09:08:26.782550 4778 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="047c238e-63c8-4a7c-9893-575a3a291f40"
Mar 18 09:08:26 crc kubenswrapper[4778]: I0318 09:08:26.783318 4778 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://2f198513c437443df041b1f482fe066eaa858c0b9ef973208727b6d4f47d9f0d"
Mar 18 09:08:26 crc kubenswrapper[4778]: I0318 09:08:26.783345 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 09:08:27 crc kubenswrapper[4778]: I0318 09:08:27.786089 4778 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="38043922-fba8-4439-b469-508c00992f80"
Mar 18 09:08:27 crc kubenswrapper[4778]: I0318 09:08:27.786608 4778 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="38043922-fba8-4439-b469-508c00992f80"
Mar 18 09:08:27 crc kubenswrapper[4778]: I0318 09:08:27.792845 4778 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="047c238e-63c8-4a7c-9893-575a3a291f40"
Mar 18 09:08:28 crc kubenswrapper[4778]: I0318 09:08:28.133860 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 09:08:28 crc kubenswrapper[4778]: I0318 09:08:28.793264 4778 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="38043922-fba8-4439-b469-508c00992f80"
Mar 18 09:08:28 crc kubenswrapper[4778]: I0318 09:08:28.793303 4778 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="38043922-fba8-4439-b469-508c00992f80"
Mar 18 09:08:28 crc kubenswrapper[4778]: I0318 09:08:28.799622 4778 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="047c238e-63c8-4a7c-9893-575a3a291f40"
Mar 18 09:08:29 crc kubenswrapper[4778]: I0318 09:08:29.807922 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 09:08:29 crc kubenswrapper[4778]: I0318 09:08:29.815027 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 09:08:36 crc kubenswrapper[4778]: I0318 09:08:36.147482 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 18 09:08:36 crc kubenswrapper[4778]: I0318 09:08:36.279095 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 18 09:08:36 crc kubenswrapper[4778]: I0318 09:08:36.390090 4778 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 18 09:08:36 crc kubenswrapper[4778]: I0318 09:08:36.549248 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 18 09:08:36 crc kubenswrapper[4778]: I0318 09:08:36.747945 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 18 09:08:36 crc kubenswrapper[4778]: I0318 09:08:36.759352 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 18 09:08:37 crc kubenswrapper[4778]: I0318 09:08:37.095961 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 18 09:08:37 crc kubenswrapper[4778]: I0318 09:08:37.107531 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 18 09:08:37 crc kubenswrapper[4778]: I0318 09:08:37.405834 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 18 09:08:37 crc kubenswrapper[4778]: I0318 09:08:37.723029 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 18 09:08:37 crc kubenswrapper[4778]: I0318 09:08:37.793748 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 18 09:08:37 crc kubenswrapper[4778]: I0318 09:08:37.812374 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.077914 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.086092 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.132185 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.144462 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.160541 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.190149 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.309470 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.381792 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.591005 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.610173 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.743346 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.847270 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.890535 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.897159 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.939859 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 18 09:08:38 crc kubenswrapper[4778]: I0318 09:08:38.964678 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.077254 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.222973 4778 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.226392 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=36.226365902 podStartE2EDuration="36.226365902s" podCreationTimestamp="2026-03-18 09:08:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:08:25.811572615 +0000 UTC m=+372.386317465" watchObservedRunningTime="2026-03-18 09:08:39.226365902 +0000 UTC m=+385.801110782"
Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.229638 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-9bc7s" podStartSLOduration=315.22962009 podStartE2EDuration="5m15.22962009s" podCreationTimestamp="2026-03-18 09:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:08:25.907273192 +0000 UTC m=+372.482018052" watchObservedRunningTime="2026-03-18 09:08:39.22962009 +0000 UTC m=+385.804364970"
Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.231892 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.231972 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.232022 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9bc7s"]
Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.235963 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.252001 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.255300 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.276336 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.276301607 podStartE2EDuration="14.276301607s" podCreationTimestamp="2026-03-18 09:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:08:39.266931343 +0000 UTC m=+385.841676193" watchObservedRunningTime="2026-03-18 09:08:39.276301607 +0000 UTC m=+385.851046497"
Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.409424 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.524144 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.636653 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.643589 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.657755 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.745515 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.767032 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.836962 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.881880 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.920957 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 18 09:08:39 crc kubenswrapper[4778]: I0318 09:08:39.965772 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.183154 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.319125 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.326981 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.387689 4778 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.428089 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.492432 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.545630 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.559059 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.586945 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.597723 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.598618 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.622871 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.712092 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.733471 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.788766 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.811155 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 18 09:08:40 crc kubenswrapper[4778]: I0318 09:08:40.906773 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.011127 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.047722 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.112265 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.125038 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.127790 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.213991 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.440661 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.665692 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.747412 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.750644 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.782050 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.782347 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.796191 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.851758 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.867684 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.935822 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 18 09:08:41 crc kubenswrapper[4778]: I0318 09:08:41.984923 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.015778 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.027409 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.118179 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.174812 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.193377 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.213872 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.261646 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.359289 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.404245 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.413658 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.449550 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.474868 4778 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.498272 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.662653 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.677086 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.713825 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.714188 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.776336 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.917139 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 18 09:08:42 crc kubenswrapper[4778]: I0318 09:08:42.929972 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.083334 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.092776 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.177412 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.207093 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.241804 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.335733 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.338864 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.396615 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.426449 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.465098 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.491952 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.528540 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.583522 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.604260 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.660102 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.738930 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.739918 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.758946 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.805690 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 18 09:08:43 crc kubenswrapper[4778]: I0318 09:08:43.891003 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.029661 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.051145 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.053045 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.059860 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.062659 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.101968 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.122803 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.316653 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.341896 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.368468 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.385101 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.597112 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.603891 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.609064 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.642755 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.648167 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.665790 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.732937 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.750065 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.750315 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.949107 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 18 09:08:44 crc kubenswrapper[4778]: I0318 09:08:44.980969 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.017046 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.031341 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.031548 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.107757 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.110712 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.157711 4778 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.177018 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.274178 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.284186 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.368273 4778 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.410828 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.476777 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.571413 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.580081 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.602441 4778 reflector.go:368] Caches populated for *v1.Secret
from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.621539 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.698884 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.743316 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.789412 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.824424 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 09:08:45 crc kubenswrapper[4778]: I0318 09:08:45.931762 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.020474 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.156391 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.162790 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.211501 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.264750 4778 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.283381 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.298499 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.439007 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.459839 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.461460 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.465933 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.479287 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.699139 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.705144 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.793408 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 
09:08:46.828103 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.862651 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.874606 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 09:08:46 crc kubenswrapper[4778]: I0318 09:08:46.902558 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.043586 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.046510 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.105273 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.134693 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.177447 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.177881 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.189374 4778 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.227697 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.250662 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.259563 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.281009 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.327756 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.399997 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.534747 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.546055 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.709808 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.759771 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.860046 4778 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.881659 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.904263 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 18 09:08:47 crc kubenswrapper[4778]: I0318 09:08:47.994278 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.067486 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.211231 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.242045 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.245759 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.280152 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.331830 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.400347 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 
09:08:48.473396 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.476892 4778 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.477257 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://c87368359c8add64535225d3bbfde067e9b4f8b557eb6da52467546b92f78032" gracePeriod=5 Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.608627 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.660090 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.674384 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.682685 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.709180 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.820537 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.841143 4778 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 18 09:08:48 crc kubenswrapper[4778]: I0318 09:08:48.961505 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 18 09:08:49 crc kubenswrapper[4778]: I0318 09:08:49.041244 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 18 09:08:49 crc kubenswrapper[4778]: I0318 09:08:49.134645 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 18 09:08:49 crc kubenswrapper[4778]: I0318 09:08:49.248876 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 09:08:49 crc kubenswrapper[4778]: I0318 09:08:49.275903 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 18 09:08:49 crc kubenswrapper[4778]: I0318 09:08:49.420794 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 18 09:08:49 crc kubenswrapper[4778]: I0318 09:08:49.430000 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 18 09:08:49 crc kubenswrapper[4778]: I0318 09:08:49.549483 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 09:08:49 crc kubenswrapper[4778]: I0318 09:08:49.622915 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 18 09:08:49 crc kubenswrapper[4778]: I0318 09:08:49.748638 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 18 09:08:49 
crc kubenswrapper[4778]: I0318 09:08:49.832859 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 18 09:08:49 crc kubenswrapper[4778]: I0318 09:08:49.873416 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 18 09:08:49 crc kubenswrapper[4778]: I0318 09:08:49.946551 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 09:08:50 crc kubenswrapper[4778]: I0318 09:08:50.058553 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 09:08:50 crc kubenswrapper[4778]: I0318 09:08:50.137153 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 18 09:08:50 crc kubenswrapper[4778]: I0318 09:08:50.206160 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 18 09:08:50 crc kubenswrapper[4778]: I0318 09:08:50.380055 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 18 09:08:50 crc kubenswrapper[4778]: I0318 09:08:50.630812 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 09:08:50 crc kubenswrapper[4778]: I0318 09:08:50.732746 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 18 09:08:50 crc kubenswrapper[4778]: I0318 09:08:50.858347 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 09:08:50 crc kubenswrapper[4778]: I0318 09:08:50.882971 4778 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-root-ca.crt" Mar 18 09:08:50 crc kubenswrapper[4778]: I0318 09:08:50.948608 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 09:08:51 crc kubenswrapper[4778]: I0318 09:08:51.056326 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 18 09:08:51 crc kubenswrapper[4778]: I0318 09:08:51.063443 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 09:08:51 crc kubenswrapper[4778]: I0318 09:08:51.124734 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 09:08:51 crc kubenswrapper[4778]: I0318 09:08:51.207148 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 09:08:51 crc kubenswrapper[4778]: I0318 09:08:51.236345 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 18 09:08:51 crc kubenswrapper[4778]: I0318 09:08:51.259639 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 18 09:08:51 crc kubenswrapper[4778]: I0318 09:08:51.351941 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 18 09:08:51 crc kubenswrapper[4778]: I0318 09:08:51.393275 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 18 09:08:51 crc kubenswrapper[4778]: I0318 09:08:51.742828 4778 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 18 09:08:51 crc kubenswrapper[4778]: I0318 09:08:51.943975 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 09:08:52 crc kubenswrapper[4778]: I0318 09:08:52.547402 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 18 09:08:52 crc kubenswrapper[4778]: I0318 09:08:52.636500 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 18 09:08:53 crc kubenswrapper[4778]: I0318 09:08:53.974944 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 09:08:53 crc kubenswrapper[4778]: I0318 09:08:53.975351 4778 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="c87368359c8add64535225d3bbfde067e9b4f8b557eb6da52467546b92f78032" exitCode=137 Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.051423 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.051536 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.158692 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.195462 4778 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.207354 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.207392 4778 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="8c190e84-de5c-4ee5-9016-8b1ef240b359" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.211220 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.211273 4778 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="8c190e84-de5c-4ee5-9016-8b1ef240b359" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.212893 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.213328 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.213426 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.213471 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.213476 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.213574 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.213617 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.213735 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.214492 4778 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.214529 4778 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.214547 4778 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.215237 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.227179 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). 
InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.315105 4778 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.315144 4778 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.984406 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.984845 4778 scope.go:117] "RemoveContainer" containerID="c87368359c8add64535225d3bbfde067e9b4f8b557eb6da52467546b92f78032" Mar 18 09:08:54 crc kubenswrapper[4778]: I0318 09:08:54.984993 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 09:08:56 crc kubenswrapper[4778]: I0318 09:08:56.198007 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 18 09:09:07 crc kubenswrapper[4778]: I0318 09:09:07.062876 4778 generic.go:334] "Generic (PLEG): container finished" podID="f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" containerID="90afc0123cf376859383c4f842d0587910af2413023c90f78391ff8bd752a9d6" exitCode=0 Mar 18 09:09:07 crc kubenswrapper[4778]: I0318 09:09:07.062983 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" event={"ID":"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a","Type":"ContainerDied","Data":"90afc0123cf376859383c4f842d0587910af2413023c90f78391ff8bd752a9d6"} Mar 18 09:09:07 crc kubenswrapper[4778]: I0318 09:09:07.064405 4778 scope.go:117] "RemoveContainer" containerID="90afc0123cf376859383c4f842d0587910af2413023c90f78391ff8bd752a9d6" Mar 18 09:09:08 crc kubenswrapper[4778]: I0318 09:09:08.070935 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" event={"ID":"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a","Type":"ContainerStarted","Data":"f259658053fd1d484054ae2d51122a4e3408bb84a6c07568bf45f82cee5346ab"} Mar 18 09:09:08 crc kubenswrapper[4778]: I0318 09:09:08.071527 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:09:08 crc kubenswrapper[4778]: I0318 09:09:08.073874 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.141046 4778 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29563750-lv4gn"] Mar 18 09:10:00 crc kubenswrapper[4778]: E0318 09:10:00.141791 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93723e0d-2243-4390-a667-8a080325205f" containerName="installer" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.141802 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="93723e0d-2243-4390-a667-8a080325205f" containerName="installer" Mar 18 09:10:00 crc kubenswrapper[4778]: E0318 09:10:00.141814 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" containerName="oc" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.141819 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" containerName="oc" Mar 18 09:10:00 crc kubenswrapper[4778]: E0318 09:10:00.141827 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.141832 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.142009 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" containerName="oc" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.142020 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="93723e0d-2243-4390-a667-8a080325205f" containerName="installer" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.142028 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.142516 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563750-lv4gn" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.144727 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.147117 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.147229 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.147403 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.149418 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.152266 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563750-lv4gn"] Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.252065 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4btw\" (UniqueName: \"kubernetes.io/projected/105b6b5d-09f6-48c8-862e-c17526c6d6c7-kube-api-access-v4btw\") pod \"auto-csr-approver-29563750-lv4gn\" (UID: \"105b6b5d-09f6-48c8-862e-c17526c6d6c7\") " pod="openshift-infra/auto-csr-approver-29563750-lv4gn" Mar 18 09:10:00 crc 
kubenswrapper[4778]: I0318 09:10:00.353829 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4btw\" (UniqueName: \"kubernetes.io/projected/105b6b5d-09f6-48c8-862e-c17526c6d6c7-kube-api-access-v4btw\") pod \"auto-csr-approver-29563750-lv4gn\" (UID: \"105b6b5d-09f6-48c8-862e-c17526c6d6c7\") " pod="openshift-infra/auto-csr-approver-29563750-lv4gn" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.385852 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4btw\" (UniqueName: \"kubernetes.io/projected/105b6b5d-09f6-48c8-862e-c17526c6d6c7-kube-api-access-v4btw\") pod \"auto-csr-approver-29563750-lv4gn\" (UID: \"105b6b5d-09f6-48c8-862e-c17526c6d6c7\") " pod="openshift-infra/auto-csr-approver-29563750-lv4gn" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.465493 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563750-lv4gn" Mar 18 09:10:00 crc kubenswrapper[4778]: I0318 09:10:00.720756 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563750-lv4gn"] Mar 18 09:10:01 crc kubenswrapper[4778]: I0318 09:10:01.446901 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563750-lv4gn" event={"ID":"105b6b5d-09f6-48c8-862e-c17526c6d6c7","Type":"ContainerStarted","Data":"9e7bc36695299640fb2dfac7b49d965a4df1526527c348b335d6ad804b13fb9c"} Mar 18 09:10:02 crc kubenswrapper[4778]: I0318 09:10:02.457356 4778 generic.go:334] "Generic (PLEG): container finished" podID="105b6b5d-09f6-48c8-862e-c17526c6d6c7" containerID="c5bd546fb47bde264ad4459aced4ba49381ccd5bb127c64ac227483b8bb621c0" exitCode=0 Mar 18 09:10:02 crc kubenswrapper[4778]: I0318 09:10:02.457662 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563750-lv4gn" 
event={"ID":"105b6b5d-09f6-48c8-862e-c17526c6d6c7","Type":"ContainerDied","Data":"c5bd546fb47bde264ad4459aced4ba49381ccd5bb127c64ac227483b8bb621c0"} Mar 18 09:10:03 crc kubenswrapper[4778]: I0318 09:10:03.818175 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563750-lv4gn" Mar 18 09:10:03 crc kubenswrapper[4778]: I0318 09:10:03.911185 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4btw\" (UniqueName: \"kubernetes.io/projected/105b6b5d-09f6-48c8-862e-c17526c6d6c7-kube-api-access-v4btw\") pod \"105b6b5d-09f6-48c8-862e-c17526c6d6c7\" (UID: \"105b6b5d-09f6-48c8-862e-c17526c6d6c7\") " Mar 18 09:10:03 crc kubenswrapper[4778]: I0318 09:10:03.917421 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/105b6b5d-09f6-48c8-862e-c17526c6d6c7-kube-api-access-v4btw" (OuterVolumeSpecName: "kube-api-access-v4btw") pod "105b6b5d-09f6-48c8-862e-c17526c6d6c7" (UID: "105b6b5d-09f6-48c8-862e-c17526c6d6c7"). InnerVolumeSpecName "kube-api-access-v4btw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:10:04 crc kubenswrapper[4778]: I0318 09:10:04.013047 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4btw\" (UniqueName: \"kubernetes.io/projected/105b6b5d-09f6-48c8-862e-c17526c6d6c7-kube-api-access-v4btw\") on node \"crc\" DevicePath \"\"" Mar 18 09:10:04 crc kubenswrapper[4778]: I0318 09:10:04.475594 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563750-lv4gn" event={"ID":"105b6b5d-09f6-48c8-862e-c17526c6d6c7","Type":"ContainerDied","Data":"9e7bc36695299640fb2dfac7b49d965a4df1526527c348b335d6ad804b13fb9c"} Mar 18 09:10:04 crc kubenswrapper[4778]: I0318 09:10:04.475658 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e7bc36695299640fb2dfac7b49d965a4df1526527c348b335d6ad804b13fb9c" Mar 18 09:10:04 crc kubenswrapper[4778]: I0318 09:10:04.475697 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563750-lv4gn" Mar 18 09:10:04 crc kubenswrapper[4778]: I0318 09:10:04.885306 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563744-btdt7"] Mar 18 09:10:04 crc kubenswrapper[4778]: I0318 09:10:04.890589 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563744-btdt7"] Mar 18 09:10:06 crc kubenswrapper[4778]: I0318 09:10:06.200349 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54961f10-93b0-433f-8a7d-b30d69178e9a" path="/var/lib/kubelet/pods/54961f10-93b0-433f-8a7d-b30d69178e9a/volumes" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.725421 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8ch6j"] Mar 18 09:10:21 crc kubenswrapper[4778]: E0318 09:10:21.726256 4778 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="105b6b5d-09f6-48c8-862e-c17526c6d6c7" containerName="oc" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.726275 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="105b6b5d-09f6-48c8-862e-c17526c6d6c7" containerName="oc" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.726411 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="105b6b5d-09f6-48c8-862e-c17526c6d6c7" containerName="oc" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.726880 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.758043 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8ch6j"] Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.852141 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg2qq\" (UniqueName: \"kubernetes.io/projected/bb3770db-238d-457b-ab2b-9fe59806cad4-kube-api-access-qg2qq\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.852264 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb3770db-238d-457b-ab2b-9fe59806cad4-trusted-ca\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.852315 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bb3770db-238d-457b-ab2b-9fe59806cad4-registry-tls\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: 
\"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.852343 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bb3770db-238d-457b-ab2b-9fe59806cad4-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.852414 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bb3770db-238d-457b-ab2b-9fe59806cad4-registry-certificates\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.852453 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb3770db-238d-457b-ab2b-9fe59806cad4-bound-sa-token\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.852494 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.852549 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bb3770db-238d-457b-ab2b-9fe59806cad4-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.892057 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.954540 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bb3770db-238d-457b-ab2b-9fe59806cad4-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.954635 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg2qq\" (UniqueName: \"kubernetes.io/projected/bb3770db-238d-457b-ab2b-9fe59806cad4-kube-api-access-qg2qq\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.954685 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb3770db-238d-457b-ab2b-9fe59806cad4-trusted-ca\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.954742 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bb3770db-238d-457b-ab2b-9fe59806cad4-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.954772 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bb3770db-238d-457b-ab2b-9fe59806cad4-registry-tls\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.954802 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bb3770db-238d-457b-ab2b-9fe59806cad4-registry-certificates\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.954855 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb3770db-238d-457b-ab2b-9fe59806cad4-bound-sa-token\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.956307 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bb3770db-238d-457b-ab2b-9fe59806cad4-registry-certificates\") pod 
\"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.956409 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bb3770db-238d-457b-ab2b-9fe59806cad4-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.957357 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb3770db-238d-457b-ab2b-9fe59806cad4-trusted-ca\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.964607 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bb3770db-238d-457b-ab2b-9fe59806cad4-registry-tls\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.969436 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bb3770db-238d-457b-ab2b-9fe59806cad4-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.972883 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bb3770db-238d-457b-ab2b-9fe59806cad4-bound-sa-token\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:21 crc kubenswrapper[4778]: I0318 09:10:21.986764 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg2qq\" (UniqueName: \"kubernetes.io/projected/bb3770db-238d-457b-ab2b-9fe59806cad4-kube-api-access-qg2qq\") pod \"image-registry-66df7c8f76-8ch6j\" (UID: \"bb3770db-238d-457b-ab2b-9fe59806cad4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:22 crc kubenswrapper[4778]: I0318 09:10:22.044125 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:22 crc kubenswrapper[4778]: I0318 09:10:22.338799 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8ch6j"] Mar 18 09:10:22 crc kubenswrapper[4778]: I0318 09:10:22.602353 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" event={"ID":"bb3770db-238d-457b-ab2b-9fe59806cad4","Type":"ContainerStarted","Data":"7228a1578022fcc6eb924c542ac5792e8e9613cb0ce1327231c26d91fbec5c6d"} Mar 18 09:10:22 crc kubenswrapper[4778]: I0318 09:10:22.602402 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" event={"ID":"bb3770db-238d-457b-ab2b-9fe59806cad4","Type":"ContainerStarted","Data":"06acf5e1c2ce5e33d1c9970d9b6d4eb76f806908244a902017dbb58971091fe9"} Mar 18 09:10:22 crc kubenswrapper[4778]: I0318 09:10:22.603952 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.419700 4778 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" podStartSLOduration=3.419670734 podStartE2EDuration="3.419670734s" podCreationTimestamp="2026-03-18 09:10:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:10:22.62098434 +0000 UTC m=+489.195729190" watchObservedRunningTime="2026-03-18 09:10:24.419670734 +0000 UTC m=+490.994415614" Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.424262 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qvn4w"] Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.425034 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qvn4w" podUID="ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" containerName="registry-server" containerID="cri-o://70cbb8df67b66047dc2936fa97099c75389be5914cb70774619b80a3c1ca3b41" gracePeriod=30 Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.451821 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tbbtb"] Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.452146 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tbbtb" podUID="b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" containerName="registry-server" containerID="cri-o://6d51e3ef30717618dbfd60d2700ddf309051ceaf03fe8bc0561397808fdc4760" gracePeriod=30 Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.459031 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2hr48"] Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.459441 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" 
podUID="f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" containerName="marketplace-operator" containerID="cri-o://f259658053fd1d484054ae2d51122a4e3408bb84a6c07568bf45f82cee5346ab" gracePeriod=30 Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.475618 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6qgm2"] Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.476659 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6qgm2" podUID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" containerName="registry-server" containerID="cri-o://35627094815d8aab35b2d89ade00f4db12d2982f44d18065d3e2b7a8cff620a7" gracePeriod=30 Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.484667 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6kvnk"] Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.485224 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6kvnk" podUID="938982a6-57b0-4870-abed-a98c42196ae6" containerName="registry-server" containerID="cri-o://079c4231aac10f3612c674a07d9729d7a0fd2d54be2669f250d2d9ca937b5439" gracePeriod=30 Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.490260 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jj774"] Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.491997 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jj774" Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.494812 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jj774"] Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.602336 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7sb4\" (UniqueName: \"kubernetes.io/projected/e037e8cd-1543-49a8-9389-4cc6f440c4b3-kube-api-access-x7sb4\") pod \"marketplace-operator-79b997595-jj774\" (UID: \"e037e8cd-1543-49a8-9389-4cc6f440c4b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jj774" Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.602435 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e037e8cd-1543-49a8-9389-4cc6f440c4b3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jj774\" (UID: \"e037e8cd-1543-49a8-9389-4cc6f440c4b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jj774" Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.603056 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e037e8cd-1543-49a8-9389-4cc6f440c4b3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jj774\" (UID: \"e037e8cd-1543-49a8-9389-4cc6f440c4b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jj774" Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.640327 4778 generic.go:334] "Generic (PLEG): container finished" podID="ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" containerID="70cbb8df67b66047dc2936fa97099c75389be5914cb70774619b80a3c1ca3b41" exitCode=0 Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.640398 4778 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-qvn4w" event={"ID":"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc","Type":"ContainerDied","Data":"70cbb8df67b66047dc2936fa97099c75389be5914cb70774619b80a3c1ca3b41"} Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.643664 4778 generic.go:334] "Generic (PLEG): container finished" podID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" containerID="35627094815d8aab35b2d89ade00f4db12d2982f44d18065d3e2b7a8cff620a7" exitCode=0 Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.643709 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qgm2" event={"ID":"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47","Type":"ContainerDied","Data":"35627094815d8aab35b2d89ade00f4db12d2982f44d18065d3e2b7a8cff620a7"} Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.645987 4778 generic.go:334] "Generic (PLEG): container finished" podID="f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" containerID="f259658053fd1d484054ae2d51122a4e3408bb84a6c07568bf45f82cee5346ab" exitCode=0 Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.646043 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" event={"ID":"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a","Type":"ContainerDied","Data":"f259658053fd1d484054ae2d51122a4e3408bb84a6c07568bf45f82cee5346ab"} Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.646076 4778 scope.go:117] "RemoveContainer" containerID="90afc0123cf376859383c4f842d0587910af2413023c90f78391ff8bd752a9d6" Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.648988 4778 generic.go:334] "Generic (PLEG): container finished" podID="938982a6-57b0-4870-abed-a98c42196ae6" containerID="079c4231aac10f3612c674a07d9729d7a0fd2d54be2669f250d2d9ca937b5439" exitCode=0 Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.649035 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kvnk" 
event={"ID":"938982a6-57b0-4870-abed-a98c42196ae6","Type":"ContainerDied","Data":"079c4231aac10f3612c674a07d9729d7a0fd2d54be2669f250d2d9ca937b5439"}
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.651013    4778 generic.go:334] "Generic (PLEG): container finished" podID="b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" containerID="6d51e3ef30717618dbfd60d2700ddf309051ceaf03fe8bc0561397808fdc4760" exitCode=0
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.651739    4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbbtb" event={"ID":"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa","Type":"ContainerDied","Data":"6d51e3ef30717618dbfd60d2700ddf309051ceaf03fe8bc0561397808fdc4760"}
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.704033    4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e037e8cd-1543-49a8-9389-4cc6f440c4b3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jj774\" (UID: \"e037e8cd-1543-49a8-9389-4cc6f440c4b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jj774"
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.704140    4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7sb4\" (UniqueName: \"kubernetes.io/projected/e037e8cd-1543-49a8-9389-4cc6f440c4b3-kube-api-access-x7sb4\") pod \"marketplace-operator-79b997595-jj774\" (UID: \"e037e8cd-1543-49a8-9389-4cc6f440c4b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jj774"
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.704193    4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e037e8cd-1543-49a8-9389-4cc6f440c4b3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jj774\" (UID: \"e037e8cd-1543-49a8-9389-4cc6f440c4b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jj774"
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.705539    4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e037e8cd-1543-49a8-9389-4cc6f440c4b3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jj774\" (UID: \"e037e8cd-1543-49a8-9389-4cc6f440c4b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jj774"
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.716949    4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e037e8cd-1543-49a8-9389-4cc6f440c4b3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jj774\" (UID: \"e037e8cd-1543-49a8-9389-4cc6f440c4b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jj774"
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.724079    4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7sb4\" (UniqueName: \"kubernetes.io/projected/e037e8cd-1543-49a8-9389-4cc6f440c4b3-kube-api-access-x7sb4\") pod \"marketplace-operator-79b997595-jj774\" (UID: \"e037e8cd-1543-49a8-9389-4cc6f440c4b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jj774"
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.928409    4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jj774"
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.938324    4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qvn4w"
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.942775    4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tbbtb"
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.946564    4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6qgm2"
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.952777    4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6kvnk"
Mar 18 09:10:24 crc kubenswrapper[4778]: I0318 09:10:24.954182    4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.108836    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-operator-metrics\") pod \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.108882    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-utilities\") pod \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\" (UID: \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.108925    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp6w4\" (UniqueName: \"kubernetes.io/projected/938982a6-57b0-4870-abed-a98c42196ae6-kube-api-access-fp6w4\") pod \"938982a6-57b0-4870-abed-a98c42196ae6\" (UID: \"938982a6-57b0-4870-abed-a98c42196ae6\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.108981    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf8tc\" (UniqueName: \"kubernetes.io/projected/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-kube-api-access-qf8tc\") pod \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\" (UID: \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.109009    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-catalog-content\") pod \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\" (UID: \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.109033    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-utilities\") pod \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\" (UID: \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.109056    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcvjk\" (UniqueName: \"kubernetes.io/projected/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-kube-api-access-xcvjk\") pod \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.109078    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-catalog-content\") pod \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\" (UID: \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.109124    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjz8f\" (UniqueName: \"kubernetes.io/projected/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-kube-api-access-vjz8f\") pod \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\" (UID: \"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.109144    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-trusted-ca\") pod \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\" (UID: \"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.109173    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/938982a6-57b0-4870-abed-a98c42196ae6-catalog-content\") pod \"938982a6-57b0-4870-abed-a98c42196ae6\" (UID: \"938982a6-57b0-4870-abed-a98c42196ae6\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.109213    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/938982a6-57b0-4870-abed-a98c42196ae6-utilities\") pod \"938982a6-57b0-4870-abed-a98c42196ae6\" (UID: \"938982a6-57b0-4870-abed-a98c42196ae6\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.109241    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-catalog-content\") pod \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\" (UID: \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.109261    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6c42\" (UniqueName: \"kubernetes.io/projected/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-kube-api-access-p6c42\") pod \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\" (UID: \"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.109468    4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-utilities\") pod \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\" (UID: \"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47\") "
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.113134    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-utilities" (OuterVolumeSpecName: "utilities") pod "57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" (UID: "57f9c6f6-c20e-4e28-aec4-f0104ddb2b47"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.116487    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-kube-api-access-p6c42" (OuterVolumeSpecName: "kube-api-access-p6c42") pod "b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" (UID: "b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa"). InnerVolumeSpecName "kube-api-access-p6c42". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.116760    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/938982a6-57b0-4870-abed-a98c42196ae6-utilities" (OuterVolumeSpecName: "utilities") pod "938982a6-57b0-4870-abed-a98c42196ae6" (UID: "938982a6-57b0-4870-abed-a98c42196ae6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.116825    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/938982a6-57b0-4870-abed-a98c42196ae6-kube-api-access-fp6w4" (OuterVolumeSpecName: "kube-api-access-fp6w4") pod "938982a6-57b0-4870-abed-a98c42196ae6" (UID: "938982a6-57b0-4870-abed-a98c42196ae6"). InnerVolumeSpecName "kube-api-access-fp6w4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.116897    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-utilities" (OuterVolumeSpecName: "utilities") pod "ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" (UID: "ed8eaf37-d7fe-43d1-8d20-fffdd71748cc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.117550    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" (UID: "f25fe9ee-95f7-4a7c-98f1-7dabbd43527a"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.117820    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-utilities" (OuterVolumeSpecName: "utilities") pod "b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" (UID: "b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.118158    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-kube-api-access-vjz8f" (OuterVolumeSpecName: "kube-api-access-vjz8f") pod "ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" (UID: "ed8eaf37-d7fe-43d1-8d20-fffdd71748cc"). InnerVolumeSpecName "kube-api-access-vjz8f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.118406    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-kube-api-access-xcvjk" (OuterVolumeSpecName: "kube-api-access-xcvjk") pod "f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" (UID: "f25fe9ee-95f7-4a7c-98f1-7dabbd43527a"). InnerVolumeSpecName "kube-api-access-xcvjk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.118648    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" (UID: "f25fe9ee-95f7-4a7c-98f1-7dabbd43527a"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.118849    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-kube-api-access-qf8tc" (OuterVolumeSpecName: "kube-api-access-qf8tc") pod "57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" (UID: "57f9c6f6-c20e-4e28-aec4-f0104ddb2b47"). InnerVolumeSpecName "kube-api-access-qf8tc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.173774    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" (UID: "57f9c6f6-c20e-4e28-aec4-f0104ddb2b47"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.194478    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" (UID: "ed8eaf37-d7fe-43d1-8d20-fffdd71748cc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.195390    4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jj774"]
Mar 18 09:10:25 crc kubenswrapper[4778]: W0318 09:10:25.200123    4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode037e8cd_1543_49a8_9389_4cc6f440c4b3.slice/crio-c0527933ca28d30c64d338c93a15706da67519748e40cfb8d668bfc5f52809de WatchSource:0}: Error finding container c0527933ca28d30c64d338c93a15706da67519748e40cfb8d668bfc5f52809de: Status 404 returned error can't find the container with id c0527933ca28d30c64d338c93a15706da67519748e40cfb8d668bfc5f52809de
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212076    4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp6w4\" (UniqueName: \"kubernetes.io/projected/938982a6-57b0-4870-abed-a98c42196ae6-kube-api-access-fp6w4\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212117    4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qf8tc\" (UniqueName: \"kubernetes.io/projected/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-kube-api-access-qf8tc\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212136    4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212152    4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212165    4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcvjk\" (UniqueName: \"kubernetes.io/projected/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-kube-api-access-xcvjk\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212175    4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjz8f\" (UniqueName: \"kubernetes.io/projected/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-kube-api-access-vjz8f\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212187    4778 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212217    4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/938982a6-57b0-4870-abed-a98c42196ae6-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212228    4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212238    4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6c42\" (UniqueName: \"kubernetes.io/projected/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-kube-api-access-p6c42\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212249    4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212263    4778 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.212275    4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.214317    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" (UID: "b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.293523    4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/938982a6-57b0-4870-abed-a98c42196ae6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "938982a6-57b0-4870-abed-a98c42196ae6" (UID: "938982a6-57b0-4870-abed-a98c42196ae6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.313633    4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/938982a6-57b0-4870-abed-a98c42196ae6-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.313677    4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.658881    4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qvn4w" event={"ID":"ed8eaf37-d7fe-43d1-8d20-fffdd71748cc","Type":"ContainerDied","Data":"5c2b4b3fdbc29641b0fd4b628d894c39c32fe66c2451fed6064a04a8d6f0eddd"}
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.660575    4778 scope.go:117] "RemoveContainer" containerID="70cbb8df67b66047dc2936fa97099c75389be5914cb70774619b80a3c1ca3b41"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.658906    4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qvn4w"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.660609    4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qgm2" event={"ID":"57f9c6f6-c20e-4e28-aec4-f0104ddb2b47","Type":"ContainerDied","Data":"a8ae2f2fdacacfb271851ed06f03f49aedc9a4cd93513577d22ad1180d7345ef"}
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.660657    4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6qgm2"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.676584    4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.676907    4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2hr48" event={"ID":"f25fe9ee-95f7-4a7c-98f1-7dabbd43527a","Type":"ContainerDied","Data":"44fcaa7d9066c5bc322cc3c475c2c95ffa382825c1c11ff0bdf59ba686b15693"}
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.682648    4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jj774" event={"ID":"e037e8cd-1543-49a8-9389-4cc6f440c4b3","Type":"ContainerStarted","Data":"b29fe1dcf245e93feaf5a46740fb296a8a076e359c3e022be5d2b82ef6b2f575"}
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.682715    4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jj774" event={"ID":"e037e8cd-1543-49a8-9389-4cc6f440c4b3","Type":"ContainerStarted","Data":"c0527933ca28d30c64d338c93a15706da67519748e40cfb8d668bfc5f52809de"}
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.685735    4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kvnk" event={"ID":"938982a6-57b0-4870-abed-a98c42196ae6","Type":"ContainerDied","Data":"40f7240695ce3bda0e1e4c2a6151b9a381bd8173f92f88c530036fbcfc0a002f"}
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.685848    4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6kvnk"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.691425    4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbbtb" event={"ID":"b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa","Type":"ContainerDied","Data":"a612db4db085f0867c6a8718b7e590183f11921029875dedeb88f07468c98457"}
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.691579    4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tbbtb"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.710100    4778 scope.go:117] "RemoveContainer" containerID="206c2187bf0a136c6ff49b69bb1bb6cc918dc7e2a3d9bd6a2d8bac6ce3a51e5f"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.739324    4778 scope.go:117] "RemoveContainer" containerID="9e9b1baa8deb4596f595ec2a830346f2addf7d69c909efa6643ba0c90cdd01c7"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.747861    4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qvn4w"]
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.756330    4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qvn4w"]
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.761446    4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-jj774" podStartSLOduration=1.76141272 podStartE2EDuration="1.76141272s" podCreationTimestamp="2026-03-18 09:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:10:25.745992171 +0000 UTC m=+492.320737001" watchObservedRunningTime="2026-03-18 09:10:25.76141272 +0000 UTC m=+492.336157560"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.765555    4778 scope.go:117] "RemoveContainer" containerID="35627094815d8aab35b2d89ade00f4db12d2982f44d18065d3e2b7a8cff620a7"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.769936    4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6qgm2"]
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.778843    4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6qgm2"]
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.793055    4778 scope.go:117] "RemoveContainer" containerID="f49cf5ea04db3604b7012853be48f57eabfbbf2919ff145d883ab1c07e04a460"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.814628    4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6kvnk"]
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.817699    4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6kvnk"]
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.818456    4778 scope.go:117] "RemoveContainer" containerID="113dc27ffd2ebd355aaf8e22c8a148444f799a56c796af33fbc9fe643673da94"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.827034    4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tbbtb"]
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.833056    4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tbbtb"]
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.837384    4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2hr48"]
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.841011    4778 scope.go:117] "RemoveContainer" containerID="f259658053fd1d484054ae2d51122a4e3408bb84a6c07568bf45f82cee5346ab"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.844629    4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2hr48"]
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.861161    4778 scope.go:117] "RemoveContainer" containerID="079c4231aac10f3612c674a07d9729d7a0fd2d54be2669f250d2d9ca937b5439"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.877922    4778 scope.go:117] "RemoveContainer" containerID="d3a1ccac938944a3f089f8c92f9aebbf40c8042f08e26dece0839acbb161f095"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.894696    4778 scope.go:117] "RemoveContainer" containerID="8977456d128ab832e4d2b65a1ebbe275173e48c92b3849c579d8a9cc853d0ce8"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.909801    4778 scope.go:117] "RemoveContainer" containerID="6d51e3ef30717618dbfd60d2700ddf309051ceaf03fe8bc0561397808fdc4760"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.922002    4778 scope.go:117] "RemoveContainer" containerID="080f512015bd3cc96010f06c24c5ff7c172a932d2f9c91b8b4e05d0e6fdb8776"
Mar 18 09:10:25 crc kubenswrapper[4778]: I0318 09:10:25.939611    4778 scope.go:117] "RemoveContainer" containerID="e143a776ed51bb64025b24b3e1cc128e2a2ca67730b9a34f438ed6857f8be065"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.200653    4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" path="/var/lib/kubelet/pods/57f9c6f6-c20e-4e28-aec4-f0104ddb2b47/volumes"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.201910    4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="938982a6-57b0-4870-abed-a98c42196ae6" path="/var/lib/kubelet/pods/938982a6-57b0-4870-abed-a98c42196ae6/volumes"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.203080    4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" path="/var/lib/kubelet/pods/b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa/volumes"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.205056    4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" path="/var/lib/kubelet/pods/ed8eaf37-d7fe-43d1-8d20-fffdd71748cc/volumes"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.206307    4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" path="/var/lib/kubelet/pods/f25fe9ee-95f7-4a7c-98f1-7dabbd43527a/volumes"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.633695    4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xs85d"]
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635053    4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" containerName="registry-server"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635081    4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" containerName="registry-server"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635095    4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" containerName="extract-utilities"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635103    4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" containerName="extract-utilities"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635111    4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" containerName="extract-utilities"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635120    4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" containerName="extract-utilities"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635128    4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" containerName="registry-server"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635134    4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" containerName="registry-server"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635142    4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" containerName="extract-content"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635148    4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" containerName="extract-content"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635156    4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="938982a6-57b0-4870-abed-a98c42196ae6" containerName="extract-content"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635162    4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="938982a6-57b0-4870-abed-a98c42196ae6" containerName="extract-content"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635169    4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" containerName="extract-content"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635176    4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" containerName="extract-content"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635186    4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" containerName="marketplace-operator"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635209    4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" containerName="marketplace-operator"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635220    4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" containerName="marketplace-operator"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635227    4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" containerName="marketplace-operator"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635236    4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" containerName="registry-server"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635243    4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" containerName="registry-server"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635254    4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" containerName="extract-content"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635260    4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" containerName="extract-content"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635272    4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="938982a6-57b0-4870-abed-a98c42196ae6" containerName="extract-utilities"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635278    4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="938982a6-57b0-4870-abed-a98c42196ae6" containerName="extract-utilities"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635287    4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="938982a6-57b0-4870-abed-a98c42196ae6" containerName="registry-server"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635293    4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="938982a6-57b0-4870-abed-a98c42196ae6" containerName="registry-server"
Mar 18 09:10:26 crc kubenswrapper[4778]: E0318 09:10:26.635302    4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" containerName="extract-utilities"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635310    4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" containerName="extract-utilities"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635417    4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4cfa2f4-0114-46ae-a89f-3b2eac3ea0fa" containerName="registry-server"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635426    4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed8eaf37-d7fe-43d1-8d20-fffdd71748cc" containerName="registry-server"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635436    4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" containerName="marketplace-operator"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635445    4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="57f9c6f6-c20e-4e28-aec4-f0104ddb2b47" containerName="registry-server"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635453    4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="938982a6-57b0-4870-abed-a98c42196ae6" containerName="registry-server"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.635634    4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f25fe9ee-95f7-4a7c-98f1-7dabbd43527a" containerName="marketplace-operator"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.636637    4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xs85d"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.638736    4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.649671    4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xs85d"]
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.705112    4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jj774"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.708011    4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-jj774"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.733849    4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn6s8\" (UniqueName: \"kubernetes.io/projected/0eaac9b5-67d6-4187-b118-0add20190689-kube-api-access-vn6s8\") pod \"redhat-marketplace-xs85d\" (UID: \"0eaac9b5-67d6-4187-b118-0add20190689\") " pod="openshift-marketplace/redhat-marketplace-xs85d"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.733995    4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eaac9b5-67d6-4187-b118-0add20190689-catalog-content\") pod \"redhat-marketplace-xs85d\" (UID: \"0eaac9b5-67d6-4187-b118-0add20190689\") " pod="openshift-marketplace/redhat-marketplace-xs85d"
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.734088    4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eaac9b5-67d6-4187-b118-0add20190689-utilities\") pod \"redhat-marketplace-xs85d\" (UID:
\"0eaac9b5-67d6-4187-b118-0add20190689\") " pod="openshift-marketplace/redhat-marketplace-xs85d" Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.830870 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9b8p9"] Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.832174 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.835073 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eaac9b5-67d6-4187-b118-0add20190689-utilities\") pod \"redhat-marketplace-xs85d\" (UID: \"0eaac9b5-67d6-4187-b118-0add20190689\") " pod="openshift-marketplace/redhat-marketplace-xs85d" Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.835144 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn6s8\" (UniqueName: \"kubernetes.io/projected/0eaac9b5-67d6-4187-b118-0add20190689-kube-api-access-vn6s8\") pod \"redhat-marketplace-xs85d\" (UID: \"0eaac9b5-67d6-4187-b118-0add20190689\") " pod="openshift-marketplace/redhat-marketplace-xs85d" Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.835228 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eaac9b5-67d6-4187-b118-0add20190689-catalog-content\") pod \"redhat-marketplace-xs85d\" (UID: \"0eaac9b5-67d6-4187-b118-0add20190689\") " pod="openshift-marketplace/redhat-marketplace-xs85d" Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.835370 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.835963 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0eaac9b5-67d6-4187-b118-0add20190689-utilities\") pod \"redhat-marketplace-xs85d\" (UID: \"0eaac9b5-67d6-4187-b118-0add20190689\") " pod="openshift-marketplace/redhat-marketplace-xs85d" Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.836219 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eaac9b5-67d6-4187-b118-0add20190689-catalog-content\") pod \"redhat-marketplace-xs85d\" (UID: \"0eaac9b5-67d6-4187-b118-0add20190689\") " pod="openshift-marketplace/redhat-marketplace-xs85d" Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.848058 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9b8p9"] Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.879724 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn6s8\" (UniqueName: \"kubernetes.io/projected/0eaac9b5-67d6-4187-b118-0add20190689-kube-api-access-vn6s8\") pod \"redhat-marketplace-xs85d\" (UID: \"0eaac9b5-67d6-4187-b118-0add20190689\") " pod="openshift-marketplace/redhat-marketplace-xs85d" Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.936836 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv59s\" (UniqueName: \"kubernetes.io/projected/f9a557a7-2d98-4e56-8119-acfd64357871-kube-api-access-kv59s\") pod \"redhat-operators-9b8p9\" (UID: \"f9a557a7-2d98-4e56-8119-acfd64357871\") " pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.936912 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9a557a7-2d98-4e56-8119-acfd64357871-catalog-content\") pod \"redhat-operators-9b8p9\" (UID: \"f9a557a7-2d98-4e56-8119-acfd64357871\") " pod="openshift-marketplace/redhat-operators-9b8p9" 
Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.936936 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9a557a7-2d98-4e56-8119-acfd64357871-utilities\") pod \"redhat-operators-9b8p9\" (UID: \"f9a557a7-2d98-4e56-8119-acfd64357871\") " pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:26 crc kubenswrapper[4778]: I0318 09:10:26.969688 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xs85d" Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.038104 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9a557a7-2d98-4e56-8119-acfd64357871-catalog-content\") pod \"redhat-operators-9b8p9\" (UID: \"f9a557a7-2d98-4e56-8119-acfd64357871\") " pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.038163 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9a557a7-2d98-4e56-8119-acfd64357871-utilities\") pod \"redhat-operators-9b8p9\" (UID: \"f9a557a7-2d98-4e56-8119-acfd64357871\") " pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.038250 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv59s\" (UniqueName: \"kubernetes.io/projected/f9a557a7-2d98-4e56-8119-acfd64357871-kube-api-access-kv59s\") pod \"redhat-operators-9b8p9\" (UID: \"f9a557a7-2d98-4e56-8119-acfd64357871\") " pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.039043 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f9a557a7-2d98-4e56-8119-acfd64357871-utilities\") pod \"redhat-operators-9b8p9\" (UID: \"f9a557a7-2d98-4e56-8119-acfd64357871\") " pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.040669 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9a557a7-2d98-4e56-8119-acfd64357871-catalog-content\") pod \"redhat-operators-9b8p9\" (UID: \"f9a557a7-2d98-4e56-8119-acfd64357871\") " pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.070246 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv59s\" (UniqueName: \"kubernetes.io/projected/f9a557a7-2d98-4e56-8119-acfd64357871-kube-api-access-kv59s\") pod \"redhat-operators-9b8p9\" (UID: \"f9a557a7-2d98-4e56-8119-acfd64357871\") " pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.152097 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.201870 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xs85d"] Mar 18 09:10:27 crc kubenswrapper[4778]: W0318 09:10:27.212050 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0eaac9b5_67d6_4187_b118_0add20190689.slice/crio-7839735294d147cf4ba43aa4d7887579df743a8b71976c9abcde9921692c3035 WatchSource:0}: Error finding container 7839735294d147cf4ba43aa4d7887579df743a8b71976c9abcde9921692c3035: Status 404 returned error can't find the container with id 7839735294d147cf4ba43aa4d7887579df743a8b71976c9abcde9921692c3035 Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.383231 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9b8p9"] Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.724715 4778 generic.go:334] "Generic (PLEG): container finished" podID="0eaac9b5-67d6-4187-b118-0add20190689" containerID="f388982a6381655f0b5c5f2a3da5b6d1b9bef2d39ef797f57925c20b9817472c" exitCode=0 Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.725516 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xs85d" event={"ID":"0eaac9b5-67d6-4187-b118-0add20190689","Type":"ContainerDied","Data":"f388982a6381655f0b5c5f2a3da5b6d1b9bef2d39ef797f57925c20b9817472c"} Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.725665 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xs85d" event={"ID":"0eaac9b5-67d6-4187-b118-0add20190689","Type":"ContainerStarted","Data":"7839735294d147cf4ba43aa4d7887579df743a8b71976c9abcde9921692c3035"} Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.730944 4778 generic.go:334] "Generic (PLEG): container finished" 
podID="f9a557a7-2d98-4e56-8119-acfd64357871" containerID="e29ef92821e2044968655d435177477cc2b9361ff5419d7bccfdd357ea22baa2" exitCode=0 Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.731899 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9b8p9" event={"ID":"f9a557a7-2d98-4e56-8119-acfd64357871","Type":"ContainerDied","Data":"e29ef92821e2044968655d435177477cc2b9361ff5419d7bccfdd357ea22baa2"} Mar 18 09:10:27 crc kubenswrapper[4778]: I0318 09:10:27.731934 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9b8p9" event={"ID":"f9a557a7-2d98-4e56-8119-acfd64357871","Type":"ContainerStarted","Data":"cb95bef02ed3ea7cd6da9b563de2430ccb225bf3ee2ffbe0ba35413f02b45525"} Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.055161 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-csm2z"] Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.057624 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.069498 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.073456 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-csm2z"] Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.173034 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83efc97a-1a91-4bc8-90bf-a78bc8ee90e3-catalog-content\") pod \"certified-operators-csm2z\" (UID: \"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3\") " pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.173093 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83efc97a-1a91-4bc8-90bf-a78bc8ee90e3-utilities\") pod \"certified-operators-csm2z\" (UID: \"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3\") " pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.173583 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w6qv\" (UniqueName: \"kubernetes.io/projected/83efc97a-1a91-4bc8-90bf-a78bc8ee90e3-kube-api-access-9w6qv\") pod \"certified-operators-csm2z\" (UID: \"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3\") " pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.241322 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ktcxn"] Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.247099 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.254828 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.263713 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ktcxn"] Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.275849 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83efc97a-1a91-4bc8-90bf-a78bc8ee90e3-catalog-content\") pod \"certified-operators-csm2z\" (UID: \"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3\") " pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.275908 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83efc97a-1a91-4bc8-90bf-a78bc8ee90e3-utilities\") pod \"certified-operators-csm2z\" (UID: \"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3\") " pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.275979 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w6qv\" (UniqueName: \"kubernetes.io/projected/83efc97a-1a91-4bc8-90bf-a78bc8ee90e3-kube-api-access-9w6qv\") pod \"certified-operators-csm2z\" (UID: \"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3\") " pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.276693 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83efc97a-1a91-4bc8-90bf-a78bc8ee90e3-catalog-content\") pod \"certified-operators-csm2z\" (UID: \"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3\") " 
pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.276737 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83efc97a-1a91-4bc8-90bf-a78bc8ee90e3-utilities\") pod \"certified-operators-csm2z\" (UID: \"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3\") " pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.310870 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w6qv\" (UniqueName: \"kubernetes.io/projected/83efc97a-1a91-4bc8-90bf-a78bc8ee90e3-kube-api-access-9w6qv\") pod \"certified-operators-csm2z\" (UID: \"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3\") " pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.377995 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fee87709-f8ed-4eb4-829e-1fdb6534bb35-catalog-content\") pod \"community-operators-ktcxn\" (UID: \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\") " pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.378048 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s9q2\" (UniqueName: \"kubernetes.io/projected/fee87709-f8ed-4eb4-829e-1fdb6534bb35-kube-api-access-5s9q2\") pod \"community-operators-ktcxn\" (UID: \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\") " pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.378128 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fee87709-f8ed-4eb4-829e-1fdb6534bb35-utilities\") pod \"community-operators-ktcxn\" (UID: 
\"fee87709-f8ed-4eb4-829e-1fdb6534bb35\") " pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.404290 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.479148 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fee87709-f8ed-4eb4-829e-1fdb6534bb35-utilities\") pod \"community-operators-ktcxn\" (UID: \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\") " pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.479243 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fee87709-f8ed-4eb4-829e-1fdb6534bb35-catalog-content\") pod \"community-operators-ktcxn\" (UID: \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\") " pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.479264 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s9q2\" (UniqueName: \"kubernetes.io/projected/fee87709-f8ed-4eb4-829e-1fdb6534bb35-kube-api-access-5s9q2\") pod \"community-operators-ktcxn\" (UID: \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\") " pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.479685 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fee87709-f8ed-4eb4-829e-1fdb6534bb35-utilities\") pod \"community-operators-ktcxn\" (UID: \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\") " pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.479717 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fee87709-f8ed-4eb4-829e-1fdb6534bb35-catalog-content\") pod \"community-operators-ktcxn\" (UID: \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\") " pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.514474 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s9q2\" (UniqueName: \"kubernetes.io/projected/fee87709-f8ed-4eb4-829e-1fdb6534bb35-kube-api-access-5s9q2\") pod \"community-operators-ktcxn\" (UID: \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\") " pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.574165 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.649437 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-csm2z"] Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.745365 4778 generic.go:334] "Generic (PLEG): container finished" podID="0eaac9b5-67d6-4187-b118-0add20190689" containerID="954929f7d2d9273de039af00dbe8ecaba4e10ae2e0ce60b95735a16f0a6ce1d7" exitCode=0 Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.745439 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xs85d" event={"ID":"0eaac9b5-67d6-4187-b118-0add20190689","Type":"ContainerDied","Data":"954929f7d2d9273de039af00dbe8ecaba4e10ae2e0ce60b95735a16f0a6ce1d7"} Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.753086 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9b8p9" event={"ID":"f9a557a7-2d98-4e56-8119-acfd64357871","Type":"ContainerStarted","Data":"b507290e4000ed980fb3ccd54bd103fde51b3f601acdd8010eaa6ec4edea7112"} Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.757715 4778 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-csm2z" event={"ID":"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3","Type":"ContainerStarted","Data":"1eb15b9c7e341bdbdafffdf98bfb8563afc6aafb0c6d1d7be9705c2b12903958"} Mar 18 09:10:29 crc kubenswrapper[4778]: I0318 09:10:29.760792 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ktcxn"] Mar 18 09:10:29 crc kubenswrapper[4778]: W0318 09:10:29.773093 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfee87709_f8ed_4eb4_829e_1fdb6534bb35.slice/crio-1a86626e5d4576d55c9f62a59074b0761782b73c69b053e7829507e68652f471 WatchSource:0}: Error finding container 1a86626e5d4576d55c9f62a59074b0761782b73c69b053e7829507e68652f471: Status 404 returned error can't find the container with id 1a86626e5d4576d55c9f62a59074b0761782b73c69b053e7829507e68652f471 Mar 18 09:10:30 crc kubenswrapper[4778]: I0318 09:10:30.147518 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:10:30 crc kubenswrapper[4778]: I0318 09:10:30.149312 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:10:30 crc kubenswrapper[4778]: I0318 09:10:30.766759 4778 generic.go:334] "Generic (PLEG): container finished" podID="fee87709-f8ed-4eb4-829e-1fdb6534bb35" containerID="6ecbe80389c09da7c5dfaf24f572df1adb64cba289f74a3e8339845f8cebe749" exitCode=0 Mar 18 09:10:30 crc 
kubenswrapper[4778]: I0318 09:10:30.766842 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktcxn" event={"ID":"fee87709-f8ed-4eb4-829e-1fdb6534bb35","Type":"ContainerDied","Data":"6ecbe80389c09da7c5dfaf24f572df1adb64cba289f74a3e8339845f8cebe749"} Mar 18 09:10:30 crc kubenswrapper[4778]: I0318 09:10:30.766879 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktcxn" event={"ID":"fee87709-f8ed-4eb4-829e-1fdb6534bb35","Type":"ContainerStarted","Data":"1a86626e5d4576d55c9f62a59074b0761782b73c69b053e7829507e68652f471"} Mar 18 09:10:30 crc kubenswrapper[4778]: I0318 09:10:30.770463 4778 generic.go:334] "Generic (PLEG): container finished" podID="83efc97a-1a91-4bc8-90bf-a78bc8ee90e3" containerID="001f9df6918d35a48d43c771d16934b8d15d4706ddeb0d1de385aab2a15e503e" exitCode=0 Mar 18 09:10:30 crc kubenswrapper[4778]: I0318 09:10:30.770543 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-csm2z" event={"ID":"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3","Type":"ContainerDied","Data":"001f9df6918d35a48d43c771d16934b8d15d4706ddeb0d1de385aab2a15e503e"} Mar 18 09:10:30 crc kubenswrapper[4778]: I0318 09:10:30.773233 4778 generic.go:334] "Generic (PLEG): container finished" podID="f9a557a7-2d98-4e56-8119-acfd64357871" containerID="b507290e4000ed980fb3ccd54bd103fde51b3f601acdd8010eaa6ec4edea7112" exitCode=0 Mar 18 09:10:30 crc kubenswrapper[4778]: I0318 09:10:30.773291 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9b8p9" event={"ID":"f9a557a7-2d98-4e56-8119-acfd64357871","Type":"ContainerDied","Data":"b507290e4000ed980fb3ccd54bd103fde51b3f601acdd8010eaa6ec4edea7112"} Mar 18 09:10:30 crc kubenswrapper[4778]: I0318 09:10:30.778536 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xs85d" 
event={"ID":"0eaac9b5-67d6-4187-b118-0add20190689","Type":"ContainerStarted","Data":"951eb8ce92ce0a97415382e79f2feb2fe4e10fb2df66c4882bfbffa91d09e171"} Mar 18 09:10:31 crc kubenswrapper[4778]: I0318 09:10:31.789273 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9b8p9" event={"ID":"f9a557a7-2d98-4e56-8119-acfd64357871","Type":"ContainerStarted","Data":"98686129a658886ec0480e792b75f96d368f1fb9b7723c5a0396f19c73eb8f4f"} Mar 18 09:10:31 crc kubenswrapper[4778]: I0318 09:10:31.810424 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xs85d" podStartSLOduration=3.374203237 podStartE2EDuration="5.810395449s" podCreationTimestamp="2026-03-18 09:10:26 +0000 UTC" firstStartedPulling="2026-03-18 09:10:27.726737726 +0000 UTC m=+494.301482566" lastFinishedPulling="2026-03-18 09:10:30.162929938 +0000 UTC m=+496.737674778" observedRunningTime="2026-03-18 09:10:30.86581377 +0000 UTC m=+497.440558640" watchObservedRunningTime="2026-03-18 09:10:31.810395449 +0000 UTC m=+498.385140299" Mar 18 09:10:31 crc kubenswrapper[4778]: I0318 09:10:31.812023 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9b8p9" podStartSLOduration=2.154506702 podStartE2EDuration="5.812012673s" podCreationTimestamp="2026-03-18 09:10:26 +0000 UTC" firstStartedPulling="2026-03-18 09:10:27.73317439 +0000 UTC m=+494.307919230" lastFinishedPulling="2026-03-18 09:10:31.390680321 +0000 UTC m=+497.965425201" observedRunningTime="2026-03-18 09:10:31.805475586 +0000 UTC m=+498.380220436" watchObservedRunningTime="2026-03-18 09:10:31.812012673 +0000 UTC m=+498.386757533" Mar 18 09:10:32 crc kubenswrapper[4778]: I0318 09:10:32.797593 4778 generic.go:334] "Generic (PLEG): container finished" podID="fee87709-f8ed-4eb4-829e-1fdb6534bb35" containerID="d37142aca8df005734457524dffa32c4483716edffbcfb2d1b92b3701d6e7e1c" exitCode=0 Mar 18 
09:10:32 crc kubenswrapper[4778]: I0318 09:10:32.797717 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktcxn" event={"ID":"fee87709-f8ed-4eb4-829e-1fdb6534bb35","Type":"ContainerDied","Data":"d37142aca8df005734457524dffa32c4483716edffbcfb2d1b92b3701d6e7e1c"} Mar 18 09:10:33 crc kubenswrapper[4778]: I0318 09:10:33.809880 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktcxn" event={"ID":"fee87709-f8ed-4eb4-829e-1fdb6534bb35","Type":"ContainerStarted","Data":"0d5497da92a3a6b067e66da6b34d1f0c05c8fc0ce853a92ab83f4966e6f9359e"} Mar 18 09:10:33 crc kubenswrapper[4778]: I0318 09:10:33.813889 4778 generic.go:334] "Generic (PLEG): container finished" podID="83efc97a-1a91-4bc8-90bf-a78bc8ee90e3" containerID="6fe6c8c9a13bac0fd88ed849c73dcfa6711eb3a6b0e4dbc9785b0295e53c5d5b" exitCode=0 Mar 18 09:10:33 crc kubenswrapper[4778]: I0318 09:10:33.813957 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-csm2z" event={"ID":"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3","Type":"ContainerDied","Data":"6fe6c8c9a13bac0fd88ed849c73dcfa6711eb3a6b0e4dbc9785b0295e53c5d5b"} Mar 18 09:10:33 crc kubenswrapper[4778]: I0318 09:10:33.839071 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ktcxn" podStartSLOduration=2.233017312 podStartE2EDuration="4.839042243s" podCreationTimestamp="2026-03-18 09:10:29 +0000 UTC" firstStartedPulling="2026-03-18 09:10:30.769107265 +0000 UTC m=+497.343852105" lastFinishedPulling="2026-03-18 09:10:33.375132176 +0000 UTC m=+499.949877036" observedRunningTime="2026-03-18 09:10:33.836782122 +0000 UTC m=+500.411526972" watchObservedRunningTime="2026-03-18 09:10:33.839042243 +0000 UTC m=+500.413787083" Mar 18 09:10:34 crc kubenswrapper[4778]: I0318 09:10:34.823935 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-csm2z" event={"ID":"83efc97a-1a91-4bc8-90bf-a78bc8ee90e3","Type":"ContainerStarted","Data":"dc3a0df91d3078511e57a266f8dc3cdf826fb116b6d9f3cbca75b54bd4b09de4"} Mar 18 09:10:34 crc kubenswrapper[4778]: I0318 09:10:34.850538 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-csm2z" podStartSLOduration=2.402036369 podStartE2EDuration="5.850509868s" podCreationTimestamp="2026-03-18 09:10:29 +0000 UTC" firstStartedPulling="2026-03-18 09:10:30.77221973 +0000 UTC m=+497.346964570" lastFinishedPulling="2026-03-18 09:10:34.220693209 +0000 UTC m=+500.795438069" observedRunningTime="2026-03-18 09:10:34.846436478 +0000 UTC m=+501.421181338" watchObservedRunningTime="2026-03-18 09:10:34.850509868 +0000 UTC m=+501.425254708" Mar 18 09:10:36 crc kubenswrapper[4778]: I0318 09:10:36.969930 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xs85d" Mar 18 09:10:36 crc kubenswrapper[4778]: I0318 09:10:36.970416 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xs85d" Mar 18 09:10:37 crc kubenswrapper[4778]: I0318 09:10:37.033476 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xs85d" Mar 18 09:10:37 crc kubenswrapper[4778]: I0318 09:10:37.154169 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:37 crc kubenswrapper[4778]: I0318 09:10:37.154268 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:37 crc kubenswrapper[4778]: I0318 09:10:37.916412 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xs85d" Mar 18 09:10:38 crc 
kubenswrapper[4778]: I0318 09:10:38.200546 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9b8p9" podUID="f9a557a7-2d98-4e56-8119-acfd64357871" containerName="registry-server" probeResult="failure" output=< Mar 18 09:10:38 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 09:10:38 crc kubenswrapper[4778]: > Mar 18 09:10:39 crc kubenswrapper[4778]: I0318 09:10:39.404667 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:39 crc kubenswrapper[4778]: I0318 09:10:39.404722 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:39 crc kubenswrapper[4778]: I0318 09:10:39.459602 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:39 crc kubenswrapper[4778]: I0318 09:10:39.575456 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:39 crc kubenswrapper[4778]: I0318 09:10:39.575542 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:39 crc kubenswrapper[4778]: I0318 09:10:39.630828 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:39 crc kubenswrapper[4778]: I0318 09:10:39.917364 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:10:39 crc kubenswrapper[4778]: I0318 09:10:39.918337 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-csm2z" Mar 18 09:10:42 crc kubenswrapper[4778]: I0318 09:10:42.053533 4778 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-8ch6j" Mar 18 09:10:42 crc kubenswrapper[4778]: I0318 09:10:42.146041 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nkbq4"] Mar 18 09:10:47 crc kubenswrapper[4778]: I0318 09:10:47.225257 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:10:47 crc kubenswrapper[4778]: I0318 09:10:47.280576 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9b8p9" Mar 18 09:11:00 crc kubenswrapper[4778]: I0318 09:11:00.147554 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:11:00 crc kubenswrapper[4778]: I0318 09:11:00.148265 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:11:00 crc kubenswrapper[4778]: I0318 09:11:00.148314 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:11:00 crc kubenswrapper[4778]: I0318 09:11:00.149109 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"34445970a34bfac9ea81483ad63bd182faceb7cf5f4517903b0d318af28a7e07"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:11:00 crc kubenswrapper[4778]: I0318 09:11:00.149217 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://34445970a34bfac9ea81483ad63bd182faceb7cf5f4517903b0d318af28a7e07" gracePeriod=600 Mar 18 09:11:01 crc kubenswrapper[4778]: I0318 09:11:01.013122 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="34445970a34bfac9ea81483ad63bd182faceb7cf5f4517903b0d318af28a7e07" exitCode=0 Mar 18 09:11:01 crc kubenswrapper[4778]: I0318 09:11:01.013268 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"34445970a34bfac9ea81483ad63bd182faceb7cf5f4517903b0d318af28a7e07"} Mar 18 09:11:01 crc kubenswrapper[4778]: I0318 09:11:01.014250 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"8352df71b83dce7216f889f9357e2f46d95d02444a6764ed6ea4a1748338a832"} Mar 18 09:11:01 crc kubenswrapper[4778]: I0318 09:11:01.014291 4778 scope.go:117] "RemoveContainer" containerID="cba57e9ecb9eb5e6f738793a924b86a332ecf3e9aa5748de2ff7d1c6195662bc" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.205678 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" podUID="2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" containerName="registry" containerID="cri-o://a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b" gracePeriod=30 Mar 18 09:11:07 crc 
kubenswrapper[4778]: I0318 09:11:07.610704 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.773891 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-registry-tls\") pod \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.774375 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.774490 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-trusted-ca\") pod \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.774597 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-registry-certificates\") pod \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.774639 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-bound-sa-token\") pod \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " Mar 18 09:11:07 crc 
kubenswrapper[4778]: I0318 09:11:07.774692 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsq4q\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-kube-api-access-fsq4q\") pod \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.774742 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-installation-pull-secrets\") pod \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.774805 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-ca-trust-extracted\") pod \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\" (UID: \"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad\") " Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.777249 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.777739 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.783006 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.784748 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-kube-api-access-fsq4q" (OuterVolumeSpecName: "kube-api-access-fsq4q") pod "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad"). InnerVolumeSpecName "kube-api-access-fsq4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.785170 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.785474 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.791479 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.799425 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" (UID: "2faf8fa8-d474-4c7d-8566-8abc58d7d5ad"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.876650 4778 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.876695 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.876708 4778 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.876721 4778 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.876736 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsq4q\" (UniqueName: \"kubernetes.io/projected/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-kube-api-access-fsq4q\") on node \"crc\" DevicePath \"\"" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.876748 4778 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 09:11:07 crc kubenswrapper[4778]: I0318 09:11:07.876780 4778 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 09:11:08 crc kubenswrapper[4778]: I0318 09:11:08.075708 4778 generic.go:334] "Generic (PLEG): container finished" podID="2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" containerID="a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b" exitCode=0 Mar 18 09:11:08 crc kubenswrapper[4778]: I0318 09:11:08.075754 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" Mar 18 09:11:08 crc kubenswrapper[4778]: I0318 09:11:08.075770 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" event={"ID":"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad","Type":"ContainerDied","Data":"a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b"} Mar 18 09:11:08 crc kubenswrapper[4778]: I0318 09:11:08.076632 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nkbq4" event={"ID":"2faf8fa8-d474-4c7d-8566-8abc58d7d5ad","Type":"ContainerDied","Data":"c662bfec2aa4a3db7809425b04cf847a1727f28e0c070e4d7298a20004e2533f"} Mar 18 09:11:08 crc kubenswrapper[4778]: I0318 09:11:08.076711 4778 scope.go:117] "RemoveContainer" containerID="a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b" Mar 18 09:11:08 crc kubenswrapper[4778]: I0318 09:11:08.111807 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nkbq4"] Mar 18 09:11:08 crc kubenswrapper[4778]: I0318 09:11:08.117351 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nkbq4"] Mar 18 09:11:08 crc kubenswrapper[4778]: I0318 09:11:08.123488 4778 scope.go:117] "RemoveContainer" containerID="a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b" Mar 18 09:11:08 crc kubenswrapper[4778]: E0318 09:11:08.124342 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b\": container with ID starting with a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b not found: ID does not exist" containerID="a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b" Mar 18 09:11:08 crc kubenswrapper[4778]: I0318 09:11:08.124405 
4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b"} err="failed to get container status \"a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b\": rpc error: code = NotFound desc = could not find container \"a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b\": container with ID starting with a38cac925bd870ea80dec1180740c780086128c764d678d27e64e5a2d8ce441b not found: ID does not exist" Mar 18 09:11:08 crc kubenswrapper[4778]: I0318 09:11:08.195087 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" path="/var/lib/kubelet/pods/2faf8fa8-d474-4c7d-8566-8abc58d7d5ad/volumes" Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.135755 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563752-h72kq"] Mar 18 09:12:00 crc kubenswrapper[4778]: E0318 09:12:00.136581 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" containerName="registry" Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.136597 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" containerName="registry" Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.136737 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2faf8fa8-d474-4c7d-8566-8abc58d7d5ad" containerName="registry" Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.137359 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563752-h72kq" Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.144154 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.144380 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.149599 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.157582 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563752-h72kq"] Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.256353 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5vng\" (UniqueName: \"kubernetes.io/projected/57e614a6-a447-41bc-b7c8-034610af7d59-kube-api-access-p5vng\") pod \"auto-csr-approver-29563752-h72kq\" (UID: \"57e614a6-a447-41bc-b7c8-034610af7d59\") " pod="openshift-infra/auto-csr-approver-29563752-h72kq" Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.358295 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5vng\" (UniqueName: \"kubernetes.io/projected/57e614a6-a447-41bc-b7c8-034610af7d59-kube-api-access-p5vng\") pod \"auto-csr-approver-29563752-h72kq\" (UID: \"57e614a6-a447-41bc-b7c8-034610af7d59\") " pod="openshift-infra/auto-csr-approver-29563752-h72kq" Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.383603 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5vng\" (UniqueName: \"kubernetes.io/projected/57e614a6-a447-41bc-b7c8-034610af7d59-kube-api-access-p5vng\") pod \"auto-csr-approver-29563752-h72kq\" (UID: \"57e614a6-a447-41bc-b7c8-034610af7d59\") " 
pod="openshift-infra/auto-csr-approver-29563752-h72kq" Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.469036 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563752-h72kq" Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.720861 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563752-h72kq"] Mar 18 09:12:00 crc kubenswrapper[4778]: I0318 09:12:00.724597 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 09:12:01 crc kubenswrapper[4778]: I0318 09:12:01.468260 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563752-h72kq" event={"ID":"57e614a6-a447-41bc-b7c8-034610af7d59","Type":"ContainerStarted","Data":"50988f6267519097811ac791abc59f70ec8c4ac956672a4ddaf299b87d47e582"} Mar 18 09:12:02 crc kubenswrapper[4778]: I0318 09:12:02.479027 4778 generic.go:334] "Generic (PLEG): container finished" podID="57e614a6-a447-41bc-b7c8-034610af7d59" containerID="6614d11a5de4463d54d3a021b1144b715f14eddffa1ef95f83bb20fa8f58ca90" exitCode=0 Mar 18 09:12:02 crc kubenswrapper[4778]: I0318 09:12:02.479093 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563752-h72kq" event={"ID":"57e614a6-a447-41bc-b7c8-034610af7d59","Type":"ContainerDied","Data":"6614d11a5de4463d54d3a021b1144b715f14eddffa1ef95f83bb20fa8f58ca90"} Mar 18 09:12:03 crc kubenswrapper[4778]: I0318 09:12:03.805047 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563752-h72kq" Mar 18 09:12:03 crc kubenswrapper[4778]: I0318 09:12:03.938761 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5vng\" (UniqueName: \"kubernetes.io/projected/57e614a6-a447-41bc-b7c8-034610af7d59-kube-api-access-p5vng\") pod \"57e614a6-a447-41bc-b7c8-034610af7d59\" (UID: \"57e614a6-a447-41bc-b7c8-034610af7d59\") " Mar 18 09:12:03 crc kubenswrapper[4778]: I0318 09:12:03.945497 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57e614a6-a447-41bc-b7c8-034610af7d59-kube-api-access-p5vng" (OuterVolumeSpecName: "kube-api-access-p5vng") pod "57e614a6-a447-41bc-b7c8-034610af7d59" (UID: "57e614a6-a447-41bc-b7c8-034610af7d59"). InnerVolumeSpecName "kube-api-access-p5vng". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:12:04 crc kubenswrapper[4778]: I0318 09:12:04.040809 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5vng\" (UniqueName: \"kubernetes.io/projected/57e614a6-a447-41bc-b7c8-034610af7d59-kube-api-access-p5vng\") on node \"crc\" DevicePath \"\"" Mar 18 09:12:04 crc kubenswrapper[4778]: I0318 09:12:04.499394 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563752-h72kq" event={"ID":"57e614a6-a447-41bc-b7c8-034610af7d59","Type":"ContainerDied","Data":"50988f6267519097811ac791abc59f70ec8c4ac956672a4ddaf299b87d47e582"} Mar 18 09:12:04 crc kubenswrapper[4778]: I0318 09:12:04.499471 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50988f6267519097811ac791abc59f70ec8c4ac956672a4ddaf299b87d47e582" Mar 18 09:12:04 crc kubenswrapper[4778]: I0318 09:12:04.499574 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563752-h72kq" Mar 18 09:12:04 crc kubenswrapper[4778]: I0318 09:12:04.873615 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563746-b66f7"] Mar 18 09:12:04 crc kubenswrapper[4778]: I0318 09:12:04.879736 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563746-b66f7"] Mar 18 09:12:06 crc kubenswrapper[4778]: I0318 09:12:06.193917 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3be356e-94af-47db-a182-dd8a57024619" path="/var/lib/kubelet/pods/c3be356e-94af-47db-a182-dd8a57024619/volumes" Mar 18 09:13:00 crc kubenswrapper[4778]: I0318 09:13:00.147796 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:13:00 crc kubenswrapper[4778]: I0318 09:13:00.148946 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:13:19 crc kubenswrapper[4778]: I0318 09:13:19.405110 4778 scope.go:117] "RemoveContainer" containerID="8b9319e52264a71946e18a681c32dbd6ffc04e6afcc03a59b8bfa719c7422b7f" Mar 18 09:13:19 crc kubenswrapper[4778]: I0318 09:13:19.433283 4778 scope.go:117] "RemoveContainer" containerID="44d1d0b1ecaf0bd45db18a8ca3c0502c00748ea75b870a51131c12eecf1aa1f8" Mar 18 09:13:30 crc kubenswrapper[4778]: I0318 09:13:30.147977 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:13:30 crc kubenswrapper[4778]: I0318 09:13:30.148587 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.138742 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563754-8p782"] Mar 18 09:14:00 crc kubenswrapper[4778]: E0318 09:14:00.139659 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57e614a6-a447-41bc-b7c8-034610af7d59" containerName="oc" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.139676 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="57e614a6-a447-41bc-b7c8-034610af7d59" containerName="oc" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.139823 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="57e614a6-a447-41bc-b7c8-034610af7d59" containerName="oc" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.140384 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563754-8p782" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.148985 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.149095 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.149166 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.149519 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.149547 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.149633 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.150004 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8352df71b83dce7216f889f9357e2f46d95d02444a6764ed6ea4a1748338a832"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" 
Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.150068 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://8352df71b83dce7216f889f9357e2f46d95d02444a6764ed6ea4a1748338a832" gracePeriod=600 Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.160696 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563754-8p782"] Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.243228 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6mfj\" (UniqueName: \"kubernetes.io/projected/913fd7d5-c271-4918-992c-95e6048faa85-kube-api-access-b6mfj\") pod \"auto-csr-approver-29563754-8p782\" (UID: \"913fd7d5-c271-4918-992c-95e6048faa85\") " pod="openshift-infra/auto-csr-approver-29563754-8p782" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.330670 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="8352df71b83dce7216f889f9357e2f46d95d02444a6764ed6ea4a1748338a832" exitCode=0 Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.330732 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"8352df71b83dce7216f889f9357e2f46d95d02444a6764ed6ea4a1748338a832"} Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.331279 4778 scope.go:117] "RemoveContainer" containerID="34445970a34bfac9ea81483ad63bd182faceb7cf5f4517903b0d318af28a7e07" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.345220 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6mfj\" (UniqueName: 
\"kubernetes.io/projected/913fd7d5-c271-4918-992c-95e6048faa85-kube-api-access-b6mfj\") pod \"auto-csr-approver-29563754-8p782\" (UID: \"913fd7d5-c271-4918-992c-95e6048faa85\") " pod="openshift-infra/auto-csr-approver-29563754-8p782" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.369413 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6mfj\" (UniqueName: \"kubernetes.io/projected/913fd7d5-c271-4918-992c-95e6048faa85-kube-api-access-b6mfj\") pod \"auto-csr-approver-29563754-8p782\" (UID: \"913fd7d5-c271-4918-992c-95e6048faa85\") " pod="openshift-infra/auto-csr-approver-29563754-8p782" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.478149 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563754-8p782" Mar 18 09:14:00 crc kubenswrapper[4778]: I0318 09:14:00.915924 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563754-8p782"] Mar 18 09:14:01 crc kubenswrapper[4778]: I0318 09:14:01.341232 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"2bce801303fcc67febe135eadcc45500477ce5c50a4d9315c95b48aad6bc9b1d"} Mar 18 09:14:01 crc kubenswrapper[4778]: I0318 09:14:01.342660 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563754-8p782" event={"ID":"913fd7d5-c271-4918-992c-95e6048faa85","Type":"ContainerStarted","Data":"c4808469037255cf3739a39f462f2935e1242030d6b9156f4361ab4e5be94e70"} Mar 18 09:14:02 crc kubenswrapper[4778]: I0318 09:14:02.354494 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563754-8p782" event={"ID":"913fd7d5-c271-4918-992c-95e6048faa85","Type":"ContainerStarted","Data":"623e10ff390eb7e19703ecca951ddfb9b57c997906e5eae908a2c7aedc17d0d1"} 
Mar 18 09:14:02 crc kubenswrapper[4778]: I0318 09:14:02.377816 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563754-8p782" podStartSLOduration=1.417824869 podStartE2EDuration="2.377785821s" podCreationTimestamp="2026-03-18 09:14:00 +0000 UTC" firstStartedPulling="2026-03-18 09:14:00.92342652 +0000 UTC m=+707.498171360" lastFinishedPulling="2026-03-18 09:14:01.883387442 +0000 UTC m=+708.458132312" observedRunningTime="2026-03-18 09:14:02.375683863 +0000 UTC m=+708.950428763" watchObservedRunningTime="2026-03-18 09:14:02.377785821 +0000 UTC m=+708.952530711" Mar 18 09:14:03 crc kubenswrapper[4778]: I0318 09:14:03.364323 4778 generic.go:334] "Generic (PLEG): container finished" podID="913fd7d5-c271-4918-992c-95e6048faa85" containerID="623e10ff390eb7e19703ecca951ddfb9b57c997906e5eae908a2c7aedc17d0d1" exitCode=0 Mar 18 09:14:03 crc kubenswrapper[4778]: I0318 09:14:03.364391 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563754-8p782" event={"ID":"913fd7d5-c271-4918-992c-95e6048faa85","Type":"ContainerDied","Data":"623e10ff390eb7e19703ecca951ddfb9b57c997906e5eae908a2c7aedc17d0d1"} Mar 18 09:14:04 crc kubenswrapper[4778]: I0318 09:14:04.689255 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563754-8p782" Mar 18 09:14:04 crc kubenswrapper[4778]: I0318 09:14:04.740263 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6mfj\" (UniqueName: \"kubernetes.io/projected/913fd7d5-c271-4918-992c-95e6048faa85-kube-api-access-b6mfj\") pod \"913fd7d5-c271-4918-992c-95e6048faa85\" (UID: \"913fd7d5-c271-4918-992c-95e6048faa85\") " Mar 18 09:14:04 crc kubenswrapper[4778]: I0318 09:14:04.747045 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/913fd7d5-c271-4918-992c-95e6048faa85-kube-api-access-b6mfj" (OuterVolumeSpecName: "kube-api-access-b6mfj") pod "913fd7d5-c271-4918-992c-95e6048faa85" (UID: "913fd7d5-c271-4918-992c-95e6048faa85"). InnerVolumeSpecName "kube-api-access-b6mfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:14:04 crc kubenswrapper[4778]: I0318 09:14:04.842973 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6mfj\" (UniqueName: \"kubernetes.io/projected/913fd7d5-c271-4918-992c-95e6048faa85-kube-api-access-b6mfj\") on node \"crc\" DevicePath \"\"" Mar 18 09:14:05 crc kubenswrapper[4778]: I0318 09:14:05.383974 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563754-8p782" event={"ID":"913fd7d5-c271-4918-992c-95e6048faa85","Type":"ContainerDied","Data":"c4808469037255cf3739a39f462f2935e1242030d6b9156f4361ab4e5be94e70"} Mar 18 09:14:05 crc kubenswrapper[4778]: I0318 09:14:05.384035 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563754-8p782" Mar 18 09:14:05 crc kubenswrapper[4778]: I0318 09:14:05.384050 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4808469037255cf3739a39f462f2935e1242030d6b9156f4361ab4e5be94e70" Mar 18 09:14:05 crc kubenswrapper[4778]: I0318 09:14:05.447518 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563748-8q2hs"] Mar 18 09:14:05 crc kubenswrapper[4778]: I0318 09:14:05.451087 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563748-8q2hs"] Mar 18 09:14:06 crc kubenswrapper[4778]: I0318 09:14:06.199485 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d4f1c72-13f2-47ff-94aa-9b4e91f2e126" path="/var/lib/kubelet/pods/5d4f1c72-13f2-47ff-94aa-9b4e91f2e126/volumes" Mar 18 09:14:19 crc kubenswrapper[4778]: I0318 09:14:19.490390 4778 scope.go:117] "RemoveContainer" containerID="de8929b794bb5d2a7228965e1493c7e14a2590362f85b344f68eb15fc61bd4bf" Mar 18 09:14:19 crc kubenswrapper[4778]: I0318 09:14:19.516583 4778 scope.go:117] "RemoveContainer" containerID="2aaa498f349b23cf7a4f0fb9da41ba553f76ed88636548c40f4f1cf1a8220b22" Mar 18 09:14:19 crc kubenswrapper[4778]: I0318 09:14:19.565571 4778 scope.go:117] "RemoveContainer" containerID="cf4a9ddbe48af9c3f976ba168fa13253c79814734a6e5e0e3ef5fa348e79df80" Mar 18 09:14:19 crc kubenswrapper[4778]: I0318 09:14:19.618825 4778 scope.go:117] "RemoveContainer" containerID="6182c0f78e6f784c954f3d716e1ee189ef539115eb9b295b92f5ba494516c3c6" Mar 18 09:14:19 crc kubenswrapper[4778]: I0318 09:14:19.640945 4778 scope.go:117] "RemoveContainer" containerID="8bac8cffd4e1a60ba2bf0f1dd076bc9102bdfd1bb1f8e85a389ddde5e582bd3e" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.149913 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6"] Mar 18 09:15:00 
crc kubenswrapper[4778]: E0318 09:15:00.150859 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="913fd7d5-c271-4918-992c-95e6048faa85" containerName="oc" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.150879 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="913fd7d5-c271-4918-992c-95e6048faa85" containerName="oc" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.151021 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="913fd7d5-c271-4918-992c-95e6048faa85" containerName="oc" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.151546 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.154697 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.154717 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.170981 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6"] Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.212466 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bea72845-4b27-4381-b08b-e0570c67bddb-config-volume\") pod \"collect-profiles-29563755-zdrp6\" (UID: \"bea72845-4b27-4381-b08b-e0570c67bddb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.212568 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n5nc\" (UniqueName: 
\"kubernetes.io/projected/bea72845-4b27-4381-b08b-e0570c67bddb-kube-api-access-8n5nc\") pod \"collect-profiles-29563755-zdrp6\" (UID: \"bea72845-4b27-4381-b08b-e0570c67bddb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.212621 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bea72845-4b27-4381-b08b-e0570c67bddb-secret-volume\") pod \"collect-profiles-29563755-zdrp6\" (UID: \"bea72845-4b27-4381-b08b-e0570c67bddb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.313738 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bea72845-4b27-4381-b08b-e0570c67bddb-secret-volume\") pod \"collect-profiles-29563755-zdrp6\" (UID: \"bea72845-4b27-4381-b08b-e0570c67bddb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.313809 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bea72845-4b27-4381-b08b-e0570c67bddb-config-volume\") pod \"collect-profiles-29563755-zdrp6\" (UID: \"bea72845-4b27-4381-b08b-e0570c67bddb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.313875 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n5nc\" (UniqueName: \"kubernetes.io/projected/bea72845-4b27-4381-b08b-e0570c67bddb-kube-api-access-8n5nc\") pod \"collect-profiles-29563755-zdrp6\" (UID: \"bea72845-4b27-4381-b08b-e0570c67bddb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:00 crc 
kubenswrapper[4778]: I0318 09:15:00.314973 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bea72845-4b27-4381-b08b-e0570c67bddb-config-volume\") pod \"collect-profiles-29563755-zdrp6\" (UID: \"bea72845-4b27-4381-b08b-e0570c67bddb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.325516 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bea72845-4b27-4381-b08b-e0570c67bddb-secret-volume\") pod \"collect-profiles-29563755-zdrp6\" (UID: \"bea72845-4b27-4381-b08b-e0570c67bddb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.347000 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n5nc\" (UniqueName: \"kubernetes.io/projected/bea72845-4b27-4381-b08b-e0570c67bddb-kube-api-access-8n5nc\") pod \"collect-profiles-29563755-zdrp6\" (UID: \"bea72845-4b27-4381-b08b-e0570c67bddb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.481777 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.732994 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6"] Mar 18 09:15:00 crc kubenswrapper[4778]: I0318 09:15:00.829761 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" event={"ID":"bea72845-4b27-4381-b08b-e0570c67bddb","Type":"ContainerStarted","Data":"1680f1b14b94b34a9719dba11e6814c744db638f9e5b99a93b4ed4014f924dba"} Mar 18 09:15:01 crc kubenswrapper[4778]: I0318 09:15:01.839406 4778 generic.go:334] "Generic (PLEG): container finished" podID="bea72845-4b27-4381-b08b-e0570c67bddb" containerID="eaf004108fc735124a6750b445bf1e3f7676efb1a3da3a71036d9a0909c64710" exitCode=0 Mar 18 09:15:01 crc kubenswrapper[4778]: I0318 09:15:01.839492 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" event={"ID":"bea72845-4b27-4381-b08b-e0570c67bddb","Type":"ContainerDied","Data":"eaf004108fc735124a6750b445bf1e3f7676efb1a3da3a71036d9a0909c64710"} Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.105165 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.253969 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bea72845-4b27-4381-b08b-e0570c67bddb-config-volume\") pod \"bea72845-4b27-4381-b08b-e0570c67bddb\" (UID: \"bea72845-4b27-4381-b08b-e0570c67bddb\") " Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.254326 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n5nc\" (UniqueName: \"kubernetes.io/projected/bea72845-4b27-4381-b08b-e0570c67bddb-kube-api-access-8n5nc\") pod \"bea72845-4b27-4381-b08b-e0570c67bddb\" (UID: \"bea72845-4b27-4381-b08b-e0570c67bddb\") " Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.254368 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bea72845-4b27-4381-b08b-e0570c67bddb-secret-volume\") pod \"bea72845-4b27-4381-b08b-e0570c67bddb\" (UID: \"bea72845-4b27-4381-b08b-e0570c67bddb\") " Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.255123 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bea72845-4b27-4381-b08b-e0570c67bddb-config-volume" (OuterVolumeSpecName: "config-volume") pod "bea72845-4b27-4381-b08b-e0570c67bddb" (UID: "bea72845-4b27-4381-b08b-e0570c67bddb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.259803 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bea72845-4b27-4381-b08b-e0570c67bddb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bea72845-4b27-4381-b08b-e0570c67bddb" (UID: "bea72845-4b27-4381-b08b-e0570c67bddb"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.259981 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bea72845-4b27-4381-b08b-e0570c67bddb-kube-api-access-8n5nc" (OuterVolumeSpecName: "kube-api-access-8n5nc") pod "bea72845-4b27-4381-b08b-e0570c67bddb" (UID: "bea72845-4b27-4381-b08b-e0570c67bddb"). InnerVolumeSpecName "kube-api-access-8n5nc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.355793 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n5nc\" (UniqueName: \"kubernetes.io/projected/bea72845-4b27-4381-b08b-e0570c67bddb-kube-api-access-8n5nc\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.355842 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bea72845-4b27-4381-b08b-e0570c67bddb-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.355855 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bea72845-4b27-4381-b08b-e0570c67bddb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.864948 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" event={"ID":"bea72845-4b27-4381-b08b-e0570c67bddb","Type":"ContainerDied","Data":"1680f1b14b94b34a9719dba11e6814c744db638f9e5b99a93b4ed4014f924dba"} Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.865003 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1680f1b14b94b34a9719dba11e6814c744db638f9e5b99a93b4ed4014f924dba" Mar 18 09:15:03 crc kubenswrapper[4778]: I0318 09:15:03.865055 4778 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.343110 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-khqrg"] Mar 18 09:15:40 crc kubenswrapper[4778]: E0318 09:15:40.344104 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bea72845-4b27-4381-b08b-e0570c67bddb" containerName="collect-profiles" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.344122 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bea72845-4b27-4381-b08b-e0570c67bddb" containerName="collect-profiles" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.344256 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="bea72845-4b27-4381-b08b-e0570c67bddb" containerName="collect-profiles" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.344913 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-khqrg" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.346642 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-qrqw4"] Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.347277 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.347446 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-qrqw4" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.347728 4778 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-knchm" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.349078 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.350562 4778 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-gsvnx" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.362416 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-khqrg"] Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.376786 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-qrqw4"] Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.381284 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-hjskg"] Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.382300 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-hjskg" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.387179 4778 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-gnbr4" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.397493 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-hjskg"] Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.498579 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6rbf\" (UniqueName: \"kubernetes.io/projected/24a88e8d-e986-4b3d-a77e-1a3e5162ac9c-kube-api-access-p6rbf\") pod \"cert-manager-cainjector-cf98fcc89-khqrg\" (UID: \"24a88e8d-e986-4b3d-a77e-1a3e5162ac9c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-khqrg" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.498660 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvlw6\" (UniqueName: \"kubernetes.io/projected/e39be52c-c244-44cc-a707-0ec9994991fa-kube-api-access-wvlw6\") pod \"cert-manager-858654f9db-qrqw4\" (UID: \"e39be52c-c244-44cc-a707-0ec9994991fa\") " pod="cert-manager/cert-manager-858654f9db-qrqw4" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.498704 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v6l6\" (UniqueName: \"kubernetes.io/projected/f09bc4b7-d305-4674-8540-283bd0b4901c-kube-api-access-8v6l6\") pod \"cert-manager-webhook-687f57d79b-hjskg\" (UID: \"f09bc4b7-d305-4674-8540-283bd0b4901c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-hjskg" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.599957 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6rbf\" (UniqueName: 
\"kubernetes.io/projected/24a88e8d-e986-4b3d-a77e-1a3e5162ac9c-kube-api-access-p6rbf\") pod \"cert-manager-cainjector-cf98fcc89-khqrg\" (UID: \"24a88e8d-e986-4b3d-a77e-1a3e5162ac9c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-khqrg" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.600499 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvlw6\" (UniqueName: \"kubernetes.io/projected/e39be52c-c244-44cc-a707-0ec9994991fa-kube-api-access-wvlw6\") pod \"cert-manager-858654f9db-qrqw4\" (UID: \"e39be52c-c244-44cc-a707-0ec9994991fa\") " pod="cert-manager/cert-manager-858654f9db-qrqw4" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.600649 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v6l6\" (UniqueName: \"kubernetes.io/projected/f09bc4b7-d305-4674-8540-283bd0b4901c-kube-api-access-8v6l6\") pod \"cert-manager-webhook-687f57d79b-hjskg\" (UID: \"f09bc4b7-d305-4674-8540-283bd0b4901c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-hjskg" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.624162 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v6l6\" (UniqueName: \"kubernetes.io/projected/f09bc4b7-d305-4674-8540-283bd0b4901c-kube-api-access-8v6l6\") pod \"cert-manager-webhook-687f57d79b-hjskg\" (UID: \"f09bc4b7-d305-4674-8540-283bd0b4901c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-hjskg" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.624981 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6rbf\" (UniqueName: \"kubernetes.io/projected/24a88e8d-e986-4b3d-a77e-1a3e5162ac9c-kube-api-access-p6rbf\") pod \"cert-manager-cainjector-cf98fcc89-khqrg\" (UID: \"24a88e8d-e986-4b3d-a77e-1a3e5162ac9c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-khqrg" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.626060 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvlw6\" (UniqueName: \"kubernetes.io/projected/e39be52c-c244-44cc-a707-0ec9994991fa-kube-api-access-wvlw6\") pod \"cert-manager-858654f9db-qrqw4\" (UID: \"e39be52c-c244-44cc-a707-0ec9994991fa\") " pod="cert-manager/cert-manager-858654f9db-qrqw4" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.676712 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-khqrg" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.682425 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-qrqw4" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.703260 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-hjskg" Mar 18 09:15:40 crc kubenswrapper[4778]: I0318 09:15:40.932545 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-qrqw4"] Mar 18 09:15:41 crc kubenswrapper[4778]: I0318 09:15:41.048690 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-hjskg"] Mar 18 09:15:41 crc kubenswrapper[4778]: I0318 09:15:41.117088 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-hjskg" event={"ID":"f09bc4b7-d305-4674-8540-283bd0b4901c","Type":"ContainerStarted","Data":"2636fa1736c531caefda46d06d12a012800db27e5a6b874ddc7d8e3f9541556d"} Mar 18 09:15:41 crc kubenswrapper[4778]: I0318 09:15:41.118921 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-qrqw4" event={"ID":"e39be52c-c244-44cc-a707-0ec9994991fa","Type":"ContainerStarted","Data":"bf6a92a55d75d6ad6258d8f91d078b85e774d97d2928d96666c717f85f7782bf"} Mar 18 09:15:41 crc kubenswrapper[4778]: I0318 09:15:41.208952 4778 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-khqrg"] Mar 18 09:15:41 crc kubenswrapper[4778]: W0318 09:15:41.211838 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24a88e8d_e986_4b3d_a77e_1a3e5162ac9c.slice/crio-0284f6c8d4aafa5f2fd98d8177af50b62207aae5dffd33ca1f450ca5d211f478 WatchSource:0}: Error finding container 0284f6c8d4aafa5f2fd98d8177af50b62207aae5dffd33ca1f450ca5d211f478: Status 404 returned error can't find the container with id 0284f6c8d4aafa5f2fd98d8177af50b62207aae5dffd33ca1f450ca5d211f478 Mar 18 09:15:42 crc kubenswrapper[4778]: I0318 09:15:42.135945 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-khqrg" event={"ID":"24a88e8d-e986-4b3d-a77e-1a3e5162ac9c","Type":"ContainerStarted","Data":"0284f6c8d4aafa5f2fd98d8177af50b62207aae5dffd33ca1f450ca5d211f478"} Mar 18 09:15:46 crc kubenswrapper[4778]: I0318 09:15:46.167500 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-qrqw4" event={"ID":"e39be52c-c244-44cc-a707-0ec9994991fa","Type":"ContainerStarted","Data":"8f9cfe7da34ac42a2298776c6bb8ad1ccbec9bf0c6b0ab3b018a2f639e8bdd94"} Mar 18 09:15:46 crc kubenswrapper[4778]: I0318 09:15:46.171720 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-khqrg" event={"ID":"24a88e8d-e986-4b3d-a77e-1a3e5162ac9c","Type":"ContainerStarted","Data":"5858b5fbe03770a0084b32004e9ab61fd80c6618dc9a4f60dce661ecdd4cc189"} Mar 18 09:15:46 crc kubenswrapper[4778]: I0318 09:15:46.175488 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-hjskg" event={"ID":"f09bc4b7-d305-4674-8540-283bd0b4901c","Type":"ContainerStarted","Data":"4579c79beff6826d27512141d9d9cce950a0840faa26c3954db5d83d65971556"} Mar 18 09:15:46 crc kubenswrapper[4778]: I0318 09:15:46.176240 4778 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-hjskg" Mar 18 09:15:46 crc kubenswrapper[4778]: I0318 09:15:46.199027 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-qrqw4" podStartSLOduration=2.174405188 podStartE2EDuration="6.198996801s" podCreationTimestamp="2026-03-18 09:15:40 +0000 UTC" firstStartedPulling="2026-03-18 09:15:40.948743108 +0000 UTC m=+807.523487948" lastFinishedPulling="2026-03-18 09:15:44.973334701 +0000 UTC m=+811.548079561" observedRunningTime="2026-03-18 09:15:46.189018867 +0000 UTC m=+812.763763767" watchObservedRunningTime="2026-03-18 09:15:46.198996801 +0000 UTC m=+812.773741681" Mar 18 09:15:46 crc kubenswrapper[4778]: I0318 09:15:46.235445 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-khqrg" podStartSLOduration=3.199536116 podStartE2EDuration="6.23541984s" podCreationTimestamp="2026-03-18 09:15:40 +0000 UTC" firstStartedPulling="2026-03-18 09:15:41.215498675 +0000 UTC m=+807.790243515" lastFinishedPulling="2026-03-18 09:15:44.251382399 +0000 UTC m=+810.826127239" observedRunningTime="2026-03-18 09:15:46.222859885 +0000 UTC m=+812.797604745" watchObservedRunningTime="2026-03-18 09:15:46.23541984 +0000 UTC m=+812.810164710" Mar 18 09:15:46 crc kubenswrapper[4778]: I0318 09:15:46.241763 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-hjskg" podStartSLOduration=2.381147259 podStartE2EDuration="6.241743774s" podCreationTimestamp="2026-03-18 09:15:40 +0000 UTC" firstStartedPulling="2026-03-18 09:15:41.05600991 +0000 UTC m=+807.630754750" lastFinishedPulling="2026-03-18 09:15:44.916606415 +0000 UTC m=+811.491351265" observedRunningTime="2026-03-18 09:15:46.23943095 +0000 UTC m=+812.814175810" watchObservedRunningTime="2026-03-18 09:15:46.241743774 +0000 UTC 
m=+812.816488654" Mar 18 09:15:50 crc kubenswrapper[4778]: I0318 09:15:50.708881 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-hjskg" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.045386 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g2qth"] Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.047020 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovn-controller" containerID="cri-o://04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f" gracePeriod=30 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.047156 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="sbdb" containerID="cri-o://561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f" gracePeriod=30 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.047099 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="kube-rbac-proxy-node" containerID="cri-o://42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5" gracePeriod=30 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.047260 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovn-acl-logging" containerID="cri-o://8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0" gracePeriod=30 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.047136 4778 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b" gracePeriod=30 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.047153 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="nbdb" containerID="cri-o://031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147" gracePeriod=30 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.047037 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="northd" containerID="cri-o://bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3" gracePeriod=30 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.106258 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" containerID="cri-o://236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb" gracePeriod=30 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.226885 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r2lvf_dce973f3-25e6-4536-87cc-9b46499ad7cf/kube-multus/2.log" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.227506 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r2lvf_dce973f3-25e6-4536-87cc-9b46499ad7cf/kube-multus/1.log" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.227611 4778 generic.go:334] "Generic (PLEG): container finished" podID="dce973f3-25e6-4536-87cc-9b46499ad7cf" containerID="f3ed13c08ae99a4b04465da39cc132336b6c856e9f5e19d24c954fe658b51ed0" 
exitCode=2 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.227728 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r2lvf" event={"ID":"dce973f3-25e6-4536-87cc-9b46499ad7cf","Type":"ContainerDied","Data":"f3ed13c08ae99a4b04465da39cc132336b6c856e9f5e19d24c954fe658b51ed0"} Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.227834 4778 scope.go:117] "RemoveContainer" containerID="3db916fe08447492951425cea1874cda24609695f9a809c15d9596472ae1c562" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.228437 4778 scope.go:117] "RemoveContainer" containerID="f3ed13c08ae99a4b04465da39cc132336b6c856e9f5e19d24c954fe658b51ed0" Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.228689 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-r2lvf_openshift-multus(dce973f3-25e6-4536-87cc-9b46499ad7cf)\"" pod="openshift-multus/multus-r2lvf" podUID="dce973f3-25e6-4536-87cc-9b46499ad7cf" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.241506 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/3.log" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.250624 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovn-acl-logging/0.log" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.251134 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovn-controller/0.log" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.252863 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" 
containerID="94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b" exitCode=0 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.252961 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b"} Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.253035 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5"} Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.252988 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerID="42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5" exitCode=0 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.253060 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerID="8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0" exitCode=143 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.253075 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerID="04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f" exitCode=143 Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.253095 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0"} Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.253104 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" 
event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f"} Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.393893 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/3.log" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.397279 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovn-acl-logging/0.log" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.398312 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovn-controller/0.log" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.398834 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475214 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wj8wh"] Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.475607 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="kube-rbac-proxy-node" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475629 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="kube-rbac-proxy-node" Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.475650 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475661 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" 
containerName="kube-rbac-proxy-ovn-metrics" Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.475673 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="northd" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475683 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="northd" Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.475697 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="sbdb" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475705 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="sbdb" Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.475715 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475723 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.475735 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovn-acl-logging" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475744 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovn-acl-logging" Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.475753 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="nbdb" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475761 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="nbdb" Mar 18 09:15:53 crc 
kubenswrapper[4778]: E0318 09:15:53.475772 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="kubecfg-setup" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475783 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="kubecfg-setup" Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.475795 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475802 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.475812 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475822 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.475833 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475842 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.475854 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovn-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.475862 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovn-controller" Mar 18 09:15:53 crc 
kubenswrapper[4778]: I0318 09:15:53.476019 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovn-acl-logging" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.476034 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.476043 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.476051 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.476060 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovn-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.476072 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="kube-rbac-proxy-node" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.476079 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.476088 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="nbdb" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.476100 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="sbdb" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.476112 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" 
containerName="northd" Mar 18 09:15:53 crc kubenswrapper[4778]: E0318 09:15:53.476264 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.476277 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.476415 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.476427 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerName="ovnkube-controller" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.478918 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502481 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-run-netns\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502557 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-kubelet\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502599 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod 
"ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502643 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502626 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-cni-netd\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502687 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502738 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-log-socket\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502763 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502791 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-openvswitch\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502826 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-log-socket" (OuterVolumeSpecName: "log-socket") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502847 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-env-overrides\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502859 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502881 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502888 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-systemd-units\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502911 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-node-log\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502932 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-slash\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502969 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovn-node-metrics-cert\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.502995 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-var-lib-openvswitch\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503015 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-etc-openvswitch\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503041 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovnkube-config\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503079 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-cni-bin\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503101 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-systemd\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503135 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovnkube-script-lib\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503160 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8g6f\" (UniqueName: \"kubernetes.io/projected/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-kube-api-access-b8g6f\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc 
kubenswrapper[4778]: I0318 09:15:53.503212 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-run-ovn-kubernetes\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503308 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-ovn\") pod \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\" (UID: \"ef97d63e-1caf-44c9-ac0c-9b03dbd05113\") " Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503326 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503358 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503376 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503394 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-node-log" (OuterVolumeSpecName: "node-log") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503411 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-slash" (OuterVolumeSpecName: "host-slash") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503810 4778 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503825 4778 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503835 4778 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503843 4778 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-systemd-units\") 
on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503851 4778 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-node-log\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503860 4778 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-slash\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503867 4778 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503876 4778 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503886 4778 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503894 4778 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503902 4778 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-log-socket\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.503934 4778 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.504266 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.504295 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.504388 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.504437 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.504932 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.510803 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.515508 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-kube-api-access-b8g6f" (OuterVolumeSpecName: "kube-api-access-b8g6f") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "kube-api-access-b8g6f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.529528 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "ef97d63e-1caf-44c9-ac0c-9b03dbd05113" (UID: "ef97d63e-1caf-44c9-ac0c-9b03dbd05113"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.605060 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-node-log\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.605139 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-cni-netd\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.605191 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ffcc7085-b304-4ae4-a764-907d0ce857ea-ovn-node-metrics-cert\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.605306 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-log-socket\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.605363 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-run-ovn\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.605454 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.605559 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-run-ovn-kubernetes\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.605758 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-var-lib-openvswitch\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.605863 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-kubelet\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.605945 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5brj\" (UniqueName: \"kubernetes.io/projected/ffcc7085-b304-4ae4-a764-907d0ce857ea-kube-api-access-f5brj\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606177 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ffcc7085-b304-4ae4-a764-907d0ce857ea-ovnkube-script-lib\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606249 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-cni-bin\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606307 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ffcc7085-b304-4ae4-a764-907d0ce857ea-ovnkube-config\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606333 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-etc-openvswitch\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606369 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-run-openvswitch\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606452 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-run-netns\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606499 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-slash\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606536 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-run-systemd\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606563 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-systemd-units\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606704 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ffcc7085-b304-4ae4-a764-907d0ce857ea-env-overrides\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606816 4778 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606840 4778 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606853 4778 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606866 4778 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-cni-bin\") on node \"crc\" DevicePath \"\""
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606877 4778 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-systemd\") on node \"crc\" DevicePath \"\""
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606888 4778 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606901 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8g6f\" (UniqueName: \"kubernetes.io/projected/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-kube-api-access-b8g6f\") on node \"crc\" DevicePath \"\""
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606912 4778 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.606927 4778 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ef97d63e-1caf-44c9-ac0c-9b03dbd05113-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.707934 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-systemd-units\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708030 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ffcc7085-b304-4ae4-a764-907d0ce857ea-env-overrides\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708074 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-node-log\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708119 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-cni-netd\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708132 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-systemd-units\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708159 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ffcc7085-b304-4ae4-a764-907d0ce857ea-ovn-node-metrics-cert\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708370 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-log-socket\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708404 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-cni-netd\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708444 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-run-ovn\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708461 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-node-log\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708518 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-run-ovn\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708528 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708566 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-log-socket\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708572 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708606 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-run-ovn-kubernetes\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708675 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-var-lib-openvswitch\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708731 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-kubelet\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708726 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-run-ovn-kubernetes\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708760 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-var-lib-openvswitch\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708784 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5brj\" (UniqueName: \"kubernetes.io/projected/ffcc7085-b304-4ae4-a764-907d0ce857ea-kube-api-access-f5brj\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708838 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ffcc7085-b304-4ae4-a764-907d0ce857ea-ovnkube-script-lib\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708868 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-kubelet\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708894 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-cni-bin\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.708946 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-cni-bin\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.709817 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ffcc7085-b304-4ae4-a764-907d0ce857ea-ovnkube-config\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.709883 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-etc-openvswitch\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.709961 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-run-openvswitch\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.709545 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ffcc7085-b304-4ae4-a764-907d0ce857ea-env-overrides\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.710012 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-run-openvswitch\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.710028 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-run-netns\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.710010 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ffcc7085-b304-4ae4-a764-907d0ce857ea-ovnkube-script-lib\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.709968 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-etc-openvswitch\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.710086 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-slash\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.710138 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-slash\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.710157 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-host-run-netns\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.710231 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-run-systemd\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.710143 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ffcc7085-b304-4ae4-a764-907d0ce857ea-run-systemd\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.710977 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ffcc7085-b304-4ae4-a764-907d0ce857ea-ovnkube-config\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.714181 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ffcc7085-b304-4ae4-a764-907d0ce857ea-ovn-node-metrics-cert\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.740048 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5brj\" (UniqueName: \"kubernetes.io/projected/ffcc7085-b304-4ae4-a764-907d0ce857ea-kube-api-access-f5brj\") pod \"ovnkube-node-wj8wh\" (UID: \"ffcc7085-b304-4ae4-a764-907d0ce857ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: I0318 09:15:53.801240 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh"
Mar 18 09:15:53 crc kubenswrapper[4778]: W0318 09:15:53.837794 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffcc7085_b304_4ae4_a764_907d0ce857ea.slice/crio-f5e3196f4e1df6024633bf365c8a62435b4fd2f1fa37d7df196ed30f481ed35c WatchSource:0}: Error finding container f5e3196f4e1df6024633bf365c8a62435b4fd2f1fa37d7df196ed30f481ed35c: Status 404 returned error can't find the container with id f5e3196f4e1df6024633bf365c8a62435b4fd2f1fa37d7df196ed30f481ed35c
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.260921 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r2lvf_dce973f3-25e6-4536-87cc-9b46499ad7cf/kube-multus/2.log"
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.264786 4778 generic.go:334] "Generic (PLEG): container finished" podID="ffcc7085-b304-4ae4-a764-907d0ce857ea" containerID="d21377864ded5d0a1002ab2279c8c423e8c8e6ade6bb88e057bc60a56e81ccbd" exitCode=0
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.264861 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" event={"ID":"ffcc7085-b304-4ae4-a764-907d0ce857ea","Type":"ContainerDied","Data":"d21377864ded5d0a1002ab2279c8c423e8c8e6ade6bb88e057bc60a56e81ccbd"}
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.265143 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" event={"ID":"ffcc7085-b304-4ae4-a764-907d0ce857ea","Type":"ContainerStarted","Data":"f5e3196f4e1df6024633bf365c8a62435b4fd2f1fa37d7df196ed30f481ed35c"}
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.267975 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovnkube-controller/3.log"
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.272912 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovn-acl-logging/0.log"
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.274120 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2qth_ef97d63e-1caf-44c9-ac0c-9b03dbd05113/ovn-controller/0.log"
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.274520 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerID="236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb" exitCode=0
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.274544 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerID="561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f" exitCode=0
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.274553 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerID="031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147" exitCode=0
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.274563 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" containerID="bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3" exitCode=0
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.274586 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb"}
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.274618 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f"}
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.274632 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147"}
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.274645 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3"}
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.274658 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth" event={"ID":"ef97d63e-1caf-44c9-ac0c-9b03dbd05113","Type":"ContainerDied","Data":"0662ffa0aa3c3fc99a955a63b995acc6a492d7cf6b911968c3e8039e26cdcb7f"}
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.274677 4778 scope.go:117] "RemoveContainer" containerID="236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb"
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.274854 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g2qth"
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.302305 4778 scope.go:117] "RemoveContainer" containerID="5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d"
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.330291 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g2qth"]
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.333937 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g2qth"]
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.353097 4778 scope.go:117] "RemoveContainer" containerID="561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f"
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.397621 4778 scope.go:117] "RemoveContainer" containerID="031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147"
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.435086 4778 scope.go:117] "RemoveContainer" containerID="bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3"
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.465432 4778 scope.go:117] "RemoveContainer" containerID="94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b"
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.491254 4778 scope.go:117] "RemoveContainer" containerID="42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5"
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.522375 4778 scope.go:117] "RemoveContainer" containerID="8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0"
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.544939 4778 scope.go:117] "RemoveContainer" containerID="04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f"
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.567325 4778 scope.go:117] "RemoveContainer" containerID="5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107"
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.593498 4778 scope.go:117] "RemoveContainer" containerID="236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb"
Mar 18 09:15:54 crc kubenswrapper[4778]: E0318 09:15:54.595104 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb\": container with ID starting with 236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb not found: ID does not exist" containerID="236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb"
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.595143 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb"} err="failed to get container status \"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb\": rpc error: code = NotFound desc = could not find container \"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb\": container with ID starting with 236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb not found: ID does not exist"
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.595172 4778 scope.go:117] "RemoveContainer" containerID="5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d"
Mar 18 09:15:54 crc kubenswrapper[4778]: E0318 09:15:54.595854 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\": container with ID starting with 5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d not found: ID does not exist" containerID="5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d"
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.595884 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d"} err="failed to get container status \"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\": rpc error: code = NotFound desc = could not find container \"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\": container with ID starting with 5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d not found: ID does not exist"
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.595903 4778 scope.go:117] "RemoveContainer" containerID="561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f"
Mar 18 09:15:54 crc kubenswrapper[4778]: E0318 09:15:54.597185 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\": container with ID starting with 561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f not found: ID does not exist" containerID="561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f"
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.597242 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f"} err="failed to get container status \"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\": rpc error: code = NotFound desc = could not find container \"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\": container with ID starting with 561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f not found: ID does not exist"
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.597261 4778 scope.go:117] "RemoveContainer" containerID="031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147"
Mar 18 09:15:54 crc kubenswrapper[4778]: E0318 09:15:54.597842 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\": container with ID starting with 031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147 not found: ID does not exist" containerID="031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147"
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.597894 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147"} err="failed to get container status \"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\": rpc error: code = NotFound desc = could not find container \"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\": container with ID starting with 031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147 not found: ID does not exist"
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.597920 4778 scope.go:117] "RemoveContainer" containerID="bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3"
Mar 18 09:15:54 crc kubenswrapper[4778]: E0318 09:15:54.598397 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\": container with ID starting with bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3 not found: ID does not exist" containerID="bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3"
Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.598435 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3"} err="failed to get container status 
\"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\": rpc error: code = NotFound desc = could not find container \"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\": container with ID starting with bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.598455 4778 scope.go:117] "RemoveContainer" containerID="94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b" Mar 18 09:15:54 crc kubenswrapper[4778]: E0318 09:15:54.598949 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\": container with ID starting with 94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b not found: ID does not exist" containerID="94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.598991 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b"} err="failed to get container status \"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\": rpc error: code = NotFound desc = could not find container \"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\": container with ID starting with 94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.599018 4778 scope.go:117] "RemoveContainer" containerID="42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5" Mar 18 09:15:54 crc kubenswrapper[4778]: E0318 09:15:54.599439 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\": container with ID starting with 42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5 not found: ID does not exist" containerID="42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.599472 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5"} err="failed to get container status \"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\": rpc error: code = NotFound desc = could not find container \"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\": container with ID starting with 42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.599491 4778 scope.go:117] "RemoveContainer" containerID="8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0" Mar 18 09:15:54 crc kubenswrapper[4778]: E0318 09:15:54.600051 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\": container with ID starting with 8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0 not found: ID does not exist" containerID="8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.600091 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0"} err="failed to get container status \"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\": rpc error: code = NotFound desc = could not find container \"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\": container with ID 
starting with 8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.600115 4778 scope.go:117] "RemoveContainer" containerID="04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f" Mar 18 09:15:54 crc kubenswrapper[4778]: E0318 09:15:54.600591 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\": container with ID starting with 04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f not found: ID does not exist" containerID="04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.600645 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f"} err="failed to get container status \"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\": rpc error: code = NotFound desc = could not find container \"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\": container with ID starting with 04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.600683 4778 scope.go:117] "RemoveContainer" containerID="5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107" Mar 18 09:15:54 crc kubenswrapper[4778]: E0318 09:15:54.601388 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\": container with ID starting with 5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107 not found: ID does not exist" containerID="5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107" Mar 18 
09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.601484 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107"} err="failed to get container status \"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\": rpc error: code = NotFound desc = could not find container \"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\": container with ID starting with 5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.601533 4778 scope.go:117] "RemoveContainer" containerID="236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.603804 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb"} err="failed to get container status \"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb\": rpc error: code = NotFound desc = could not find container \"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb\": container with ID starting with 236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.603835 4778 scope.go:117] "RemoveContainer" containerID="5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.604369 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d"} err="failed to get container status \"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\": rpc error: code = NotFound desc = could not find container 
\"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\": container with ID starting with 5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.604397 4778 scope.go:117] "RemoveContainer" containerID="561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.604811 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f"} err="failed to get container status \"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\": rpc error: code = NotFound desc = could not find container \"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\": container with ID starting with 561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.604851 4778 scope.go:117] "RemoveContainer" containerID="031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.606542 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147"} err="failed to get container status \"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\": rpc error: code = NotFound desc = could not find container \"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\": container with ID starting with 031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.606571 4778 scope.go:117] "RemoveContainer" containerID="bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.606926 4778 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3"} err="failed to get container status \"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\": rpc error: code = NotFound desc = could not find container \"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\": container with ID starting with bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.606951 4778 scope.go:117] "RemoveContainer" containerID="94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.607245 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b"} err="failed to get container status \"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\": rpc error: code = NotFound desc = could not find container \"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\": container with ID starting with 94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.607278 4778 scope.go:117] "RemoveContainer" containerID="42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.607704 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5"} err="failed to get container status \"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\": rpc error: code = NotFound desc = could not find container \"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\": container with ID starting with 
42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.607742 4778 scope.go:117] "RemoveContainer" containerID="8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.608019 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0"} err="failed to get container status \"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\": rpc error: code = NotFound desc = could not find container \"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\": container with ID starting with 8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.608049 4778 scope.go:117] "RemoveContainer" containerID="04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.608663 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f"} err="failed to get container status \"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\": rpc error: code = NotFound desc = could not find container \"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\": container with ID starting with 04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.608690 4778 scope.go:117] "RemoveContainer" containerID="5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.609542 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107"} err="failed to get container status \"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\": rpc error: code = NotFound desc = could not find container \"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\": container with ID starting with 5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.609569 4778 scope.go:117] "RemoveContainer" containerID="236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.610045 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb"} err="failed to get container status \"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb\": rpc error: code = NotFound desc = could not find container \"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb\": container with ID starting with 236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.610077 4778 scope.go:117] "RemoveContainer" containerID="5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.610347 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d"} err="failed to get container status \"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\": rpc error: code = NotFound desc = could not find container \"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\": container with ID starting with 5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d not found: ID does not 
exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.610372 4778 scope.go:117] "RemoveContainer" containerID="561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.610775 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f"} err="failed to get container status \"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\": rpc error: code = NotFound desc = could not find container \"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\": container with ID starting with 561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.610801 4778 scope.go:117] "RemoveContainer" containerID="031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.611164 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147"} err="failed to get container status \"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\": rpc error: code = NotFound desc = could not find container \"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\": container with ID starting with 031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.611206 4778 scope.go:117] "RemoveContainer" containerID="bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.611739 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3"} err="failed to get container status 
\"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\": rpc error: code = NotFound desc = could not find container \"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\": container with ID starting with bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.611765 4778 scope.go:117] "RemoveContainer" containerID="94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.612069 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b"} err="failed to get container status \"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\": rpc error: code = NotFound desc = could not find container \"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\": container with ID starting with 94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.612111 4778 scope.go:117] "RemoveContainer" containerID="42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.612552 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5"} err="failed to get container status \"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\": rpc error: code = NotFound desc = could not find container \"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\": container with ID starting with 42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.612583 4778 scope.go:117] "RemoveContainer" 
containerID="8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.613152 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0"} err="failed to get container status \"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\": rpc error: code = NotFound desc = could not find container \"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\": container with ID starting with 8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.613184 4778 scope.go:117] "RemoveContainer" containerID="04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.614059 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f"} err="failed to get container status \"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\": rpc error: code = NotFound desc = could not find container \"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\": container with ID starting with 04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.614105 4778 scope.go:117] "RemoveContainer" containerID="5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.614537 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107"} err="failed to get container status \"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\": rpc error: code = NotFound desc = could 
not find container \"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\": container with ID starting with 5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.614562 4778 scope.go:117] "RemoveContainer" containerID="236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.614957 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb"} err="failed to get container status \"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb\": rpc error: code = NotFound desc = could not find container \"236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb\": container with ID starting with 236a604d849c39edc1a62c599b549316dfa42ca9103261416b95f51ad4c551cb not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.614980 4778 scope.go:117] "RemoveContainer" containerID="5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.616995 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d"} err="failed to get container status \"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\": rpc error: code = NotFound desc = could not find container \"5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d\": container with ID starting with 5bad1cb22c1d97889074ac0ca79149820ef55ec972d84248feca054a0d63aa0d not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.617024 4778 scope.go:117] "RemoveContainer" containerID="561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 
09:15:54.618527 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f"} err="failed to get container status \"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\": rpc error: code = NotFound desc = could not find container \"561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f\": container with ID starting with 561c2e46240b45b9f24acee2f23aec8dca88c51b30e17113ddec493b61992d2f not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.618618 4778 scope.go:117] "RemoveContainer" containerID="031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.619296 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147"} err="failed to get container status \"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\": rpc error: code = NotFound desc = could not find container \"031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147\": container with ID starting with 031a9f8697a5e47d732b9b0bb32376f4d7c4ac75445635459689059a22ff6147 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.619326 4778 scope.go:117] "RemoveContainer" containerID="bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.620171 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3"} err="failed to get container status \"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\": rpc error: code = NotFound desc = could not find container \"bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3\": container with ID starting with 
bf26f4a358cfc0558800f918a7a62701832485a943f1abc001e64c792045adc3 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.620402 4778 scope.go:117] "RemoveContainer" containerID="94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.620789 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b"} err="failed to get container status \"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\": rpc error: code = NotFound desc = could not find container \"94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b\": container with ID starting with 94ec3e19c366d470ea5da2a5cb4ce4645dbbf3f6c0ad8886ea9935f14510254b not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.620833 4778 scope.go:117] "RemoveContainer" containerID="42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.621267 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5"} err="failed to get container status \"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\": rpc error: code = NotFound desc = could not find container \"42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5\": container with ID starting with 42dbb6e4bd0f8eecd01e931cf43692378f53aa5cd2391ce0875794281e1ac2b5 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.621298 4778 scope.go:117] "RemoveContainer" containerID="8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.621609 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0"} err="failed to get container status \"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\": rpc error: code = NotFound desc = could not find container \"8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0\": container with ID starting with 8f4677e9b5d7940aae129734e1dd1a27fdf7d581e8359b5c62fa3eb53173e7a0 not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.621735 4778 scope.go:117] "RemoveContainer" containerID="04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.622211 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f"} err="failed to get container status \"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\": rpc error: code = NotFound desc = could not find container \"04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f\": container with ID starting with 04c5fe5a3eaa94483be1493fba7af6b1a06cd7f758bec9499e273efd0696146f not found: ID does not exist" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.622242 4778 scope.go:117] "RemoveContainer" containerID="5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107" Mar 18 09:15:54 crc kubenswrapper[4778]: I0318 09:15:54.622661 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107"} err="failed to get container status \"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\": rpc error: code = NotFound desc = could not find container \"5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107\": container with ID starting with 5a075c534652ee8293d3f891279d8790a30d547e8f8950e95f396f6ee3e3a107 not found: ID does not 
exist" Mar 18 09:15:55 crc kubenswrapper[4778]: I0318 09:15:55.289753 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" event={"ID":"ffcc7085-b304-4ae4-a764-907d0ce857ea","Type":"ContainerStarted","Data":"bb9f0f2dbf78111ae61c32607460c52a5289f21c493f1d11b9f0136de240ace7"} Mar 18 09:15:55 crc kubenswrapper[4778]: I0318 09:15:55.290148 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" event={"ID":"ffcc7085-b304-4ae4-a764-907d0ce857ea","Type":"ContainerStarted","Data":"b2762b2a8a9cf60abc3e1a47b5fcff9c1ac332c758dc5db6f836932298dc703e"} Mar 18 09:15:55 crc kubenswrapper[4778]: I0318 09:15:55.290162 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" event={"ID":"ffcc7085-b304-4ae4-a764-907d0ce857ea","Type":"ContainerStarted","Data":"e88829a78b81baf658a9f6aa474c13edfa36d9293af979a5af3163ee7894111f"} Mar 18 09:15:55 crc kubenswrapper[4778]: I0318 09:15:55.290229 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" event={"ID":"ffcc7085-b304-4ae4-a764-907d0ce857ea","Type":"ContainerStarted","Data":"97fde475db8479dc5f6dd9b3f54e772b9f4851994e2501e2cd66c560c2e98472"} Mar 18 09:15:55 crc kubenswrapper[4778]: I0318 09:15:55.290242 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" event={"ID":"ffcc7085-b304-4ae4-a764-907d0ce857ea","Type":"ContainerStarted","Data":"9aae53f4ee819d01522f1b867eefa3722b585d8ed0122d111c1ba18a2f29665b"} Mar 18 09:15:55 crc kubenswrapper[4778]: I0318 09:15:55.290254 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" event={"ID":"ffcc7085-b304-4ae4-a764-907d0ce857ea","Type":"ContainerStarted","Data":"bfaca16b71999384966203a18082f0cb7de1868a00c74057228bd44e3414e847"} Mar 18 09:15:56 crc kubenswrapper[4778]: I0318 09:15:56.199864 4778 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef97d63e-1caf-44c9-ac0c-9b03dbd05113" path="/var/lib/kubelet/pods/ef97d63e-1caf-44c9-ac0c-9b03dbd05113/volumes" Mar 18 09:15:58 crc kubenswrapper[4778]: I0318 09:15:58.317049 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" event={"ID":"ffcc7085-b304-4ae4-a764-907d0ce857ea","Type":"ContainerStarted","Data":"6df40edb7a46800fdfba44f52a7229fa9eb0ec956d0deec9facb75e649abfd37"} Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.137582 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563756-s8bkt"] Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.138867 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.143632 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.143826 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.143996 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.147598 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.147693 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.216513 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lmf7\" (UniqueName: \"kubernetes.io/projected/a6298370-ed2e-4705-827b-c1a77b03f32a-kube-api-access-5lmf7\") pod \"auto-csr-approver-29563756-s8bkt\" (UID: \"a6298370-ed2e-4705-827b-c1a77b03f32a\") " pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.317790 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lmf7\" (UniqueName: \"kubernetes.io/projected/a6298370-ed2e-4705-827b-c1a77b03f32a-kube-api-access-5lmf7\") pod \"auto-csr-approver-29563756-s8bkt\" (UID: \"a6298370-ed2e-4705-827b-c1a77b03f32a\") " pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.339229 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" event={"ID":"ffcc7085-b304-4ae4-a764-907d0ce857ea","Type":"ContainerStarted","Data":"f75ad8a40abb1f4d6e6a8fbaf0774fc78e6261a9216a78d79ac21cd4ac817e46"} Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.339677 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.339701 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.341624 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lmf7\" (UniqueName: \"kubernetes.io/projected/a6298370-ed2e-4705-827b-c1a77b03f32a-kube-api-access-5lmf7\") pod 
\"auto-csr-approver-29563756-s8bkt\" (UID: \"a6298370-ed2e-4705-827b-c1a77b03f32a\") " pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.369966 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.376295 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" podStartSLOduration=7.376278503 podStartE2EDuration="7.376278503s" podCreationTimestamp="2026-03-18 09:15:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:16:00.372507609 +0000 UTC m=+826.947252459" watchObservedRunningTime="2026-03-18 09:16:00.376278503 +0000 UTC m=+826.951023343" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.458391 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:00 crc kubenswrapper[4778]: E0318 09:16:00.487407 4778 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563756-s8bkt_openshift-infra_a6298370-ed2e-4705-827b-c1a77b03f32a_0(06f5228725be3d4a74527b764f3454a7538859a5ace127e774f22fdc891659f3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:16:00 crc kubenswrapper[4778]: E0318 09:16:00.487510 4778 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563756-s8bkt_openshift-infra_a6298370-ed2e-4705-827b-c1a77b03f32a_0(06f5228725be3d4a74527b764f3454a7538859a5ace127e774f22fdc891659f3): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:00 crc kubenswrapper[4778]: E0318 09:16:00.487538 4778 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563756-s8bkt_openshift-infra_a6298370-ed2e-4705-827b-c1a77b03f32a_0(06f5228725be3d4a74527b764f3454a7538859a5ace127e774f22fdc891659f3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:00 crc kubenswrapper[4778]: E0318 09:16:00.487606 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29563756-s8bkt_openshift-infra(a6298370-ed2e-4705-827b-c1a77b03f32a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29563756-s8bkt_openshift-infra(a6298370-ed2e-4705-827b-c1a77b03f32a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563756-s8bkt_openshift-infra_a6298370-ed2e-4705-827b-c1a77b03f32a_0(06f5228725be3d4a74527b764f3454a7538859a5ace127e774f22fdc891659f3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" podUID="a6298370-ed2e-4705-827b-c1a77b03f32a" Mar 18 09:16:00 crc kubenswrapper[4778]: I0318 09:16:00.572564 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563756-s8bkt"] Mar 18 09:16:01 crc kubenswrapper[4778]: I0318 09:16:01.344479 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:01 crc kubenswrapper[4778]: I0318 09:16:01.345019 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:01 crc kubenswrapper[4778]: I0318 09:16:01.345148 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:16:01 crc kubenswrapper[4778]: E0318 09:16:01.377753 4778 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563756-s8bkt_openshift-infra_a6298370-ed2e-4705-827b-c1a77b03f32a_0(00819e6d6edcaf45089b4400e517bd0e65957d461a51981cccb703db10967cb5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:16:01 crc kubenswrapper[4778]: E0318 09:16:01.377841 4778 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563756-s8bkt_openshift-infra_a6298370-ed2e-4705-827b-c1a77b03f32a_0(00819e6d6edcaf45089b4400e517bd0e65957d461a51981cccb703db10967cb5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:01 crc kubenswrapper[4778]: E0318 09:16:01.377879 4778 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563756-s8bkt_openshift-infra_a6298370-ed2e-4705-827b-c1a77b03f32a_0(00819e6d6edcaf45089b4400e517bd0e65957d461a51981cccb703db10967cb5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:01 crc kubenswrapper[4778]: E0318 09:16:01.377940 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29563756-s8bkt_openshift-infra(a6298370-ed2e-4705-827b-c1a77b03f32a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29563756-s8bkt_openshift-infra(a6298370-ed2e-4705-827b-c1a77b03f32a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563756-s8bkt_openshift-infra_a6298370-ed2e-4705-827b-c1a77b03f32a_0(00819e6d6edcaf45089b4400e517bd0e65957d461a51981cccb703db10967cb5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" podUID="a6298370-ed2e-4705-827b-c1a77b03f32a" Mar 18 09:16:01 crc kubenswrapper[4778]: I0318 09:16:01.396112 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:16:04 crc kubenswrapper[4778]: I0318 09:16:04.192780 4778 scope.go:117] "RemoveContainer" containerID="f3ed13c08ae99a4b04465da39cc132336b6c856e9f5e19d24c954fe658b51ed0" Mar 18 09:16:04 crc kubenswrapper[4778]: E0318 09:16:04.193601 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-r2lvf_openshift-multus(dce973f3-25e6-4536-87cc-9b46499ad7cf)\"" pod="openshift-multus/multus-r2lvf" podUID="dce973f3-25e6-4536-87cc-9b46499ad7cf" Mar 18 09:16:12 crc kubenswrapper[4778]: I0318 09:16:12.187128 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:12 crc kubenswrapper[4778]: I0318 09:16:12.188589 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:12 crc kubenswrapper[4778]: E0318 09:16:12.243443 4778 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563756-s8bkt_openshift-infra_a6298370-ed2e-4705-827b-c1a77b03f32a_0(5546cbe4db08d0c0e9da6321547de08fd0ab86f621eae69fd3a3dd13b707a887): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 09:16:12 crc kubenswrapper[4778]: E0318 09:16:12.243568 4778 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563756-s8bkt_openshift-infra_a6298370-ed2e-4705-827b-c1a77b03f32a_0(5546cbe4db08d0c0e9da6321547de08fd0ab86f621eae69fd3a3dd13b707a887): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:12 crc kubenswrapper[4778]: E0318 09:16:12.243610 4778 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563756-s8bkt_openshift-infra_a6298370-ed2e-4705-827b-c1a77b03f32a_0(5546cbe4db08d0c0e9da6321547de08fd0ab86f621eae69fd3a3dd13b707a887): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:12 crc kubenswrapper[4778]: E0318 09:16:12.243694 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29563756-s8bkt_openshift-infra(a6298370-ed2e-4705-827b-c1a77b03f32a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29563756-s8bkt_openshift-infra(a6298370-ed2e-4705-827b-c1a77b03f32a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563756-s8bkt_openshift-infra_a6298370-ed2e-4705-827b-c1a77b03f32a_0(5546cbe4db08d0c0e9da6321547de08fd0ab86f621eae69fd3a3dd13b707a887): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" podUID="a6298370-ed2e-4705-827b-c1a77b03f32a" Mar 18 09:16:17 crc kubenswrapper[4778]: I0318 09:16:17.187089 4778 scope.go:117] "RemoveContainer" containerID="f3ed13c08ae99a4b04465da39cc132336b6c856e9f5e19d24c954fe658b51ed0" Mar 18 09:16:17 crc kubenswrapper[4778]: I0318 09:16:17.463939 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r2lvf_dce973f3-25e6-4536-87cc-9b46499ad7cf/kube-multus/2.log" Mar 18 09:16:17 crc kubenswrapper[4778]: I0318 09:16:17.464326 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r2lvf" event={"ID":"dce973f3-25e6-4536-87cc-9b46499ad7cf","Type":"ContainerStarted","Data":"9bfe3dd19d78b636a423d27ec05bd54a506754859e25a7865f4f2ed5351ab160"} Mar 18 09:16:23 crc kubenswrapper[4778]: I0318 09:16:23.838537 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wj8wh" Mar 18 09:16:24 crc kubenswrapper[4778]: I0318 09:16:24.187003 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:24 crc kubenswrapper[4778]: I0318 09:16:24.191176 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:24 crc kubenswrapper[4778]: I0318 09:16:24.499766 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563756-s8bkt"] Mar 18 09:16:24 crc kubenswrapper[4778]: I0318 09:16:24.520849 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" event={"ID":"a6298370-ed2e-4705-827b-c1a77b03f32a","Type":"ContainerStarted","Data":"0e8064a1cd9a776938fe866aee052971920f428d338552bf9922173d39cb35da"} Mar 18 09:16:26 crc kubenswrapper[4778]: I0318 09:16:26.537126 4778 generic.go:334] "Generic (PLEG): container finished" podID="a6298370-ed2e-4705-827b-c1a77b03f32a" containerID="4909b98cff116d6eb4c151d4ba3b46f1a567c070f760a907d7a4e8ea4dca9196" exitCode=0 Mar 18 09:16:26 crc kubenswrapper[4778]: I0318 09:16:26.537328 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" event={"ID":"a6298370-ed2e-4705-827b-c1a77b03f32a","Type":"ContainerDied","Data":"4909b98cff116d6eb4c151d4ba3b46f1a567c070f760a907d7a4e8ea4dca9196"} Mar 18 09:16:27 crc kubenswrapper[4778]: I0318 09:16:27.864418 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:28 crc kubenswrapper[4778]: I0318 09:16:28.022159 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lmf7\" (UniqueName: \"kubernetes.io/projected/a6298370-ed2e-4705-827b-c1a77b03f32a-kube-api-access-5lmf7\") pod \"a6298370-ed2e-4705-827b-c1a77b03f32a\" (UID: \"a6298370-ed2e-4705-827b-c1a77b03f32a\") " Mar 18 09:16:28 crc kubenswrapper[4778]: I0318 09:16:28.041100 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6298370-ed2e-4705-827b-c1a77b03f32a-kube-api-access-5lmf7" (OuterVolumeSpecName: "kube-api-access-5lmf7") pod "a6298370-ed2e-4705-827b-c1a77b03f32a" (UID: "a6298370-ed2e-4705-827b-c1a77b03f32a"). InnerVolumeSpecName "kube-api-access-5lmf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:16:28 crc kubenswrapper[4778]: I0318 09:16:28.124384 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lmf7\" (UniqueName: \"kubernetes.io/projected/a6298370-ed2e-4705-827b-c1a77b03f32a-kube-api-access-5lmf7\") on node \"crc\" DevicePath \"\"" Mar 18 09:16:28 crc kubenswrapper[4778]: I0318 09:16:28.553279 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" event={"ID":"a6298370-ed2e-4705-827b-c1a77b03f32a","Type":"ContainerDied","Data":"0e8064a1cd9a776938fe866aee052971920f428d338552bf9922173d39cb35da"} Mar 18 09:16:28 crc kubenswrapper[4778]: I0318 09:16:28.553333 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e8064a1cd9a776938fe866aee052971920f428d338552bf9922173d39cb35da" Mar 18 09:16:28 crc kubenswrapper[4778]: I0318 09:16:28.553404 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563756-s8bkt" Mar 18 09:16:28 crc kubenswrapper[4778]: I0318 09:16:28.941063 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563750-lv4gn"] Mar 18 09:16:28 crc kubenswrapper[4778]: I0318 09:16:28.944287 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563750-lv4gn"] Mar 18 09:16:30 crc kubenswrapper[4778]: I0318 09:16:30.147709 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:16:30 crc kubenswrapper[4778]: I0318 09:16:30.147797 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:16:30 crc kubenswrapper[4778]: I0318 09:16:30.201074 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="105b6b5d-09f6-48c8-862e-c17526c6d6c7" path="/var/lib/kubelet/pods/105b6b5d-09f6-48c8-862e-c17526c6d6c7/volumes" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.326104 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49"] Mar 18 09:16:34 crc kubenswrapper[4778]: E0318 09:16:34.326990 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6298370-ed2e-4705-827b-c1a77b03f32a" containerName="oc" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.327005 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6298370-ed2e-4705-827b-c1a77b03f32a" 
containerName="oc" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.327123 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6298370-ed2e-4705-827b-c1a77b03f32a" containerName="oc" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.328187 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.330580 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.338923 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49"] Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.408860 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85a942ea-cebf-408c-95b8-f435630b20ad-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49\" (UID: \"85a942ea-cebf-408c-95b8-f435630b20ad\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.408969 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q4fc\" (UniqueName: \"kubernetes.io/projected/85a942ea-cebf-408c-95b8-f435630b20ad-kube-api-access-9q4fc\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49\" (UID: \"85a942ea-cebf-408c-95b8-f435630b20ad\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.409008 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/85a942ea-cebf-408c-95b8-f435630b20ad-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49\" (UID: \"85a942ea-cebf-408c-95b8-f435630b20ad\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.510240 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q4fc\" (UniqueName: \"kubernetes.io/projected/85a942ea-cebf-408c-95b8-f435630b20ad-kube-api-access-9q4fc\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49\" (UID: \"85a942ea-cebf-408c-95b8-f435630b20ad\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.510551 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85a942ea-cebf-408c-95b8-f435630b20ad-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49\" (UID: \"85a942ea-cebf-408c-95b8-f435630b20ad\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.510599 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85a942ea-cebf-408c-95b8-f435630b20ad-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49\" (UID: \"85a942ea-cebf-408c-95b8-f435630b20ad\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.511279 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85a942ea-cebf-408c-95b8-f435630b20ad-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49\" (UID: 
\"85a942ea-cebf-408c-95b8-f435630b20ad\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.511279 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85a942ea-cebf-408c-95b8-f435630b20ad-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49\" (UID: \"85a942ea-cebf-408c-95b8-f435630b20ad\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.531444 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q4fc\" (UniqueName: \"kubernetes.io/projected/85a942ea-cebf-408c-95b8-f435630b20ad-kube-api-access-9q4fc\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49\" (UID: \"85a942ea-cebf-408c-95b8-f435630b20ad\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.685469 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:34 crc kubenswrapper[4778]: I0318 09:16:34.908156 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49"] Mar 18 09:16:35 crc kubenswrapper[4778]: I0318 09:16:35.603877 4778 generic.go:334] "Generic (PLEG): container finished" podID="85a942ea-cebf-408c-95b8-f435630b20ad" containerID="fb110c6531d7f01f64ebf0f6e90cfe932435a67f8179032a89235f6875dd767f" exitCode=0 Mar 18 09:16:35 crc kubenswrapper[4778]: I0318 09:16:35.603962 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" event={"ID":"85a942ea-cebf-408c-95b8-f435630b20ad","Type":"ContainerDied","Data":"fb110c6531d7f01f64ebf0f6e90cfe932435a67f8179032a89235f6875dd767f"} Mar 18 09:16:35 crc kubenswrapper[4778]: I0318 09:16:35.604048 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" event={"ID":"85a942ea-cebf-408c-95b8-f435630b20ad","Type":"ContainerStarted","Data":"9f04029b414c3cb7688ae4afc622f10363ab38f8bb293083d74ffa8e39919aeb"} Mar 18 09:16:37 crc kubenswrapper[4778]: I0318 09:16:37.621778 4778 generic.go:334] "Generic (PLEG): container finished" podID="85a942ea-cebf-408c-95b8-f435630b20ad" containerID="d4b5149511ad14259bdd54f43fd9a3428baa75e9a79cb380d5bd48c26a00b834" exitCode=0 Mar 18 09:16:37 crc kubenswrapper[4778]: I0318 09:16:37.621855 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" event={"ID":"85a942ea-cebf-408c-95b8-f435630b20ad","Type":"ContainerDied","Data":"d4b5149511ad14259bdd54f43fd9a3428baa75e9a79cb380d5bd48c26a00b834"} Mar 18 09:16:38 crc kubenswrapper[4778]: I0318 09:16:38.632410 4778 
generic.go:334] "Generic (PLEG): container finished" podID="85a942ea-cebf-408c-95b8-f435630b20ad" containerID="368bbf8e766766ba5c0481c8ff55994d3b55ac4a07f2ee40883fbee6779c77a0" exitCode=0 Mar 18 09:16:38 crc kubenswrapper[4778]: I0318 09:16:38.632473 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" event={"ID":"85a942ea-cebf-408c-95b8-f435630b20ad","Type":"ContainerDied","Data":"368bbf8e766766ba5c0481c8ff55994d3b55ac4a07f2ee40883fbee6779c77a0"} Mar 18 09:16:39 crc kubenswrapper[4778]: I0318 09:16:39.865037 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:39 crc kubenswrapper[4778]: I0318 09:16:39.997522 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85a942ea-cebf-408c-95b8-f435630b20ad-bundle\") pod \"85a942ea-cebf-408c-95b8-f435630b20ad\" (UID: \"85a942ea-cebf-408c-95b8-f435630b20ad\") " Mar 18 09:16:39 crc kubenswrapper[4778]: I0318 09:16:39.997728 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85a942ea-cebf-408c-95b8-f435630b20ad-util\") pod \"85a942ea-cebf-408c-95b8-f435630b20ad\" (UID: \"85a942ea-cebf-408c-95b8-f435630b20ad\") " Mar 18 09:16:39 crc kubenswrapper[4778]: I0318 09:16:39.997842 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q4fc\" (UniqueName: \"kubernetes.io/projected/85a942ea-cebf-408c-95b8-f435630b20ad-kube-api-access-9q4fc\") pod \"85a942ea-cebf-408c-95b8-f435630b20ad\" (UID: \"85a942ea-cebf-408c-95b8-f435630b20ad\") " Mar 18 09:16:39 crc kubenswrapper[4778]: I0318 09:16:39.999451 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/85a942ea-cebf-408c-95b8-f435630b20ad-bundle" (OuterVolumeSpecName: "bundle") pod "85a942ea-cebf-408c-95b8-f435630b20ad" (UID: "85a942ea-cebf-408c-95b8-f435630b20ad"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:16:40 crc kubenswrapper[4778]: I0318 09:16:40.007691 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85a942ea-cebf-408c-95b8-f435630b20ad-kube-api-access-9q4fc" (OuterVolumeSpecName: "kube-api-access-9q4fc") pod "85a942ea-cebf-408c-95b8-f435630b20ad" (UID: "85a942ea-cebf-408c-95b8-f435630b20ad"). InnerVolumeSpecName "kube-api-access-9q4fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:16:40 crc kubenswrapper[4778]: I0318 09:16:40.018763 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85a942ea-cebf-408c-95b8-f435630b20ad-util" (OuterVolumeSpecName: "util") pod "85a942ea-cebf-408c-95b8-f435630b20ad" (UID: "85a942ea-cebf-408c-95b8-f435630b20ad"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:16:40 crc kubenswrapper[4778]: I0318 09:16:40.099386 4778 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85a942ea-cebf-408c-95b8-f435630b20ad-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:16:40 crc kubenswrapper[4778]: I0318 09:16:40.099439 4778 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85a942ea-cebf-408c-95b8-f435630b20ad-util\") on node \"crc\" DevicePath \"\"" Mar 18 09:16:40 crc kubenswrapper[4778]: I0318 09:16:40.099457 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q4fc\" (UniqueName: \"kubernetes.io/projected/85a942ea-cebf-408c-95b8-f435630b20ad-kube-api-access-9q4fc\") on node \"crc\" DevicePath \"\"" Mar 18 09:16:40 crc kubenswrapper[4778]: I0318 09:16:40.648980 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" event={"ID":"85a942ea-cebf-408c-95b8-f435630b20ad","Type":"ContainerDied","Data":"9f04029b414c3cb7688ae4afc622f10363ab38f8bb293083d74ffa8e39919aeb"} Mar 18 09:16:40 crc kubenswrapper[4778]: I0318 09:16:40.649041 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f04029b414c3cb7688ae4afc622f10363ab38f8bb293083d74ffa8e39919aeb" Mar 18 09:16:40 crc kubenswrapper[4778]: I0318 09:16:40.649116 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49" Mar 18 09:16:41 crc kubenswrapper[4778]: I0318 09:16:41.977550 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-sr9ls"] Mar 18 09:16:41 crc kubenswrapper[4778]: E0318 09:16:41.978403 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85a942ea-cebf-408c-95b8-f435630b20ad" containerName="util" Mar 18 09:16:41 crc kubenswrapper[4778]: I0318 09:16:41.978422 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="85a942ea-cebf-408c-95b8-f435630b20ad" containerName="util" Mar 18 09:16:41 crc kubenswrapper[4778]: E0318 09:16:41.978439 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85a942ea-cebf-408c-95b8-f435630b20ad" containerName="pull" Mar 18 09:16:41 crc kubenswrapper[4778]: I0318 09:16:41.978448 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="85a942ea-cebf-408c-95b8-f435630b20ad" containerName="pull" Mar 18 09:16:41 crc kubenswrapper[4778]: E0318 09:16:41.978468 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85a942ea-cebf-408c-95b8-f435630b20ad" containerName="extract" Mar 18 09:16:41 crc kubenswrapper[4778]: I0318 09:16:41.978475 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="85a942ea-cebf-408c-95b8-f435630b20ad" containerName="extract" Mar 18 09:16:41 crc kubenswrapper[4778]: I0318 09:16:41.978583 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="85a942ea-cebf-408c-95b8-f435630b20ad" containerName="extract" Mar 18 09:16:41 crc kubenswrapper[4778]: I0318 09:16:41.979241 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-sr9ls" Mar 18 09:16:41 crc kubenswrapper[4778]: I0318 09:16:41.981937 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 18 09:16:41 crc kubenswrapper[4778]: I0318 09:16:41.982983 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 18 09:16:41 crc kubenswrapper[4778]: I0318 09:16:41.983044 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-n2vhk" Mar 18 09:16:42 crc kubenswrapper[4778]: I0318 09:16:42.001792 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-sr9ls"] Mar 18 09:16:42 crc kubenswrapper[4778]: I0318 09:16:42.127693 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cz7p\" (UniqueName: \"kubernetes.io/projected/1dd7ccb2-0dca-4a6d-87f7-195b0ae0f9fe-kube-api-access-2cz7p\") pod \"nmstate-operator-796d4cfff4-sr9ls\" (UID: \"1dd7ccb2-0dca-4a6d-87f7-195b0ae0f9fe\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-sr9ls" Mar 18 09:16:42 crc kubenswrapper[4778]: I0318 09:16:42.228581 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cz7p\" (UniqueName: \"kubernetes.io/projected/1dd7ccb2-0dca-4a6d-87f7-195b0ae0f9fe-kube-api-access-2cz7p\") pod \"nmstate-operator-796d4cfff4-sr9ls\" (UID: \"1dd7ccb2-0dca-4a6d-87f7-195b0ae0f9fe\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-sr9ls" Mar 18 09:16:42 crc kubenswrapper[4778]: I0318 09:16:42.252565 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cz7p\" (UniqueName: \"kubernetes.io/projected/1dd7ccb2-0dca-4a6d-87f7-195b0ae0f9fe-kube-api-access-2cz7p\") pod \"nmstate-operator-796d4cfff4-sr9ls\" (UID: 
\"1dd7ccb2-0dca-4a6d-87f7-195b0ae0f9fe\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-sr9ls" Mar 18 09:16:42 crc kubenswrapper[4778]: I0318 09:16:42.294121 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-sr9ls" Mar 18 09:16:42 crc kubenswrapper[4778]: I0318 09:16:42.537831 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-sr9ls"] Mar 18 09:16:42 crc kubenswrapper[4778]: W0318 09:16:42.548508 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dd7ccb2_0dca_4a6d_87f7_195b0ae0f9fe.slice/crio-45060fd7497241cce12ea1e0c6b83efe4318493b630a7487083e5183a64e385d WatchSource:0}: Error finding container 45060fd7497241cce12ea1e0c6b83efe4318493b630a7487083e5183a64e385d: Status 404 returned error can't find the container with id 45060fd7497241cce12ea1e0c6b83efe4318493b630a7487083e5183a64e385d Mar 18 09:16:42 crc kubenswrapper[4778]: I0318 09:16:42.660731 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-sr9ls" event={"ID":"1dd7ccb2-0dca-4a6d-87f7-195b0ae0f9fe","Type":"ContainerStarted","Data":"45060fd7497241cce12ea1e0c6b83efe4318493b630a7487083e5183a64e385d"} Mar 18 09:16:43 crc kubenswrapper[4778]: I0318 09:16:43.047571 4778 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 09:16:54 crc kubenswrapper[4778]: I0318 09:16:54.741462 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-sr9ls" event={"ID":"1dd7ccb2-0dca-4a6d-87f7-195b0ae0f9fe","Type":"ContainerStarted","Data":"272b1b4d0558016cc87acf481ff47c38931f2048212373207ec3f01d3d0545d5"} Mar 18 09:16:54 crc kubenswrapper[4778]: I0318 09:16:54.769258 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-operator-796d4cfff4-sr9ls" podStartSLOduration=2.251525774 podStartE2EDuration="13.769239495s" podCreationTimestamp="2026-03-18 09:16:41 +0000 UTC" firstStartedPulling="2026-03-18 09:16:42.551857892 +0000 UTC m=+869.126602732" lastFinishedPulling="2026-03-18 09:16:54.069571613 +0000 UTC m=+880.644316453" observedRunningTime="2026-03-18 09:16:54.766678105 +0000 UTC m=+881.341422975" watchObservedRunningTime="2026-03-18 09:16:54.769239495 +0000 UTC m=+881.343984345" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.721470 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-wq8gr"] Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.722248 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wq8gr" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.725310 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-pnzh2" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.737901 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-thw7f"] Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.739523 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.742359 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-wq8gr"] Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.754559 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.765655 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-thw7f"] Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.792811 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-5thsf"] Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.794499 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.834125 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc9zm\" (UniqueName: \"kubernetes.io/projected/5961b98d-a41a-4ceb-bb71-4bf3a0fc854d-kube-api-access-kc9zm\") pod \"nmstate-webhook-5f558f5558-thw7f\" (UID: \"5961b98d-a41a-4ceb-bb71-4bf3a0fc854d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.834167 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5961b98d-a41a-4ceb-bb71-4bf3a0fc854d-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-thw7f\" (UID: \"5961b98d-a41a-4ceb-bb71-4bf3a0fc854d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.834190 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-klz7l\" (UniqueName: \"kubernetes.io/projected/71b50b27-6084-4693-acbc-d14f36759618-kube-api-access-klz7l\") pod \"nmstate-metrics-9b8c8685d-wq8gr\" (UID: \"71b50b27-6084-4693-acbc-d14f36759618\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wq8gr" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.886761 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p"] Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.887672 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.890281 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-6r2fb" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.890440 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.890656 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.900414 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p"] Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.935460 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf9dr\" (UniqueName: \"kubernetes.io/projected/5b97fa25-4d3d-4664-a5fc-41c98bbd272f-kube-api-access-kf9dr\") pod \"nmstate-handler-5thsf\" (UID: \"5b97fa25-4d3d-4664-a5fc-41c98bbd272f\") " pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.935600 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/5b97fa25-4d3d-4664-a5fc-41c98bbd272f-ovs-socket\") pod \"nmstate-handler-5thsf\" (UID: \"5b97fa25-4d3d-4664-a5fc-41c98bbd272f\") " pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.935681 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5b97fa25-4d3d-4664-a5fc-41c98bbd272f-nmstate-lock\") pod \"nmstate-handler-5thsf\" (UID: \"5b97fa25-4d3d-4664-a5fc-41c98bbd272f\") " pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.935753 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc9zm\" (UniqueName: \"kubernetes.io/projected/5961b98d-a41a-4ceb-bb71-4bf3a0fc854d-kube-api-access-kc9zm\") pod \"nmstate-webhook-5f558f5558-thw7f\" (UID: \"5961b98d-a41a-4ceb-bb71-4bf3a0fc854d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.935784 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5b97fa25-4d3d-4664-a5fc-41c98bbd272f-dbus-socket\") pod \"nmstate-handler-5thsf\" (UID: \"5b97fa25-4d3d-4664-a5fc-41c98bbd272f\") " pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.935814 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5961b98d-a41a-4ceb-bb71-4bf3a0fc854d-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-thw7f\" (UID: \"5961b98d-a41a-4ceb-bb71-4bf3a0fc854d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.935835 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klz7l\" (UniqueName: 
\"kubernetes.io/projected/71b50b27-6084-4693-acbc-d14f36759618-kube-api-access-klz7l\") pod \"nmstate-metrics-9b8c8685d-wq8gr\" (UID: \"71b50b27-6084-4693-acbc-d14f36759618\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wq8gr" Mar 18 09:16:55 crc kubenswrapper[4778]: E0318 09:16:55.936057 4778 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 18 09:16:55 crc kubenswrapper[4778]: E0318 09:16:55.936106 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5961b98d-a41a-4ceb-bb71-4bf3a0fc854d-tls-key-pair podName:5961b98d-a41a-4ceb-bb71-4bf3a0fc854d nodeName:}" failed. No retries permitted until 2026-03-18 09:16:56.436090212 +0000 UTC m=+883.010835052 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/5961b98d-a41a-4ceb-bb71-4bf3a0fc854d-tls-key-pair") pod "nmstate-webhook-5f558f5558-thw7f" (UID: "5961b98d-a41a-4ceb-bb71-4bf3a0fc854d") : secret "openshift-nmstate-webhook" not found Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.959496 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klz7l\" (UniqueName: \"kubernetes.io/projected/71b50b27-6084-4693-acbc-d14f36759618-kube-api-access-klz7l\") pod \"nmstate-metrics-9b8c8685d-wq8gr\" (UID: \"71b50b27-6084-4693-acbc-d14f36759618\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wq8gr" Mar 18 09:16:55 crc kubenswrapper[4778]: I0318 09:16:55.961849 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc9zm\" (UniqueName: \"kubernetes.io/projected/5961b98d-a41a-4ceb-bb71-4bf3a0fc854d-kube-api-access-kc9zm\") pod \"nmstate-webhook-5f558f5558-thw7f\" (UID: \"5961b98d-a41a-4ceb-bb71-4bf3a0fc854d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.037734 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8b636ef7-4b85-4506-bb2a-f89bee9b028d-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-22c9p\" (UID: \"8b636ef7-4b85-4506-bb2a-f89bee9b028d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.037835 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5b97fa25-4d3d-4664-a5fc-41c98bbd272f-ovs-socket\") pod \"nmstate-handler-5thsf\" (UID: \"5b97fa25-4d3d-4664-a5fc-41c98bbd272f\") " pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.037932 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b636ef7-4b85-4506-bb2a-f89bee9b028d-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-22c9p\" (UID: \"8b636ef7-4b85-4506-bb2a-f89bee9b028d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.037986 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5b97fa25-4d3d-4664-a5fc-41c98bbd272f-nmstate-lock\") pod \"nmstate-handler-5thsf\" (UID: \"5b97fa25-4d3d-4664-a5fc-41c98bbd272f\") " pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.038053 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5b97fa25-4d3d-4664-a5fc-41c98bbd272f-nmstate-lock\") pod \"nmstate-handler-5thsf\" (UID: \"5b97fa25-4d3d-4664-a5fc-41c98bbd272f\") " pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.037982 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5b97fa25-4d3d-4664-a5fc-41c98bbd272f-ovs-socket\") pod \"nmstate-handler-5thsf\" (UID: \"5b97fa25-4d3d-4664-a5fc-41c98bbd272f\") " pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.038240 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5b97fa25-4d3d-4664-a5fc-41c98bbd272f-dbus-socket\") pod \"nmstate-handler-5thsf\" (UID: \"5b97fa25-4d3d-4664-a5fc-41c98bbd272f\") " pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.038374 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf9dr\" (UniqueName: \"kubernetes.io/projected/5b97fa25-4d3d-4664-a5fc-41c98bbd272f-kube-api-access-kf9dr\") pod \"nmstate-handler-5thsf\" (UID: \"5b97fa25-4d3d-4664-a5fc-41c98bbd272f\") " pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.038467 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klxq2\" (UniqueName: \"kubernetes.io/projected/8b636ef7-4b85-4506-bb2a-f89bee9b028d-kube-api-access-klxq2\") pod \"nmstate-console-plugin-86f58fcf4-22c9p\" (UID: \"8b636ef7-4b85-4506-bb2a-f89bee9b028d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.038751 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wq8gr" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.038922 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5b97fa25-4d3d-4664-a5fc-41c98bbd272f-dbus-socket\") pod \"nmstate-handler-5thsf\" (UID: \"5b97fa25-4d3d-4664-a5fc-41c98bbd272f\") " pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.065140 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf9dr\" (UniqueName: \"kubernetes.io/projected/5b97fa25-4d3d-4664-a5fc-41c98bbd272f-kube-api-access-kf9dr\") pod \"nmstate-handler-5thsf\" (UID: \"5b97fa25-4d3d-4664-a5fc-41c98bbd272f\") " pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.098832 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-65b6b6f7c5-nrwfx"] Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.109817 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.119132 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.120737 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65b6b6f7c5-nrwfx"] Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.139835 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klxq2\" (UniqueName: \"kubernetes.io/projected/8b636ef7-4b85-4506-bb2a-f89bee9b028d-kube-api-access-klxq2\") pod \"nmstate-console-plugin-86f58fcf4-22c9p\" (UID: \"8b636ef7-4b85-4506-bb2a-f89bee9b028d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.140128 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8b636ef7-4b85-4506-bb2a-f89bee9b028d-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-22c9p\" (UID: \"8b636ef7-4b85-4506-bb2a-f89bee9b028d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.140284 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b636ef7-4b85-4506-bb2a-f89bee9b028d-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-22c9p\" (UID: \"8b636ef7-4b85-4506-bb2a-f89bee9b028d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" Mar 18 09:16:56 crc kubenswrapper[4778]: E0318 09:16:56.140512 4778 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 18 09:16:56 crc kubenswrapper[4778]: E0318 09:16:56.140661 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b636ef7-4b85-4506-bb2a-f89bee9b028d-plugin-serving-cert podName:8b636ef7-4b85-4506-bb2a-f89bee9b028d nodeName:}" failed. 
No retries permitted until 2026-03-18 09:16:56.640640583 +0000 UTC m=+883.215385433 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/8b636ef7-4b85-4506-bb2a-f89bee9b028d-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-22c9p" (UID: "8b636ef7-4b85-4506-bb2a-f89bee9b028d") : secret "plugin-serving-cert" not found Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.141095 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8b636ef7-4b85-4506-bb2a-f89bee9b028d-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-22c9p\" (UID: \"8b636ef7-4b85-4506-bb2a-f89bee9b028d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.156820 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klxq2\" (UniqueName: \"kubernetes.io/projected/8b636ef7-4b85-4506-bb2a-f89bee9b028d-kube-api-access-klxq2\") pod \"nmstate-console-plugin-86f58fcf4-22c9p\" (UID: \"8b636ef7-4b85-4506-bb2a-f89bee9b028d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.241128 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-oauth-serving-cert\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.241174 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-console-config\") pod \"console-65b6b6f7c5-nrwfx\" (UID: 
\"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.241229 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-console-serving-cert\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.241250 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsnvx\" (UniqueName: \"kubernetes.io/projected/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-kube-api-access-wsnvx\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.241297 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-console-oauth-config\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.241329 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-trusted-ca-bundle\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.241375 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-service-ca\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.260618 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-wq8gr"] Mar 18 09:16:56 crc kubenswrapper[4778]: W0318 09:16:56.267791 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71b50b27_6084_4693_acbc_d14f36759618.slice/crio-ebf7f4cde51e95f80ba790fac53e82acf099db419c398861f8469f6cc399183f WatchSource:0}: Error finding container ebf7f4cde51e95f80ba790fac53e82acf099db419c398861f8469f6cc399183f: Status 404 returned error can't find the container with id ebf7f4cde51e95f80ba790fac53e82acf099db419c398861f8469f6cc399183f Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.343428 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-oauth-serving-cert\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.343845 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-console-config\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.343906 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-console-serving-cert\") pod 
\"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.343949 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsnvx\" (UniqueName: \"kubernetes.io/projected/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-kube-api-access-wsnvx\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.344103 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-console-oauth-config\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.344180 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-trusted-ca-bundle\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.344351 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-service-ca\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.344371 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-oauth-serving-cert\") pod \"console-65b6b6f7c5-nrwfx\" 
(UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.346246 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-console-config\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.346718 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-service-ca\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.347933 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-trusted-ca-bundle\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.350122 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-console-serving-cert\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.350692 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-console-oauth-config\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " 
pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.361947 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsnvx\" (UniqueName: \"kubernetes.io/projected/6bb4ece6-9324-4551-9be5-b0d2f6b6d597-kube-api-access-wsnvx\") pod \"console-65b6b6f7c5-nrwfx\" (UID: \"6bb4ece6-9324-4551-9be5-b0d2f6b6d597\") " pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.438773 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.446073 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5961b98d-a41a-4ceb-bb71-4bf3a0fc854d-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-thw7f\" (UID: \"5961b98d-a41a-4ceb-bb71-4bf3a0fc854d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.450759 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5961b98d-a41a-4ceb-bb71-4bf3a0fc854d-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-thw7f\" (UID: \"5961b98d-a41a-4ceb-bb71-4bf3a0fc854d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.649305 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b636ef7-4b85-4506-bb2a-f89bee9b028d-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-22c9p\" (UID: \"8b636ef7-4b85-4506-bb2a-f89bee9b028d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.654515 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.655750 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b636ef7-4b85-4506-bb2a-f89bee9b028d-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-22c9p\" (UID: \"8b636ef7-4b85-4506-bb2a-f89bee9b028d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.720943 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65b6b6f7c5-nrwfx"] Mar 18 09:16:56 crc kubenswrapper[4778]: W0318 09:16:56.734590 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bb4ece6_9324_4551_9be5_b0d2f6b6d597.slice/crio-f6727198f91acacfa1507ae495cc6a99e7e02db7d492d388fe5ee3314cd21064 WatchSource:0}: Error finding container f6727198f91acacfa1507ae495cc6a99e7e02db7d492d388fe5ee3314cd21064: Status 404 returned error can't find the container with id f6727198f91acacfa1507ae495cc6a99e7e02db7d492d388fe5ee3314cd21064 Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.762857 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wq8gr" event={"ID":"71b50b27-6084-4693-acbc-d14f36759618","Type":"ContainerStarted","Data":"ebf7f4cde51e95f80ba790fac53e82acf099db419c398861f8469f6cc399183f"} Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.763621 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65b6b6f7c5-nrwfx" event={"ID":"6bb4ece6-9324-4551-9be5-b0d2f6b6d597","Type":"ContainerStarted","Data":"f6727198f91acacfa1507ae495cc6a99e7e02db7d492d388fe5ee3314cd21064"} Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.766146 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-handler-5thsf" event={"ID":"5b97fa25-4d3d-4664-a5fc-41c98bbd272f","Type":"ContainerStarted","Data":"477e73f9bcf200c7dcf2019d15261cf8e232ed521ecf42122cc54af0613cf434"} Mar 18 09:16:56 crc kubenswrapper[4778]: I0318 09:16:56.801630 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" Mar 18 09:16:57 crc kubenswrapper[4778]: I0318 09:16:57.019380 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p"] Mar 18 09:16:57 crc kubenswrapper[4778]: W0318 09:16:57.027112 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b636ef7_4b85_4506_bb2a_f89bee9b028d.slice/crio-46b4fb23f770c5e0c182ef28a909880470ccecab4e3a37c9ba0ef3f9406217be WatchSource:0}: Error finding container 46b4fb23f770c5e0c182ef28a909880470ccecab4e3a37c9ba0ef3f9406217be: Status 404 returned error can't find the container with id 46b4fb23f770c5e0c182ef28a909880470ccecab4e3a37c9ba0ef3f9406217be Mar 18 09:16:57 crc kubenswrapper[4778]: I0318 09:16:57.087730 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-thw7f"] Mar 18 09:16:57 crc kubenswrapper[4778]: W0318 09:16:57.090336 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5961b98d_a41a_4ceb_bb71_4bf3a0fc854d.slice/crio-a90915ed5b06b495028e5cd1931c6c45ba6b78135e8fa803a5fd2b12d02d6d62 WatchSource:0}: Error finding container a90915ed5b06b495028e5cd1931c6c45ba6b78135e8fa803a5fd2b12d02d6d62: Status 404 returned error can't find the container with id a90915ed5b06b495028e5cd1931c6c45ba6b78135e8fa803a5fd2b12d02d6d62 Mar 18 09:16:57 crc kubenswrapper[4778]: I0318 09:16:57.775455 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" event={"ID":"5961b98d-a41a-4ceb-bb71-4bf3a0fc854d","Type":"ContainerStarted","Data":"a90915ed5b06b495028e5cd1931c6c45ba6b78135e8fa803a5fd2b12d02d6d62"} Mar 18 09:16:57 crc kubenswrapper[4778]: I0318 09:16:57.777756 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65b6b6f7c5-nrwfx" event={"ID":"6bb4ece6-9324-4551-9be5-b0d2f6b6d597","Type":"ContainerStarted","Data":"1c806c27ee9ca32e9f9d0916a8675121329938c0c4c9193173623a419df71be7"} Mar 18 09:16:57 crc kubenswrapper[4778]: I0318 09:16:57.780005 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" event={"ID":"8b636ef7-4b85-4506-bb2a-f89bee9b028d","Type":"ContainerStarted","Data":"46b4fb23f770c5e0c182ef28a909880470ccecab4e3a37c9ba0ef3f9406217be"} Mar 18 09:16:57 crc kubenswrapper[4778]: I0318 09:16:57.812958 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-65b6b6f7c5-nrwfx" podStartSLOduration=1.8128924130000001 podStartE2EDuration="1.812892413s" podCreationTimestamp="2026-03-18 09:16:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:16:57.808495762 +0000 UTC m=+884.383240672" watchObservedRunningTime="2026-03-18 09:16:57.812892413 +0000 UTC m=+884.387637283" Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.148101 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.148447 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" 
podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.148487 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.148973 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2bce801303fcc67febe135eadcc45500477ce5c50a4d9315c95b48aad6bc9b1d"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.149019 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://2bce801303fcc67febe135eadcc45500477ce5c50a4d9315c95b48aad6bc9b1d" gracePeriod=600 Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.803856 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-5thsf" event={"ID":"5b97fa25-4d3d-4664-a5fc-41c98bbd272f","Type":"ContainerStarted","Data":"b3cf535dc0116b62d95f2699f650d59453ac02ed65b84a1ce82a040cda162d34"} Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.804260 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.806018 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" 
event={"ID":"8b636ef7-4b85-4506-bb2a-f89bee9b028d","Type":"ContainerStarted","Data":"0da7043a51a86c0b9a2a6c0f67cd3b54d94b5026b00e290015d06efa67915cb0"} Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.811119 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="2bce801303fcc67febe135eadcc45500477ce5c50a4d9315c95b48aad6bc9b1d" exitCode=0 Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.811185 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"2bce801303fcc67febe135eadcc45500477ce5c50a4d9315c95b48aad6bc9b1d"} Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.811233 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"ea61f21841c4e8176f6c189293fbb254c950982318bec3d46909c1b0e41c2f38"} Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.811258 4778 scope.go:117] "RemoveContainer" containerID="8352df71b83dce7216f889f9357e2f46d95d02444a6764ed6ea4a1748338a832" Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.813024 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" event={"ID":"5961b98d-a41a-4ceb-bb71-4bf3a0fc854d","Type":"ContainerStarted","Data":"1b8f72f0fe13159246a3839933c60ea581e784f55a5364f4519f5e862ba599aa"} Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.813241 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.851034 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-5thsf" podStartSLOduration=2.039830937 
podStartE2EDuration="5.851007877s" podCreationTimestamp="2026-03-18 09:16:55 +0000 UTC" firstStartedPulling="2026-03-18 09:16:56.143141922 +0000 UTC m=+882.717886762" lastFinishedPulling="2026-03-18 09:16:59.954318842 +0000 UTC m=+886.529063702" observedRunningTime="2026-03-18 09:17:00.825414596 +0000 UTC m=+887.400159446" watchObservedRunningTime="2026-03-18 09:17:00.851007877 +0000 UTC m=+887.425752727" Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.868402 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-22c9p" podStartSLOduration=2.9408840229999997 podStartE2EDuration="5.868379094s" podCreationTimestamp="2026-03-18 09:16:55 +0000 UTC" firstStartedPulling="2026-03-18 09:16:57.029647168 +0000 UTC m=+883.604392008" lastFinishedPulling="2026-03-18 09:16:59.957142229 +0000 UTC m=+886.531887079" observedRunningTime="2026-03-18 09:17:00.86496016 +0000 UTC m=+887.439705050" watchObservedRunningTime="2026-03-18 09:17:00.868379094 +0000 UTC m=+887.443123944" Mar 18 09:17:00 crc kubenswrapper[4778]: I0318 09:17:00.888350 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f" podStartSLOduration=3.024883796 podStartE2EDuration="5.88830957s" podCreationTimestamp="2026-03-18 09:16:55 +0000 UTC" firstStartedPulling="2026-03-18 09:16:57.09388736 +0000 UTC m=+883.668632210" lastFinishedPulling="2026-03-18 09:16:59.957313094 +0000 UTC m=+886.532057984" observedRunningTime="2026-03-18 09:17:00.882804229 +0000 UTC m=+887.457549079" watchObservedRunningTime="2026-03-18 09:17:00.88830957 +0000 UTC m=+887.463054420" Mar 18 09:17:01 crc kubenswrapper[4778]: I0318 09:17:01.532354 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 09:17:01 crc kubenswrapper[4778]: I0318 09:17:01.823579 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wq8gr" event={"ID":"71b50b27-6084-4693-acbc-d14f36759618","Type":"ContainerStarted","Data":"e98d8d4ce2b85174d6090313f532e7c6f2a36c24e05b766a90f3aa888ba6301f"} Mar 18 09:17:04 crc kubenswrapper[4778]: I0318 09:17:04.889613 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wq8gr" event={"ID":"71b50b27-6084-4693-acbc-d14f36759618","Type":"ContainerStarted","Data":"8bf22161969d45f052561ed6e4783da26e5240e98bbcbfef9362220487edf528"} Mar 18 09:17:04 crc kubenswrapper[4778]: I0318 09:17:04.909677 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wq8gr" podStartSLOduration=2.061117241 podStartE2EDuration="9.909657236s" podCreationTimestamp="2026-03-18 09:16:55 +0000 UTC" firstStartedPulling="2026-03-18 09:16:56.273531008 +0000 UTC m=+882.848275848" lastFinishedPulling="2026-03-18 09:17:04.122071003 +0000 UTC m=+890.696815843" observedRunningTime="2026-03-18 09:17:04.907065225 +0000 UTC m=+891.481810155" watchObservedRunningTime="2026-03-18 09:17:04.909657236 +0000 UTC m=+891.484402086" Mar 18 09:17:06 crc kubenswrapper[4778]: I0318 09:17:06.152763 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-5thsf" Mar 18 09:17:06 crc kubenswrapper[4778]: I0318 09:17:06.439931 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:17:06 crc kubenswrapper[4778]: I0318 09:17:06.440227 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:17:06 crc kubenswrapper[4778]: I0318 09:17:06.448873 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:17:06 crc kubenswrapper[4778]: I0318 09:17:06.913545 4778 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-console/console-65b6b6f7c5-nrwfx" Mar 18 09:17:06 crc kubenswrapper[4778]: I0318 09:17:06.982608 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-pgsqh"] Mar 18 09:17:07 crc kubenswrapper[4778]: I0318 09:17:07.796377 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cbbb4"] Mar 18 09:17:07 crc kubenswrapper[4778]: I0318 09:17:07.798016 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:07 crc kubenswrapper[4778]: I0318 09:17:07.812416 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbbb4"] Mar 18 09:17:07 crc kubenswrapper[4778]: I0318 09:17:07.834976 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e437a76c-331a-422b-aafb-036febcf9e98-utilities\") pod \"redhat-marketplace-cbbb4\" (UID: \"e437a76c-331a-422b-aafb-036febcf9e98\") " pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:07 crc kubenswrapper[4778]: I0318 09:17:07.835027 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmrmm\" (UniqueName: \"kubernetes.io/projected/e437a76c-331a-422b-aafb-036febcf9e98-kube-api-access-bmrmm\") pod \"redhat-marketplace-cbbb4\" (UID: \"e437a76c-331a-422b-aafb-036febcf9e98\") " pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:07 crc kubenswrapper[4778]: I0318 09:17:07.835399 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e437a76c-331a-422b-aafb-036febcf9e98-catalog-content\") pod \"redhat-marketplace-cbbb4\" (UID: \"e437a76c-331a-422b-aafb-036febcf9e98\") " 
pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:07 crc kubenswrapper[4778]: I0318 09:17:07.937459 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e437a76c-331a-422b-aafb-036febcf9e98-catalog-content\") pod \"redhat-marketplace-cbbb4\" (UID: \"e437a76c-331a-422b-aafb-036febcf9e98\") " pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:07 crc kubenswrapper[4778]: I0318 09:17:07.938253 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e437a76c-331a-422b-aafb-036febcf9e98-catalog-content\") pod \"redhat-marketplace-cbbb4\" (UID: \"e437a76c-331a-422b-aafb-036febcf9e98\") " pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:07 crc kubenswrapper[4778]: I0318 09:17:07.939938 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e437a76c-331a-422b-aafb-036febcf9e98-utilities\") pod \"redhat-marketplace-cbbb4\" (UID: \"e437a76c-331a-422b-aafb-036febcf9e98\") " pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:07 crc kubenswrapper[4778]: I0318 09:17:07.940542 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e437a76c-331a-422b-aafb-036febcf9e98-utilities\") pod \"redhat-marketplace-cbbb4\" (UID: \"e437a76c-331a-422b-aafb-036febcf9e98\") " pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:07 crc kubenswrapper[4778]: I0318 09:17:07.941006 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmrmm\" (UniqueName: \"kubernetes.io/projected/e437a76c-331a-422b-aafb-036febcf9e98-kube-api-access-bmrmm\") pod \"redhat-marketplace-cbbb4\" (UID: \"e437a76c-331a-422b-aafb-036febcf9e98\") " pod="openshift-marketplace/redhat-marketplace-cbbb4" 
Mar 18 09:17:07 crc kubenswrapper[4778]: I0318 09:17:07.964759 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmrmm\" (UniqueName: \"kubernetes.io/projected/e437a76c-331a-422b-aafb-036febcf9e98-kube-api-access-bmrmm\") pod \"redhat-marketplace-cbbb4\" (UID: \"e437a76c-331a-422b-aafb-036febcf9e98\") " pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:08 crc kubenswrapper[4778]: I0318 09:17:08.116063 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cbbb4" Mar 18 09:17:08 crc kubenswrapper[4778]: I0318 09:17:08.360130 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbbb4"] Mar 18 09:17:08 crc kubenswrapper[4778]: W0318 09:17:08.369237 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode437a76c_331a_422b_aafb_036febcf9e98.slice/crio-9dc755ad4dc74d5e7bbf029e4b451ae2af505c3ba4a46b3dc50587de84cf2990 WatchSource:0}: Error finding container 9dc755ad4dc74d5e7bbf029e4b451ae2af505c3ba4a46b3dc50587de84cf2990: Status 404 returned error can't find the container with id 9dc755ad4dc74d5e7bbf029e4b451ae2af505c3ba4a46b3dc50587de84cf2990 Mar 18 09:17:08 crc kubenswrapper[4778]: I0318 09:17:08.922724 4778 generic.go:334] "Generic (PLEG): container finished" podID="e437a76c-331a-422b-aafb-036febcf9e98" containerID="eb53657ad8c522a1512ef587e4bf9fb4d63a0ed9a8347cd7d191325b4cade7fd" exitCode=0 Mar 18 09:17:08 crc kubenswrapper[4778]: I0318 09:17:08.922767 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbbb4" event={"ID":"e437a76c-331a-422b-aafb-036febcf9e98","Type":"ContainerDied","Data":"eb53657ad8c522a1512ef587e4bf9fb4d63a0ed9a8347cd7d191325b4cade7fd"} Mar 18 09:17:08 crc kubenswrapper[4778]: I0318 09:17:08.922791 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-cbbb4" event={"ID":"e437a76c-331a-422b-aafb-036febcf9e98","Type":"ContainerStarted","Data":"9dc755ad4dc74d5e7bbf029e4b451ae2af505c3ba4a46b3dc50587de84cf2990"} Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.188401 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sxg2m"] Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.190589 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.219401 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sxg2m"] Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.301608 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-catalog-content\") pod \"certified-operators-sxg2m\" (UID: \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\") " pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.301728 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-utilities\") pod \"certified-operators-sxg2m\" (UID: \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\") " pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.301791 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pn5q\" (UniqueName: \"kubernetes.io/projected/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-kube-api-access-6pn5q\") pod \"certified-operators-sxg2m\" (UID: \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\") " pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 
09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.403177 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-catalog-content\") pod \"certified-operators-sxg2m\" (UID: \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\") " pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.403255 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-utilities\") pod \"certified-operators-sxg2m\" (UID: \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\") " pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.403289 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pn5q\" (UniqueName: \"kubernetes.io/projected/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-kube-api-access-6pn5q\") pod \"certified-operators-sxg2m\" (UID: \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\") " pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.404329 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-catalog-content\") pod \"certified-operators-sxg2m\" (UID: \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\") " pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.404376 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-utilities\") pod \"certified-operators-sxg2m\" (UID: \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\") " pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 
09:17:11.427096 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pn5q\" (UniqueName: \"kubernetes.io/projected/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-kube-api-access-6pn5q\") pod \"certified-operators-sxg2m\" (UID: \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\") " pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.535301 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sxg2m" Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.775932 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sxg2m"] Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.946922 4778 generic.go:334] "Generic (PLEG): container finished" podID="e437a76c-331a-422b-aafb-036febcf9e98" containerID="3ab459c5a84489cc0cf575ebc99741f36b71731a69a3a44ff6e1e908620bdd09" exitCode=0 Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.947035 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbbb4" event={"ID":"e437a76c-331a-422b-aafb-036febcf9e98","Type":"ContainerDied","Data":"3ab459c5a84489cc0cf575ebc99741f36b71731a69a3a44ff6e1e908620bdd09"} Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.948249 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxg2m" event={"ID":"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d","Type":"ContainerStarted","Data":"ced0fd002c04979367e8f1e6bf5be732351bdda20cc30c52c5172e5dabdeae8c"} Mar 18 09:17:11 crc kubenswrapper[4778]: I0318 09:17:11.948290 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxg2m" event={"ID":"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d","Type":"ContainerStarted","Data":"1e5d40c4001d2f6677961b892e3d7f1ae3ee8cd89360a0d7ba77365018ec7e6f"} Mar 18 09:17:12 crc kubenswrapper[4778]: I0318 
09:17:12.957680 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbbb4" event={"ID":"e437a76c-331a-422b-aafb-036febcf9e98","Type":"ContainerStarted","Data":"a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588"} Mar 18 09:17:12 crc kubenswrapper[4778]: I0318 09:17:12.960088 4778 generic.go:334] "Generic (PLEG): container finished" podID="dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" containerID="ced0fd002c04979367e8f1e6bf5be732351bdda20cc30c52c5172e5dabdeae8c" exitCode=0 Mar 18 09:17:12 crc kubenswrapper[4778]: I0318 09:17:12.960116 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxg2m" event={"ID":"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d","Type":"ContainerDied","Data":"ced0fd002c04979367e8f1e6bf5be732351bdda20cc30c52c5172e5dabdeae8c"} Mar 18 09:17:12 crc kubenswrapper[4778]: I0318 09:17:12.983627 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cbbb4" podStartSLOduration=2.547354857 podStartE2EDuration="5.983606663s" podCreationTimestamp="2026-03-18 09:17:07 +0000 UTC" firstStartedPulling="2026-03-18 09:17:08.924556044 +0000 UTC m=+895.499300874" lastFinishedPulling="2026-03-18 09:17:12.36080784 +0000 UTC m=+898.935552680" observedRunningTime="2026-03-18 09:17:12.979936953 +0000 UTC m=+899.554681813" watchObservedRunningTime="2026-03-18 09:17:12.983606663 +0000 UTC m=+899.558351513" Mar 18 09:17:13 crc kubenswrapper[4778]: I0318 09:17:13.969164 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxg2m" event={"ID":"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d","Type":"ContainerStarted","Data":"a27df04a0381f4e82a650eddddd1d51bfcb4886acc5b6d801c9c9062b58f39ec"} Mar 18 09:17:14 crc kubenswrapper[4778]: I0318 09:17:14.981825 4778 generic.go:334] "Generic (PLEG): container finished" podID="dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" 
containerID="a27df04a0381f4e82a650eddddd1d51bfcb4886acc5b6d801c9c9062b58f39ec" exitCode=0
Mar 18 09:17:14 crc kubenswrapper[4778]: I0318 09:17:14.981910 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxg2m" event={"ID":"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d","Type":"ContainerDied","Data":"a27df04a0381f4e82a650eddddd1d51bfcb4886acc5b6d801c9c9062b58f39ec"}
Mar 18 09:17:16 crc kubenswrapper[4778]: I0318 09:17:15.999125 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxg2m" event={"ID":"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d","Type":"ContainerStarted","Data":"9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187"}
Mar 18 09:17:16 crc kubenswrapper[4778]: I0318 09:17:16.033179 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sxg2m" podStartSLOduration=2.595827277 podStartE2EDuration="5.033151412s" podCreationTimestamp="2026-03-18 09:17:11 +0000 UTC" firstStartedPulling="2026-03-18 09:17:12.96162923 +0000 UTC m=+899.536374070" lastFinishedPulling="2026-03-18 09:17:15.398953325 +0000 UTC m=+901.973698205" observedRunningTime="2026-03-18 09:17:16.025512982 +0000 UTC m=+902.600257912" watchObservedRunningTime="2026-03-18 09:17:16.033151412 +0000 UTC m=+902.607896292"
Mar 18 09:17:16 crc kubenswrapper[4778]: I0318 09:17:16.663056 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-thw7f"
Mar 18 09:17:18 crc kubenswrapper[4778]: I0318 09:17:18.116566 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cbbb4"
Mar 18 09:17:18 crc kubenswrapper[4778]: I0318 09:17:18.116665 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cbbb4"
Mar 18 09:17:18 crc kubenswrapper[4778]: I0318 09:17:18.186020 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cbbb4"
Mar 18 09:17:19 crc kubenswrapper[4778]: I0318 09:17:19.086514 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cbbb4"
Mar 18 09:17:19 crc kubenswrapper[4778]: I0318 09:17:19.372535 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbbb4"]
Mar 18 09:17:19 crc kubenswrapper[4778]: I0318 09:17:19.788950 4778 scope.go:117] "RemoveContainer" containerID="c5bd546fb47bde264ad4459aced4ba49381ccd5bb127c64ac227483b8bb621c0"
Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.032180 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cbbb4" podUID="e437a76c-331a-422b-aafb-036febcf9e98" containerName="registry-server" containerID="cri-o://a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588" gracePeriod=2
Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.505152 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cbbb4"
Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.535540 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sxg2m"
Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.535647 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sxg2m"
Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.558648 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e437a76c-331a-422b-aafb-036febcf9e98-catalog-content\") pod \"e437a76c-331a-422b-aafb-036febcf9e98\" (UID: \"e437a76c-331a-422b-aafb-036febcf9e98\") "
Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.559471 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmrmm\" (UniqueName: \"kubernetes.io/projected/e437a76c-331a-422b-aafb-036febcf9e98-kube-api-access-bmrmm\") pod \"e437a76c-331a-422b-aafb-036febcf9e98\" (UID: \"e437a76c-331a-422b-aafb-036febcf9e98\") "
Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.559535 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e437a76c-331a-422b-aafb-036febcf9e98-utilities\") pod \"e437a76c-331a-422b-aafb-036febcf9e98\" (UID: \"e437a76c-331a-422b-aafb-036febcf9e98\") "
Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.561730 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e437a76c-331a-422b-aafb-036febcf9e98-utilities" (OuterVolumeSpecName: "utilities") pod "e437a76c-331a-422b-aafb-036febcf9e98" (UID: "e437a76c-331a-422b-aafb-036febcf9e98"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.570225 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e437a76c-331a-422b-aafb-036febcf9e98-kube-api-access-bmrmm" (OuterVolumeSpecName: "kube-api-access-bmrmm") pod "e437a76c-331a-422b-aafb-036febcf9e98" (UID: "e437a76c-331a-422b-aafb-036febcf9e98"). InnerVolumeSpecName "kube-api-access-bmrmm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.607368 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sxg2m"
Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.610279 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e437a76c-331a-422b-aafb-036febcf9e98-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e437a76c-331a-422b-aafb-036febcf9e98" (UID: "e437a76c-331a-422b-aafb-036febcf9e98"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.662519 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e437a76c-331a-422b-aafb-036febcf9e98-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.662567 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmrmm\" (UniqueName: \"kubernetes.io/projected/e437a76c-331a-422b-aafb-036febcf9e98-kube-api-access-bmrmm\") on node \"crc\" DevicePath \"\""
Mar 18 09:17:21 crc kubenswrapper[4778]: I0318 09:17:21.662586 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e437a76c-331a-422b-aafb-036febcf9e98-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.044008 4778 generic.go:334] "Generic (PLEG): container finished" podID="e437a76c-331a-422b-aafb-036febcf9e98" containerID="a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588" exitCode=0
Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.044127 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbbb4" event={"ID":"e437a76c-331a-422b-aafb-036febcf9e98","Type":"ContainerDied","Data":"a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588"}
Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.044112 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cbbb4"
Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.044219 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbbb4" event={"ID":"e437a76c-331a-422b-aafb-036febcf9e98","Type":"ContainerDied","Data":"9dc755ad4dc74d5e7bbf029e4b451ae2af505c3ba4a46b3dc50587de84cf2990"}
Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.044255 4778 scope.go:117] "RemoveContainer" containerID="a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588"
Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.072001 4778 scope.go:117] "RemoveContainer" containerID="3ab459c5a84489cc0cf575ebc99741f36b71731a69a3a44ff6e1e908620bdd09"
Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.085405 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbbb4"]
Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.089744 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbbb4"]
Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.122271 4778 scope.go:117] "RemoveContainer" containerID="eb53657ad8c522a1512ef587e4bf9fb4d63a0ed9a8347cd7d191325b4cade7fd"
Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.153898 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sxg2m"
Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.166037 4778 scope.go:117] "RemoveContainer" containerID="a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588"
Mar 18 09:17:22 crc kubenswrapper[4778]: E0318 09:17:22.168619 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588\": container with ID starting with a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588 not found: ID does not exist" containerID="a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588"
Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.168667 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588"} err="failed to get container status \"a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588\": rpc error: code = NotFound desc = could not find container \"a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588\": container with ID starting with a59ffd9bca5facfbc40ba7cf2e39ab6b2254506ee7ba7d209f7123e982ff6588 not found: ID does not exist"
Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.168713 4778 scope.go:117] "RemoveContainer" containerID="3ab459c5a84489cc0cf575ebc99741f36b71731a69a3a44ff6e1e908620bdd09"
Mar 18 09:17:22 crc kubenswrapper[4778]: E0318 09:17:22.169082 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ab459c5a84489cc0cf575ebc99741f36b71731a69a3a44ff6e1e908620bdd09\": container with ID starting with 3ab459c5a84489cc0cf575ebc99741f36b71731a69a3a44ff6e1e908620bdd09 not found: ID does not exist" containerID="3ab459c5a84489cc0cf575ebc99741f36b71731a69a3a44ff6e1e908620bdd09"
Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.169142 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ab459c5a84489cc0cf575ebc99741f36b71731a69a3a44ff6e1e908620bdd09"} err="failed to get container status \"3ab459c5a84489cc0cf575ebc99741f36b71731a69a3a44ff6e1e908620bdd09\": rpc error: code = NotFound desc = could not find container \"3ab459c5a84489cc0cf575ebc99741f36b71731a69a3a44ff6e1e908620bdd09\": container with ID starting with 3ab459c5a84489cc0cf575ebc99741f36b71731a69a3a44ff6e1e908620bdd09 not found: ID does not exist"
Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.169174 4778 scope.go:117] "RemoveContainer" containerID="eb53657ad8c522a1512ef587e4bf9fb4d63a0ed9a8347cd7d191325b4cade7fd"
Mar 18 09:17:22 crc kubenswrapper[4778]: E0318 09:17:22.169600 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb53657ad8c522a1512ef587e4bf9fb4d63a0ed9a8347cd7d191325b4cade7fd\": container with ID starting with eb53657ad8c522a1512ef587e4bf9fb4d63a0ed9a8347cd7d191325b4cade7fd not found: ID does not exist" containerID="eb53657ad8c522a1512ef587e4bf9fb4d63a0ed9a8347cd7d191325b4cade7fd"
Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.169633 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb53657ad8c522a1512ef587e4bf9fb4d63a0ed9a8347cd7d191325b4cade7fd"} err="failed to get container status \"eb53657ad8c522a1512ef587e4bf9fb4d63a0ed9a8347cd7d191325b4cade7fd\": rpc error: code = NotFound desc = could not find container \"eb53657ad8c522a1512ef587e4bf9fb4d63a0ed9a8347cd7d191325b4cade7fd\": container with ID starting with eb53657ad8c522a1512ef587e4bf9fb4d63a0ed9a8347cd7d191325b4cade7fd not found: ID does not exist"
Mar 18 09:17:22 crc kubenswrapper[4778]: I0318 09:17:22.194400 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e437a76c-331a-422b-aafb-036febcf9e98" path="/var/lib/kubelet/pods/e437a76c-331a-422b-aafb-036febcf9e98/volumes"
Mar 18 09:17:23 crc kubenswrapper[4778]: I0318 09:17:23.371984 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sxg2m"]
Mar 18 09:17:24 crc kubenswrapper[4778]: I0318 09:17:24.058697 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sxg2m" podUID="dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" containerName="registry-server" containerID="cri-o://9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187" gracePeriod=2
Mar 18 09:17:24 crc kubenswrapper[4778]: I0318 09:17:24.573854 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sxg2m"
Mar 18 09:17:24 crc kubenswrapper[4778]: I0318 09:17:24.606532 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-catalog-content\") pod \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\" (UID: \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\") "
Mar 18 09:17:24 crc kubenswrapper[4778]: I0318 09:17:24.606581 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-utilities\") pod \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\" (UID: \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\") "
Mar 18 09:17:24 crc kubenswrapper[4778]: I0318 09:17:24.606697 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pn5q\" (UniqueName: \"kubernetes.io/projected/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-kube-api-access-6pn5q\") pod \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\" (UID: \"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d\") "
Mar 18 09:17:24 crc kubenswrapper[4778]: I0318 09:17:24.608087 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-utilities" (OuterVolumeSpecName: "utilities") pod "dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" (UID: "dc0f993f-9cd4-479c-b61f-e2eed4a69c2d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:17:24 crc kubenswrapper[4778]: I0318 09:17:24.615625 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-kube-api-access-6pn5q" (OuterVolumeSpecName: "kube-api-access-6pn5q") pod "dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" (UID: "dc0f993f-9cd4-479c-b61f-e2eed4a69c2d"). InnerVolumeSpecName "kube-api-access-6pn5q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:17:24 crc kubenswrapper[4778]: I0318 09:17:24.707539 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 09:17:24 crc kubenswrapper[4778]: I0318 09:17:24.707574 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pn5q\" (UniqueName: \"kubernetes.io/projected/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-kube-api-access-6pn5q\") on node \"crc\" DevicePath \"\""
Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.068364 4778 generic.go:334] "Generic (PLEG): container finished" podID="dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" containerID="9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187" exitCode=0
Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.068444 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxg2m" event={"ID":"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d","Type":"ContainerDied","Data":"9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187"}
Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.068520 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxg2m" event={"ID":"dc0f993f-9cd4-479c-b61f-e2eed4a69c2d","Type":"ContainerDied","Data":"1e5d40c4001d2f6677961b892e3d7f1ae3ee8cd89360a0d7ba77365018ec7e6f"}
Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.068587 4778 scope.go:117] "RemoveContainer" containerID="9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187"
Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.068613 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sxg2m"
Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.099720 4778 scope.go:117] "RemoveContainer" containerID="a27df04a0381f4e82a650eddddd1d51bfcb4886acc5b6d801c9c9062b58f39ec"
Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.122899 4778 scope.go:117] "RemoveContainer" containerID="ced0fd002c04979367e8f1e6bf5be732351bdda20cc30c52c5172e5dabdeae8c"
Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.165118 4778 scope.go:117] "RemoveContainer" containerID="9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187"
Mar 18 09:17:25 crc kubenswrapper[4778]: E0318 09:17:25.165788 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187\": container with ID starting with 9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187 not found: ID does not exist" containerID="9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187"
Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.165842 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187"} err="failed to get container status \"9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187\": rpc error: code = NotFound desc = could not find container \"9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187\": container with ID starting with 9faaf67e44801c68a3efe2cd311f6c9922480f97b822f29e82d063f4a93b0187 not found: ID does not exist"
Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.165879 4778 scope.go:117] "RemoveContainer" containerID="a27df04a0381f4e82a650eddddd1d51bfcb4886acc5b6d801c9c9062b58f39ec"
Mar 18 09:17:25 crc kubenswrapper[4778]: E0318 09:17:25.166620 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a27df04a0381f4e82a650eddddd1d51bfcb4886acc5b6d801c9c9062b58f39ec\": container with ID starting with a27df04a0381f4e82a650eddddd1d51bfcb4886acc5b6d801c9c9062b58f39ec not found: ID does not exist" containerID="a27df04a0381f4e82a650eddddd1d51bfcb4886acc5b6d801c9c9062b58f39ec"
Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.166680 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a27df04a0381f4e82a650eddddd1d51bfcb4886acc5b6d801c9c9062b58f39ec"} err="failed to get container status \"a27df04a0381f4e82a650eddddd1d51bfcb4886acc5b6d801c9c9062b58f39ec\": rpc error: code = NotFound desc = could not find container \"a27df04a0381f4e82a650eddddd1d51bfcb4886acc5b6d801c9c9062b58f39ec\": container with ID starting with a27df04a0381f4e82a650eddddd1d51bfcb4886acc5b6d801c9c9062b58f39ec not found: ID does not exist"
Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.166732 4778 scope.go:117] "RemoveContainer" containerID="ced0fd002c04979367e8f1e6bf5be732351bdda20cc30c52c5172e5dabdeae8c"
Mar 18 09:17:25 crc kubenswrapper[4778]: E0318 09:17:25.167137 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ced0fd002c04979367e8f1e6bf5be732351bdda20cc30c52c5172e5dabdeae8c\": container with ID starting with ced0fd002c04979367e8f1e6bf5be732351bdda20cc30c52c5172e5dabdeae8c not found: ID does not exist" containerID="ced0fd002c04979367e8f1e6bf5be732351bdda20cc30c52c5172e5dabdeae8c"
Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.167188 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ced0fd002c04979367e8f1e6bf5be732351bdda20cc30c52c5172e5dabdeae8c"} err="failed to get container status \"ced0fd002c04979367e8f1e6bf5be732351bdda20cc30c52c5172e5dabdeae8c\": rpc error: code = NotFound desc = could not find container \"ced0fd002c04979367e8f1e6bf5be732351bdda20cc30c52c5172e5dabdeae8c\": container with ID starting with ced0fd002c04979367e8f1e6bf5be732351bdda20cc30c52c5172e5dabdeae8c not found: ID does not exist"
Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.183255 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" (UID: "dc0f993f-9cd4-479c-b61f-e2eed4a69c2d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.215711 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.430959 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sxg2m"]
Mar 18 09:17:25 crc kubenswrapper[4778]: I0318 09:17:25.438418 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sxg2m"]
Mar 18 09:17:26 crc kubenswrapper[4778]: I0318 09:17:26.199880 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" path="/var/lib/kubelet/pods/dc0f993f-9cd4-479c-b61f-e2eed4a69c2d/volumes"
Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.052836 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-pgsqh" podUID="5f875d21-ddf2-4d41-8be3-819c8836824a" containerName="console" containerID="cri-o://f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0" gracePeriod=15
Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.479380 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-pgsqh_5f875d21-ddf2-4d41-8be3-819c8836824a/console/0.log"
Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.479705 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-pgsqh"
Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.557056 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-trusted-ca-bundle\") pod \"5f875d21-ddf2-4d41-8be3-819c8836824a\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") "
Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.557135 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-console-config\") pod \"5f875d21-ddf2-4d41-8be3-819c8836824a\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") "
Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.557189 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7qw9\" (UniqueName: \"kubernetes.io/projected/5f875d21-ddf2-4d41-8be3-819c8836824a-kube-api-access-z7qw9\") pod \"5f875d21-ddf2-4d41-8be3-819c8836824a\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") "
Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.557228 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-oauth-serving-cert\") pod \"5f875d21-ddf2-4d41-8be3-819c8836824a\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") "
Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.557248 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-service-ca\") pod \"5f875d21-ddf2-4d41-8be3-819c8836824a\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") "
Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.557272 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f875d21-ddf2-4d41-8be3-819c8836824a-console-serving-cert\") pod \"5f875d21-ddf2-4d41-8be3-819c8836824a\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") "
Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.557324 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f875d21-ddf2-4d41-8be3-819c8836824a-console-oauth-config\") pod \"5f875d21-ddf2-4d41-8be3-819c8836824a\" (UID: \"5f875d21-ddf2-4d41-8be3-819c8836824a\") "
Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.558145 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5f875d21-ddf2-4d41-8be3-819c8836824a" (UID: "5f875d21-ddf2-4d41-8be3-819c8836824a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.558438 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-console-config" (OuterVolumeSpecName: "console-config") pod "5f875d21-ddf2-4d41-8be3-819c8836824a" (UID: "5f875d21-ddf2-4d41-8be3-819c8836824a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.558496 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-service-ca" (OuterVolumeSpecName: "service-ca") pod "5f875d21-ddf2-4d41-8be3-819c8836824a" (UID: "5f875d21-ddf2-4d41-8be3-819c8836824a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.559168 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5f875d21-ddf2-4d41-8be3-819c8836824a" (UID: "5f875d21-ddf2-4d41-8be3-819c8836824a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.566700 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f875d21-ddf2-4d41-8be3-819c8836824a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5f875d21-ddf2-4d41-8be3-819c8836824a" (UID: "5f875d21-ddf2-4d41-8be3-819c8836824a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.567421 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f875d21-ddf2-4d41-8be3-819c8836824a-kube-api-access-z7qw9" (OuterVolumeSpecName: "kube-api-access-z7qw9") pod "5f875d21-ddf2-4d41-8be3-819c8836824a" (UID: "5f875d21-ddf2-4d41-8be3-819c8836824a"). InnerVolumeSpecName "kube-api-access-z7qw9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.567706 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f875d21-ddf2-4d41-8be3-819c8836824a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5f875d21-ddf2-4d41-8be3-819c8836824a" (UID: "5f875d21-ddf2-4d41-8be3-819c8836824a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.658447 4778 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-console-config\") on node \"crc\" DevicePath \"\""
Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.658484 4778 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.658497 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7qw9\" (UniqueName: \"kubernetes.io/projected/5f875d21-ddf2-4d41-8be3-819c8836824a-kube-api-access-z7qw9\") on node \"crc\" DevicePath \"\""
Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.658511 4778 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-service-ca\") on node \"crc\" DevicePath \"\""
Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.658523 4778 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f875d21-ddf2-4d41-8be3-819c8836824a-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.658534 4778 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f875d21-ddf2-4d41-8be3-819c8836824a-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 18 09:17:32 crc kubenswrapper[4778]: I0318 09:17:32.658546 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f875d21-ddf2-4d41-8be3-819c8836824a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.131804 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-pgsqh_5f875d21-ddf2-4d41-8be3-819c8836824a/console/0.log"
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.131874 4778 generic.go:334] "Generic (PLEG): container finished" podID="5f875d21-ddf2-4d41-8be3-819c8836824a" containerID="f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0" exitCode=2
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.131910 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pgsqh" event={"ID":"5f875d21-ddf2-4d41-8be3-819c8836824a","Type":"ContainerDied","Data":"f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0"}
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.131954 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pgsqh" event={"ID":"5f875d21-ddf2-4d41-8be3-819c8836824a","Type":"ContainerDied","Data":"0526269f3752c495953fd88d5da903a92103220f8039ec4c7dde34390b5f6401"}
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.131961 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-pgsqh"
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.131989 4778 scope.go:117] "RemoveContainer" containerID="f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0"
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.170632 4778 scope.go:117] "RemoveContainer" containerID="f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0"
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.170834 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-pgsqh"]
Mar 18 09:17:33 crc kubenswrapper[4778]: E0318 09:17:33.171630 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0\": container with ID starting with f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0 not found: ID does not exist" containerID="f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0"
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.171711 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0"} err="failed to get container status \"f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0\": rpc error: code = NotFound desc = could not find container \"f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0\": container with ID starting with f26ec07ea16e6b22e59d48eb7d138d577170cdc566099b6f47750b5be1b5b4c0 not found: ID does not exist"
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.177822 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-pgsqh"]
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.376090 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl"]
Mar 18 09:17:33 crc kubenswrapper[4778]: E0318 09:17:33.376638 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e437a76c-331a-422b-aafb-036febcf9e98" containerName="extract-utilities"
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.376651 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e437a76c-331a-422b-aafb-036febcf9e98" containerName="extract-utilities"
Mar 18 09:17:33 crc kubenswrapper[4778]: E0318 09:17:33.376663 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e437a76c-331a-422b-aafb-036febcf9e98" containerName="extract-content"
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.376669 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e437a76c-331a-422b-aafb-036febcf9e98" containerName="extract-content"
Mar 18 09:17:33 crc kubenswrapper[4778]: E0318 09:17:33.376678 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" containerName="extract-content"
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.376684 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" containerName="extract-content"
Mar 18 09:17:33 crc kubenswrapper[4778]: E0318 09:17:33.376696 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" containerName="registry-server"
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.376702 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" containerName="registry-server"
Mar 18 09:17:33 crc kubenswrapper[4778]: E0318 09:17:33.376713 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" containerName="extract-utilities"
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.376719 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" containerName="extract-utilities"
Mar 18 09:17:33 crc kubenswrapper[4778]: E0318 09:17:33.376728 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f875d21-ddf2-4d41-8be3-819c8836824a" containerName="console"
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.376736 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f875d21-ddf2-4d41-8be3-819c8836824a" containerName="console"
Mar 18 09:17:33 crc kubenswrapper[4778]: E0318 09:17:33.376745 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e437a76c-331a-422b-aafb-036febcf9e98" containerName="registry-server"
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.376751 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e437a76c-331a-422b-aafb-036febcf9e98" containerName="registry-server"
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.376838 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc0f993f-9cd4-479c-b61f-e2eed4a69c2d" containerName="registry-server"
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.376848 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e437a76c-331a-422b-aafb-036febcf9e98" containerName="registry-server"
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.376859 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f875d21-ddf2-4d41-8be3-819c8836824a" containerName="console"
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.377605 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl"
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.380336 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.396732 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl"]
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.473564 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2416fdd2-138d-4320-8ff6-47f621e093a9-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl\" (UID: \"2416fdd2-138d-4320-8ff6-47f621e093a9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl"
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.473647 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2416fdd2-138d-4320-8ff6-47f621e093a9-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl\" (UID: \"2416fdd2-138d-4320-8ff6-47f621e093a9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl"
Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.473677 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh6hm\" (UniqueName: \"kubernetes.io/projected/2416fdd2-138d-4320-8ff6-47f621e093a9-kube-api-access-hh6hm\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl\" (UID: \"2416fdd2-138d-4320-8ff6-47f621e093a9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl"
Mar 18 09:17:33 crc kubenswrapper[4778]:
I0318 09:17:33.574707 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2416fdd2-138d-4320-8ff6-47f621e093a9-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl\" (UID: \"2416fdd2-138d-4320-8ff6-47f621e093a9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.574757 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh6hm\" (UniqueName: \"kubernetes.io/projected/2416fdd2-138d-4320-8ff6-47f621e093a9-kube-api-access-hh6hm\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl\" (UID: \"2416fdd2-138d-4320-8ff6-47f621e093a9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.574838 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2416fdd2-138d-4320-8ff6-47f621e093a9-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl\" (UID: \"2416fdd2-138d-4320-8ff6-47f621e093a9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.575374 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2416fdd2-138d-4320-8ff6-47f621e093a9-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl\" (UID: \"2416fdd2-138d-4320-8ff6-47f621e093a9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.575499 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/2416fdd2-138d-4320-8ff6-47f621e093a9-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl\" (UID: \"2416fdd2-138d-4320-8ff6-47f621e093a9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.600422 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh6hm\" (UniqueName: \"kubernetes.io/projected/2416fdd2-138d-4320-8ff6-47f621e093a9-kube-api-access-hh6hm\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl\" (UID: \"2416fdd2-138d-4320-8ff6-47f621e093a9\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.697314 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" Mar 18 09:17:33 crc kubenswrapper[4778]: I0318 09:17:33.935506 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl"] Mar 18 09:17:34 crc kubenswrapper[4778]: I0318 09:17:34.141445 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" event={"ID":"2416fdd2-138d-4320-8ff6-47f621e093a9","Type":"ContainerStarted","Data":"ab0f6ddf348551a9eea8ce5154623b62b1e5f456495ce97d032488c506954ea0"} Mar 18 09:17:34 crc kubenswrapper[4778]: I0318 09:17:34.141494 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" event={"ID":"2416fdd2-138d-4320-8ff6-47f621e093a9","Type":"ContainerStarted","Data":"4af378aa95ab5264d0f168d006956f0deb2cbc8707b67b859997751f0ca58b65"} Mar 18 09:17:34 crc kubenswrapper[4778]: I0318 09:17:34.196090 4778 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f875d21-ddf2-4d41-8be3-819c8836824a" path="/var/lib/kubelet/pods/5f875d21-ddf2-4d41-8be3-819c8836824a/volumes" Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.151539 4778 generic.go:334] "Generic (PLEG): container finished" podID="2416fdd2-138d-4320-8ff6-47f621e093a9" containerID="ab0f6ddf348551a9eea8ce5154623b62b1e5f456495ce97d032488c506954ea0" exitCode=0 Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.151614 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" event={"ID":"2416fdd2-138d-4320-8ff6-47f621e093a9","Type":"ContainerDied","Data":"ab0f6ddf348551a9eea8ce5154623b62b1e5f456495ce97d032488c506954ea0"} Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.490339 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hdvpq"] Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.491582 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.506289 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hdvpq"] Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.612758 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8418cdea-ff67-4e52-acf8-39176b7f0cb6-utilities\") pod \"redhat-operators-hdvpq\" (UID: \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\") " pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.612843 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8418cdea-ff67-4e52-acf8-39176b7f0cb6-catalog-content\") pod \"redhat-operators-hdvpq\" (UID: \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\") " pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.612887 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcmnt\" (UniqueName: \"kubernetes.io/projected/8418cdea-ff67-4e52-acf8-39176b7f0cb6-kube-api-access-dcmnt\") pod \"redhat-operators-hdvpq\" (UID: \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\") " pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.714335 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8418cdea-ff67-4e52-acf8-39176b7f0cb6-utilities\") pod \"redhat-operators-hdvpq\" (UID: \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\") " pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.714403 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8418cdea-ff67-4e52-acf8-39176b7f0cb6-catalog-content\") pod \"redhat-operators-hdvpq\" (UID: \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\") " pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.714434 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcmnt\" (UniqueName: \"kubernetes.io/projected/8418cdea-ff67-4e52-acf8-39176b7f0cb6-kube-api-access-dcmnt\") pod \"redhat-operators-hdvpq\" (UID: \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\") " pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.714970 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8418cdea-ff67-4e52-acf8-39176b7f0cb6-utilities\") pod \"redhat-operators-hdvpq\" (UID: \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\") " pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.715011 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8418cdea-ff67-4e52-acf8-39176b7f0cb6-catalog-content\") pod \"redhat-operators-hdvpq\" (UID: \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\") " pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.747229 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcmnt\" (UniqueName: \"kubernetes.io/projected/8418cdea-ff67-4e52-acf8-39176b7f0cb6-kube-api-access-dcmnt\") pod \"redhat-operators-hdvpq\" (UID: \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\") " pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:35 crc kubenswrapper[4778]: I0318 09:17:35.859638 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:36 crc kubenswrapper[4778]: I0318 09:17:36.340294 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hdvpq"] Mar 18 09:17:37 crc kubenswrapper[4778]: I0318 09:17:37.167290 4778 generic.go:334] "Generic (PLEG): container finished" podID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" containerID="20a8b537e7dac12b21fa49a35154f0d7c9b7d684dfba9b6c632fec89fada6543" exitCode=0 Mar 18 09:17:37 crc kubenswrapper[4778]: I0318 09:17:37.167375 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdvpq" event={"ID":"8418cdea-ff67-4e52-acf8-39176b7f0cb6","Type":"ContainerDied","Data":"20a8b537e7dac12b21fa49a35154f0d7c9b7d684dfba9b6c632fec89fada6543"} Mar 18 09:17:37 crc kubenswrapper[4778]: I0318 09:17:37.167646 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdvpq" event={"ID":"8418cdea-ff67-4e52-acf8-39176b7f0cb6","Type":"ContainerStarted","Data":"a12438818ab44535475b9d63ec863fde1df42e8be31733d20b680eaccab2c7df"} Mar 18 09:17:37 crc kubenswrapper[4778]: I0318 09:17:37.173879 4778 generic.go:334] "Generic (PLEG): container finished" podID="2416fdd2-138d-4320-8ff6-47f621e093a9" containerID="b5f3877e0e967c7533d9431944ceb301fe3aefbefe8a0404e2d89e1b25820483" exitCode=0 Mar 18 09:17:37 crc kubenswrapper[4778]: I0318 09:17:37.173910 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" event={"ID":"2416fdd2-138d-4320-8ff6-47f621e093a9","Type":"ContainerDied","Data":"b5f3877e0e967c7533d9431944ceb301fe3aefbefe8a0404e2d89e1b25820483"} Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.187580 4778 generic.go:334] "Generic (PLEG): container finished" podID="2416fdd2-138d-4320-8ff6-47f621e093a9" 
containerID="a8efadf3c7201d7501041a28e32c15d23bcfe705d0dc7e9826ae666b7fcc554b" exitCode=0 Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.201617 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" event={"ID":"2416fdd2-138d-4320-8ff6-47f621e093a9","Type":"ContainerDied","Data":"a8efadf3c7201d7501041a28e32c15d23bcfe705d0dc7e9826ae666b7fcc554b"} Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.704508 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m9755"] Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.706903 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.717426 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m9755"] Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.862778 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlzdz\" (UniqueName: \"kubernetes.io/projected/5c0788a7-7426-4545-8426-1170b75287d7-kube-api-access-wlzdz\") pod \"community-operators-m9755\" (UID: \"5c0788a7-7426-4545-8426-1170b75287d7\") " pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.862855 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c0788a7-7426-4545-8426-1170b75287d7-utilities\") pod \"community-operators-m9755\" (UID: \"5c0788a7-7426-4545-8426-1170b75287d7\") " pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.862934 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/5c0788a7-7426-4545-8426-1170b75287d7-catalog-content\") pod \"community-operators-m9755\" (UID: \"5c0788a7-7426-4545-8426-1170b75287d7\") " pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.964435 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlzdz\" (UniqueName: \"kubernetes.io/projected/5c0788a7-7426-4545-8426-1170b75287d7-kube-api-access-wlzdz\") pod \"community-operators-m9755\" (UID: \"5c0788a7-7426-4545-8426-1170b75287d7\") " pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.964508 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c0788a7-7426-4545-8426-1170b75287d7-utilities\") pod \"community-operators-m9755\" (UID: \"5c0788a7-7426-4545-8426-1170b75287d7\") " pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.964544 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c0788a7-7426-4545-8426-1170b75287d7-catalog-content\") pod \"community-operators-m9755\" (UID: \"5c0788a7-7426-4545-8426-1170b75287d7\") " pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.965121 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c0788a7-7426-4545-8426-1170b75287d7-utilities\") pod \"community-operators-m9755\" (UID: \"5c0788a7-7426-4545-8426-1170b75287d7\") " pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.965163 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5c0788a7-7426-4545-8426-1170b75287d7-catalog-content\") pod \"community-operators-m9755\" (UID: \"5c0788a7-7426-4545-8426-1170b75287d7\") " pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:38 crc kubenswrapper[4778]: I0318 09:17:38.989082 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlzdz\" (UniqueName: \"kubernetes.io/projected/5c0788a7-7426-4545-8426-1170b75287d7-kube-api-access-wlzdz\") pod \"community-operators-m9755\" (UID: \"5c0788a7-7426-4545-8426-1170b75287d7\") " pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.028420 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.228301 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdvpq" event={"ID":"8418cdea-ff67-4e52-acf8-39176b7f0cb6","Type":"ContainerStarted","Data":"6c18cf54603d462d33eb7e65a8a9ab62b57c04dac7a3bdfc08b55318808fd8ff"} Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.319090 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m9755"] Mar 18 09:17:39 crc kubenswrapper[4778]: W0318 09:17:39.345685 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c0788a7_7426_4545_8426_1170b75287d7.slice/crio-b4986a5669e363b7c8dbe637152b9e988c9b8b567af7a16fb425dd9d24fbd246 WatchSource:0}: Error finding container b4986a5669e363b7c8dbe637152b9e988c9b8b567af7a16fb425dd9d24fbd246: Status 404 returned error can't find the container with id b4986a5669e363b7c8dbe637152b9e988c9b8b567af7a16fb425dd9d24fbd246 Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.490575 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.575505 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2416fdd2-138d-4320-8ff6-47f621e093a9-bundle\") pod \"2416fdd2-138d-4320-8ff6-47f621e093a9\" (UID: \"2416fdd2-138d-4320-8ff6-47f621e093a9\") " Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.575566 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2416fdd2-138d-4320-8ff6-47f621e093a9-util\") pod \"2416fdd2-138d-4320-8ff6-47f621e093a9\" (UID: \"2416fdd2-138d-4320-8ff6-47f621e093a9\") " Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.575624 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh6hm\" (UniqueName: \"kubernetes.io/projected/2416fdd2-138d-4320-8ff6-47f621e093a9-kube-api-access-hh6hm\") pod \"2416fdd2-138d-4320-8ff6-47f621e093a9\" (UID: \"2416fdd2-138d-4320-8ff6-47f621e093a9\") " Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.576623 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2416fdd2-138d-4320-8ff6-47f621e093a9-bundle" (OuterVolumeSpecName: "bundle") pod "2416fdd2-138d-4320-8ff6-47f621e093a9" (UID: "2416fdd2-138d-4320-8ff6-47f621e093a9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.581777 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2416fdd2-138d-4320-8ff6-47f621e093a9-kube-api-access-hh6hm" (OuterVolumeSpecName: "kube-api-access-hh6hm") pod "2416fdd2-138d-4320-8ff6-47f621e093a9" (UID: "2416fdd2-138d-4320-8ff6-47f621e093a9"). InnerVolumeSpecName "kube-api-access-hh6hm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.677312 4778 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2416fdd2-138d-4320-8ff6-47f621e093a9-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.677373 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh6hm\" (UniqueName: \"kubernetes.io/projected/2416fdd2-138d-4320-8ff6-47f621e093a9-kube-api-access-hh6hm\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.847308 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2416fdd2-138d-4320-8ff6-47f621e093a9-util" (OuterVolumeSpecName: "util") pod "2416fdd2-138d-4320-8ff6-47f621e093a9" (UID: "2416fdd2-138d-4320-8ff6-47f621e093a9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:17:39 crc kubenswrapper[4778]: I0318 09:17:39.880964 4778 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2416fdd2-138d-4320-8ff6-47f621e093a9-util\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:40 crc kubenswrapper[4778]: I0318 09:17:40.236866 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" event={"ID":"2416fdd2-138d-4320-8ff6-47f621e093a9","Type":"ContainerDied","Data":"4af378aa95ab5264d0f168d006956f0deb2cbc8707b67b859997751f0ca58b65"} Mar 18 09:17:40 crc kubenswrapper[4778]: I0318 09:17:40.236933 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4af378aa95ab5264d0f168d006956f0deb2cbc8707b67b859997751f0ca58b65" Mar 18 09:17:40 crc kubenswrapper[4778]: I0318 09:17:40.237046 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl" Mar 18 09:17:40 crc kubenswrapper[4778]: I0318 09:17:40.239466 4778 generic.go:334] "Generic (PLEG): container finished" podID="5c0788a7-7426-4545-8426-1170b75287d7" containerID="92fe5901b0aca7f6a3ebd767beba9a53c2021f0beb5e9b587ad8a027aeada4c3" exitCode=0 Mar 18 09:17:40 crc kubenswrapper[4778]: I0318 09:17:40.239542 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9755" event={"ID":"5c0788a7-7426-4545-8426-1170b75287d7","Type":"ContainerDied","Data":"92fe5901b0aca7f6a3ebd767beba9a53c2021f0beb5e9b587ad8a027aeada4c3"} Mar 18 09:17:40 crc kubenswrapper[4778]: I0318 09:17:40.239566 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9755" event={"ID":"5c0788a7-7426-4545-8426-1170b75287d7","Type":"ContainerStarted","Data":"b4986a5669e363b7c8dbe637152b9e988c9b8b567af7a16fb425dd9d24fbd246"} Mar 18 09:17:40 crc kubenswrapper[4778]: I0318 09:17:40.243847 4778 generic.go:334] "Generic (PLEG): container finished" podID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" containerID="6c18cf54603d462d33eb7e65a8a9ab62b57c04dac7a3bdfc08b55318808fd8ff" exitCode=0 Mar 18 09:17:40 crc kubenswrapper[4778]: I0318 09:17:40.243890 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdvpq" event={"ID":"8418cdea-ff67-4e52-acf8-39176b7f0cb6","Type":"ContainerDied","Data":"6c18cf54603d462d33eb7e65a8a9ab62b57c04dac7a3bdfc08b55318808fd8ff"} Mar 18 09:17:41 crc kubenswrapper[4778]: I0318 09:17:41.255385 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdvpq" event={"ID":"8418cdea-ff67-4e52-acf8-39176b7f0cb6","Type":"ContainerStarted","Data":"8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e"} Mar 18 09:17:41 crc kubenswrapper[4778]: I0318 09:17:41.283651 4778 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hdvpq" podStartSLOduration=2.68115167 podStartE2EDuration="6.283621536s" podCreationTimestamp="2026-03-18 09:17:35 +0000 UTC" firstStartedPulling="2026-03-18 09:17:37.16851597 +0000 UTC m=+923.743260800" lastFinishedPulling="2026-03-18 09:17:40.770985786 +0000 UTC m=+927.345730666" observedRunningTime="2026-03-18 09:17:41.280391828 +0000 UTC m=+927.855136668" watchObservedRunningTime="2026-03-18 09:17:41.283621536 +0000 UTC m=+927.858366416" Mar 18 09:17:43 crc kubenswrapper[4778]: I0318 09:17:43.270803 4778 generic.go:334] "Generic (PLEG): container finished" podID="5c0788a7-7426-4545-8426-1170b75287d7" containerID="f08dc64e887f7d353c1fc7e71e4a22ef60d30f4e79ac6175a3b59c6d9cbf86f1" exitCode=0 Mar 18 09:17:43 crc kubenswrapper[4778]: I0318 09:17:43.272094 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9755" event={"ID":"5c0788a7-7426-4545-8426-1170b75287d7","Type":"ContainerDied","Data":"f08dc64e887f7d353c1fc7e71e4a22ef60d30f4e79ac6175a3b59c6d9cbf86f1"} Mar 18 09:17:44 crc kubenswrapper[4778]: I0318 09:17:44.281217 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9755" event={"ID":"5c0788a7-7426-4545-8426-1170b75287d7","Type":"ContainerStarted","Data":"6a2f0efcf9f152b1a4865f9e79cc97bc201491960eaf4a19d83789511f3642f9"} Mar 18 09:17:44 crc kubenswrapper[4778]: I0318 09:17:44.321637 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m9755" podStartSLOduration=2.905558879 podStartE2EDuration="6.321616533s" podCreationTimestamp="2026-03-18 09:17:38 +0000 UTC" firstStartedPulling="2026-03-18 09:17:40.241801753 +0000 UTC m=+926.816546593" lastFinishedPulling="2026-03-18 09:17:43.657859407 +0000 UTC m=+930.232604247" observedRunningTime="2026-03-18 09:17:44.317810629 +0000 UTC 
m=+930.892555459" watchObservedRunningTime="2026-03-18 09:17:44.321616533 +0000 UTC m=+930.896361373" Mar 18 09:17:45 crc kubenswrapper[4778]: I0318 09:17:45.860658 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:45 crc kubenswrapper[4778]: I0318 09:17:45.860718 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:46 crc kubenswrapper[4778]: I0318 09:17:46.913732 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hdvpq" podUID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" containerName="registry-server" probeResult="failure" output=< Mar 18 09:17:46 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 09:17:46 crc kubenswrapper[4778]: > Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.702536 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx"] Mar 18 09:17:47 crc kubenswrapper[4778]: E0318 09:17:47.702836 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2416fdd2-138d-4320-8ff6-47f621e093a9" containerName="pull" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.702861 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2416fdd2-138d-4320-8ff6-47f621e093a9" containerName="pull" Mar 18 09:17:47 crc kubenswrapper[4778]: E0318 09:17:47.702897 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2416fdd2-138d-4320-8ff6-47f621e093a9" containerName="util" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.702908 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2416fdd2-138d-4320-8ff6-47f621e093a9" containerName="util" Mar 18 09:17:47 crc kubenswrapper[4778]: E0318 09:17:47.702933 4778 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2416fdd2-138d-4320-8ff6-47f621e093a9" containerName="extract" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.702943 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2416fdd2-138d-4320-8ff6-47f621e093a9" containerName="extract" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.703072 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2416fdd2-138d-4320-8ff6-47f621e093a9" containerName="extract" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.703604 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.707495 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.707585 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-7w868" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.707654 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.707495 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.707867 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.716253 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx"] Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.790954 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/721ee07f-fded-43ab-9bb7-2e4e56c98515-apiservice-cert\") pod \"metallb-operator-controller-manager-78856dcdc4-9cltx\" (UID: \"721ee07f-fded-43ab-9bb7-2e4e56c98515\") " pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.791037 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/721ee07f-fded-43ab-9bb7-2e4e56c98515-webhook-cert\") pod \"metallb-operator-controller-manager-78856dcdc4-9cltx\" (UID: \"721ee07f-fded-43ab-9bb7-2e4e56c98515\") " pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.791062 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf848\" (UniqueName: \"kubernetes.io/projected/721ee07f-fded-43ab-9bb7-2e4e56c98515-kube-api-access-zf848\") pod \"metallb-operator-controller-manager-78856dcdc4-9cltx\" (UID: \"721ee07f-fded-43ab-9bb7-2e4e56c98515\") " pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.892658 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/721ee07f-fded-43ab-9bb7-2e4e56c98515-apiservice-cert\") pod \"metallb-operator-controller-manager-78856dcdc4-9cltx\" (UID: \"721ee07f-fded-43ab-9bb7-2e4e56c98515\") " pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.892754 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/721ee07f-fded-43ab-9bb7-2e4e56c98515-webhook-cert\") pod \"metallb-operator-controller-manager-78856dcdc4-9cltx\" (UID: \"721ee07f-fded-43ab-9bb7-2e4e56c98515\") " 
pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.892786 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf848\" (UniqueName: \"kubernetes.io/projected/721ee07f-fded-43ab-9bb7-2e4e56c98515-kube-api-access-zf848\") pod \"metallb-operator-controller-manager-78856dcdc4-9cltx\" (UID: \"721ee07f-fded-43ab-9bb7-2e4e56c98515\") " pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.901304 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/721ee07f-fded-43ab-9bb7-2e4e56c98515-apiservice-cert\") pod \"metallb-operator-controller-manager-78856dcdc4-9cltx\" (UID: \"721ee07f-fded-43ab-9bb7-2e4e56c98515\") " pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.904957 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/721ee07f-fded-43ab-9bb7-2e4e56c98515-webhook-cert\") pod \"metallb-operator-controller-manager-78856dcdc4-9cltx\" (UID: \"721ee07f-fded-43ab-9bb7-2e4e56c98515\") " pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.909163 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf848\" (UniqueName: \"kubernetes.io/projected/721ee07f-fded-43ab-9bb7-2e4e56c98515-kube-api-access-zf848\") pod \"metallb-operator-controller-manager-78856dcdc4-9cltx\" (UID: \"721ee07f-fded-43ab-9bb7-2e4e56c98515\") " pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.975109 4778 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr"] Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.976605 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.979581 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.979853 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-7897r" Mar 18 09:17:47 crc kubenswrapper[4778]: I0318 09:17:47.980692 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.004232 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr"] Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.022045 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.095323 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bggwp\" (UniqueName: \"kubernetes.io/projected/75885bb8-adce-4801-8941-75042ab330ea-kube-api-access-bggwp\") pod \"metallb-operator-webhook-server-5b499db45c-c5tcr\" (UID: \"75885bb8-adce-4801-8941-75042ab330ea\") " pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.095414 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/75885bb8-adce-4801-8941-75042ab330ea-webhook-cert\") pod \"metallb-operator-webhook-server-5b499db45c-c5tcr\" (UID: \"75885bb8-adce-4801-8941-75042ab330ea\") " pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.095452 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/75885bb8-adce-4801-8941-75042ab330ea-apiservice-cert\") pod \"metallb-operator-webhook-server-5b499db45c-c5tcr\" (UID: \"75885bb8-adce-4801-8941-75042ab330ea\") " pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.196740 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bggwp\" (UniqueName: \"kubernetes.io/projected/75885bb8-adce-4801-8941-75042ab330ea-kube-api-access-bggwp\") pod \"metallb-operator-webhook-server-5b499db45c-c5tcr\" (UID: \"75885bb8-adce-4801-8941-75042ab330ea\") " pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.196810 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/75885bb8-adce-4801-8941-75042ab330ea-webhook-cert\") pod \"metallb-operator-webhook-server-5b499db45c-c5tcr\" (UID: \"75885bb8-adce-4801-8941-75042ab330ea\") " pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.196841 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/75885bb8-adce-4801-8941-75042ab330ea-apiservice-cert\") pod \"metallb-operator-webhook-server-5b499db45c-c5tcr\" (UID: \"75885bb8-adce-4801-8941-75042ab330ea\") " pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.204271 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/75885bb8-adce-4801-8941-75042ab330ea-apiservice-cert\") pod \"metallb-operator-webhook-server-5b499db45c-c5tcr\" (UID: \"75885bb8-adce-4801-8941-75042ab330ea\") " pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.206740 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/75885bb8-adce-4801-8941-75042ab330ea-webhook-cert\") pod \"metallb-operator-webhook-server-5b499db45c-c5tcr\" (UID: \"75885bb8-adce-4801-8941-75042ab330ea\") " pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.218555 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bggwp\" (UniqueName: \"kubernetes.io/projected/75885bb8-adce-4801-8941-75042ab330ea-kube-api-access-bggwp\") pod \"metallb-operator-webhook-server-5b499db45c-c5tcr\" (UID: 
\"75885bb8-adce-4801-8941-75042ab330ea\") " pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.293264 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.353650 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx"] Mar 18 09:17:48 crc kubenswrapper[4778]: I0318 09:17:48.748404 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr"] Mar 18 09:17:48 crc kubenswrapper[4778]: W0318 09:17:48.756943 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75885bb8_adce_4801_8941_75042ab330ea.slice/crio-57866cc4dc84d8985bce93348bd1a0f9aa43fcb7e969b623cb71f83e2013cab8 WatchSource:0}: Error finding container 57866cc4dc84d8985bce93348bd1a0f9aa43fcb7e969b623cb71f83e2013cab8: Status 404 returned error can't find the container with id 57866cc4dc84d8985bce93348bd1a0f9aa43fcb7e969b623cb71f83e2013cab8 Mar 18 09:17:49 crc kubenswrapper[4778]: I0318 09:17:49.028777 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:49 crc kubenswrapper[4778]: I0318 09:17:49.028823 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:49 crc kubenswrapper[4778]: I0318 09:17:49.083125 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:49 crc kubenswrapper[4778]: I0318 09:17:49.315612 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" event={"ID":"721ee07f-fded-43ab-9bb7-2e4e56c98515","Type":"ContainerStarted","Data":"dd18d307601bf07f91a1512d1b4c89016803e2fcd9a0ede6a6fbd3ecd038c2b4"} Mar 18 09:17:49 crc kubenswrapper[4778]: I0318 09:17:49.316758 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" event={"ID":"75885bb8-adce-4801-8941-75042ab330ea","Type":"ContainerStarted","Data":"57866cc4dc84d8985bce93348bd1a0f9aa43fcb7e969b623cb71f83e2013cab8"} Mar 18 09:17:49 crc kubenswrapper[4778]: I0318 09:17:49.377807 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:51 crc kubenswrapper[4778]: I0318 09:17:51.678208 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m9755"] Mar 18 09:17:51 crc kubenswrapper[4778]: I0318 09:17:51.680068 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m9755" podUID="5c0788a7-7426-4545-8426-1170b75287d7" containerName="registry-server" containerID="cri-o://6a2f0efcf9f152b1a4865f9e79cc97bc201491960eaf4a19d83789511f3642f9" gracePeriod=2 Mar 18 09:17:52 crc kubenswrapper[4778]: I0318 09:17:52.358478 4778 generic.go:334] "Generic (PLEG): container finished" podID="5c0788a7-7426-4545-8426-1170b75287d7" containerID="6a2f0efcf9f152b1a4865f9e79cc97bc201491960eaf4a19d83789511f3642f9" exitCode=0 Mar 18 09:17:52 crc kubenswrapper[4778]: I0318 09:17:52.358697 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9755" event={"ID":"5c0788a7-7426-4545-8426-1170b75287d7","Type":"ContainerDied","Data":"6a2f0efcf9f152b1a4865f9e79cc97bc201491960eaf4a19d83789511f3642f9"} Mar 18 09:17:52 crc kubenswrapper[4778]: I0318 09:17:52.502641 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:52 crc kubenswrapper[4778]: I0318 09:17:52.570079 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlzdz\" (UniqueName: \"kubernetes.io/projected/5c0788a7-7426-4545-8426-1170b75287d7-kube-api-access-wlzdz\") pod \"5c0788a7-7426-4545-8426-1170b75287d7\" (UID: \"5c0788a7-7426-4545-8426-1170b75287d7\") " Mar 18 09:17:52 crc kubenswrapper[4778]: I0318 09:17:52.570142 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c0788a7-7426-4545-8426-1170b75287d7-catalog-content\") pod \"5c0788a7-7426-4545-8426-1170b75287d7\" (UID: \"5c0788a7-7426-4545-8426-1170b75287d7\") " Mar 18 09:17:52 crc kubenswrapper[4778]: I0318 09:17:52.570224 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c0788a7-7426-4545-8426-1170b75287d7-utilities\") pod \"5c0788a7-7426-4545-8426-1170b75287d7\" (UID: \"5c0788a7-7426-4545-8426-1170b75287d7\") " Mar 18 09:17:52 crc kubenswrapper[4778]: I0318 09:17:52.571126 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c0788a7-7426-4545-8426-1170b75287d7-utilities" (OuterVolumeSpecName: "utilities") pod "5c0788a7-7426-4545-8426-1170b75287d7" (UID: "5c0788a7-7426-4545-8426-1170b75287d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:17:52 crc kubenswrapper[4778]: I0318 09:17:52.579853 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c0788a7-7426-4545-8426-1170b75287d7-kube-api-access-wlzdz" (OuterVolumeSpecName: "kube-api-access-wlzdz") pod "5c0788a7-7426-4545-8426-1170b75287d7" (UID: "5c0788a7-7426-4545-8426-1170b75287d7"). InnerVolumeSpecName "kube-api-access-wlzdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:17:52 crc kubenswrapper[4778]: I0318 09:17:52.627551 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c0788a7-7426-4545-8426-1170b75287d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c0788a7-7426-4545-8426-1170b75287d7" (UID: "5c0788a7-7426-4545-8426-1170b75287d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:17:52 crc kubenswrapper[4778]: I0318 09:17:52.672337 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c0788a7-7426-4545-8426-1170b75287d7-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:52 crc kubenswrapper[4778]: I0318 09:17:52.672408 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlzdz\" (UniqueName: \"kubernetes.io/projected/5c0788a7-7426-4545-8426-1170b75287d7-kube-api-access-wlzdz\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:52 crc kubenswrapper[4778]: I0318 09:17:52.672424 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c0788a7-7426-4545-8426-1170b75287d7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:53 crc kubenswrapper[4778]: I0318 09:17:53.371585 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9755" event={"ID":"5c0788a7-7426-4545-8426-1170b75287d7","Type":"ContainerDied","Data":"b4986a5669e363b7c8dbe637152b9e988c9b8b567af7a16fb425dd9d24fbd246"} Mar 18 09:17:53 crc kubenswrapper[4778]: I0318 09:17:53.371657 4778 scope.go:117] "RemoveContainer" containerID="6a2f0efcf9f152b1a4865f9e79cc97bc201491960eaf4a19d83789511f3642f9" Mar 18 09:17:53 crc kubenswrapper[4778]: I0318 09:17:53.371682 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m9755" Mar 18 09:17:53 crc kubenswrapper[4778]: I0318 09:17:53.375295 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" event={"ID":"721ee07f-fded-43ab-9bb7-2e4e56c98515","Type":"ContainerStarted","Data":"537efb928acc7327e2c8e054af18ecc834e18d233ce68ca00d26a2794beeb9b5"} Mar 18 09:17:53 crc kubenswrapper[4778]: I0318 09:17:53.375623 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:17:53 crc kubenswrapper[4778]: I0318 09:17:53.398257 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" podStartSLOduration=2.451582765 podStartE2EDuration="6.39824016s" podCreationTimestamp="2026-03-18 09:17:47 +0000 UTC" firstStartedPulling="2026-03-18 09:17:48.349259555 +0000 UTC m=+934.924004395" lastFinishedPulling="2026-03-18 09:17:52.29591693 +0000 UTC m=+938.870661790" observedRunningTime="2026-03-18 09:17:53.395558017 +0000 UTC m=+939.970302867" watchObservedRunningTime="2026-03-18 09:17:53.39824016 +0000 UTC m=+939.972985000" Mar 18 09:17:53 crc kubenswrapper[4778]: I0318 09:17:53.411604 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m9755"] Mar 18 09:17:53 crc kubenswrapper[4778]: I0318 09:17:53.415275 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m9755"] Mar 18 09:17:54 crc kubenswrapper[4778]: I0318 09:17:54.197314 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c0788a7-7426-4545-8426-1170b75287d7" path="/var/lib/kubelet/pods/5c0788a7-7426-4545-8426-1170b75287d7/volumes" Mar 18 09:17:54 crc kubenswrapper[4778]: I0318 09:17:54.360424 4778 scope.go:117] "RemoveContainer" 
containerID="f08dc64e887f7d353c1fc7e71e4a22ef60d30f4e79ac6175a3b59c6d9cbf86f1" Mar 18 09:17:54 crc kubenswrapper[4778]: I0318 09:17:54.422232 4778 scope.go:117] "RemoveContainer" containerID="92fe5901b0aca7f6a3ebd767beba9a53c2021f0beb5e9b587ad8a027aeada4c3" Mar 18 09:17:55 crc kubenswrapper[4778]: I0318 09:17:55.395020 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" event={"ID":"75885bb8-adce-4801-8941-75042ab330ea","Type":"ContainerStarted","Data":"b0b09fbb86fb64d07e560329726f55daf10a4187b5d9f1ddb7228a41c2522d49"} Mar 18 09:17:55 crc kubenswrapper[4778]: I0318 09:17:55.395738 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:17:55 crc kubenswrapper[4778]: I0318 09:17:55.947467 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:55 crc kubenswrapper[4778]: I0318 09:17:55.974629 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" podStartSLOduration=3.271614058 podStartE2EDuration="8.974594102s" podCreationTimestamp="2026-03-18 09:17:47 +0000 UTC" firstStartedPulling="2026-03-18 09:17:48.759630037 +0000 UTC m=+935.334374877" lastFinishedPulling="2026-03-18 09:17:54.462610081 +0000 UTC m=+941.037354921" observedRunningTime="2026-03-18 09:17:55.423486419 +0000 UTC m=+941.998231269" watchObservedRunningTime="2026-03-18 09:17:55.974594102 +0000 UTC m=+942.549338962" Mar 18 09:17:55 crc kubenswrapper[4778]: I0318 09:17:55.999347 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:57 crc kubenswrapper[4778]: I0318 09:17:57.077442 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hdvpq"] Mar 18 
09:17:57 crc kubenswrapper[4778]: I0318 09:17:57.409600 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hdvpq" podUID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" containerName="registry-server" containerID="cri-o://8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e" gracePeriod=2 Mar 18 09:17:57 crc kubenswrapper[4778]: I0318 09:17:57.787340 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:57 crc kubenswrapper[4778]: I0318 09:17:57.854769 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8418cdea-ff67-4e52-acf8-39176b7f0cb6-utilities\") pod \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\" (UID: \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\") " Mar 18 09:17:57 crc kubenswrapper[4778]: I0318 09:17:57.854953 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8418cdea-ff67-4e52-acf8-39176b7f0cb6-catalog-content\") pod \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\" (UID: \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\") " Mar 18 09:17:57 crc kubenswrapper[4778]: I0318 09:17:57.855029 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcmnt\" (UniqueName: \"kubernetes.io/projected/8418cdea-ff67-4e52-acf8-39176b7f0cb6-kube-api-access-dcmnt\") pod \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\" (UID: \"8418cdea-ff67-4e52-acf8-39176b7f0cb6\") " Mar 18 09:17:57 crc kubenswrapper[4778]: I0318 09:17:57.856239 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8418cdea-ff67-4e52-acf8-39176b7f0cb6-utilities" (OuterVolumeSpecName: "utilities") pod "8418cdea-ff67-4e52-acf8-39176b7f0cb6" (UID: "8418cdea-ff67-4e52-acf8-39176b7f0cb6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:17:57 crc kubenswrapper[4778]: I0318 09:17:57.863177 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8418cdea-ff67-4e52-acf8-39176b7f0cb6-kube-api-access-dcmnt" (OuterVolumeSpecName: "kube-api-access-dcmnt") pod "8418cdea-ff67-4e52-acf8-39176b7f0cb6" (UID: "8418cdea-ff67-4e52-acf8-39176b7f0cb6"). InnerVolumeSpecName "kube-api-access-dcmnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:17:57 crc kubenswrapper[4778]: I0318 09:17:57.957069 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8418cdea-ff67-4e52-acf8-39176b7f0cb6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:57 crc kubenswrapper[4778]: I0318 09:17:57.957119 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcmnt\" (UniqueName: \"kubernetes.io/projected/8418cdea-ff67-4e52-acf8-39176b7f0cb6-kube-api-access-dcmnt\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.020605 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8418cdea-ff67-4e52-acf8-39176b7f0cb6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8418cdea-ff67-4e52-acf8-39176b7f0cb6" (UID: "8418cdea-ff67-4e52-acf8-39176b7f0cb6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.058757 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8418cdea-ff67-4e52-acf8-39176b7f0cb6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.421479 4778 generic.go:334] "Generic (PLEG): container finished" podID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" containerID="8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e" exitCode=0 Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.421529 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdvpq" event={"ID":"8418cdea-ff67-4e52-acf8-39176b7f0cb6","Type":"ContainerDied","Data":"8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e"} Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.421570 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hdvpq" Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.421593 4778 scope.go:117] "RemoveContainer" containerID="8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e" Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.421573 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdvpq" event={"ID":"8418cdea-ff67-4e52-acf8-39176b7f0cb6","Type":"ContainerDied","Data":"a12438818ab44535475b9d63ec863fde1df42e8be31733d20b680eaccab2c7df"} Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.451379 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hdvpq"] Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.454045 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hdvpq"] Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.457187 4778 scope.go:117] "RemoveContainer" containerID="6c18cf54603d462d33eb7e65a8a9ab62b57c04dac7a3bdfc08b55318808fd8ff" Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.482352 4778 scope.go:117] "RemoveContainer" containerID="20a8b537e7dac12b21fa49a35154f0d7c9b7d684dfba9b6c632fec89fada6543" Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.516954 4778 scope.go:117] "RemoveContainer" containerID="8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e" Mar 18 09:17:58 crc kubenswrapper[4778]: E0318 09:17:58.518643 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e\": container with ID starting with 8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e not found: ID does not exist" containerID="8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e" Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.518735 4778 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e"} err="failed to get container status \"8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e\": rpc error: code = NotFound desc = could not find container \"8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e\": container with ID starting with 8185b29933802833f1dbd141f474fc269638d436ac3edfee268129ce6326e60e not found: ID does not exist" Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.518930 4778 scope.go:117] "RemoveContainer" containerID="6c18cf54603d462d33eb7e65a8a9ab62b57c04dac7a3bdfc08b55318808fd8ff" Mar 18 09:17:58 crc kubenswrapper[4778]: E0318 09:17:58.519640 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c18cf54603d462d33eb7e65a8a9ab62b57c04dac7a3bdfc08b55318808fd8ff\": container with ID starting with 6c18cf54603d462d33eb7e65a8a9ab62b57c04dac7a3bdfc08b55318808fd8ff not found: ID does not exist" containerID="6c18cf54603d462d33eb7e65a8a9ab62b57c04dac7a3bdfc08b55318808fd8ff" Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.519688 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c18cf54603d462d33eb7e65a8a9ab62b57c04dac7a3bdfc08b55318808fd8ff"} err="failed to get container status \"6c18cf54603d462d33eb7e65a8a9ab62b57c04dac7a3bdfc08b55318808fd8ff\": rpc error: code = NotFound desc = could not find container \"6c18cf54603d462d33eb7e65a8a9ab62b57c04dac7a3bdfc08b55318808fd8ff\": container with ID starting with 6c18cf54603d462d33eb7e65a8a9ab62b57c04dac7a3bdfc08b55318808fd8ff not found: ID does not exist" Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.519716 4778 scope.go:117] "RemoveContainer" containerID="20a8b537e7dac12b21fa49a35154f0d7c9b7d684dfba9b6c632fec89fada6543" Mar 18 09:17:58 crc kubenswrapper[4778]: E0318 
09:17:58.520036 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20a8b537e7dac12b21fa49a35154f0d7c9b7d684dfba9b6c632fec89fada6543\": container with ID starting with 20a8b537e7dac12b21fa49a35154f0d7c9b7d684dfba9b6c632fec89fada6543 not found: ID does not exist" containerID="20a8b537e7dac12b21fa49a35154f0d7c9b7d684dfba9b6c632fec89fada6543" Mar 18 09:17:58 crc kubenswrapper[4778]: I0318 09:17:58.520055 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20a8b537e7dac12b21fa49a35154f0d7c9b7d684dfba9b6c632fec89fada6543"} err="failed to get container status \"20a8b537e7dac12b21fa49a35154f0d7c9b7d684dfba9b6c632fec89fada6543\": rpc error: code = NotFound desc = could not find container \"20a8b537e7dac12b21fa49a35154f0d7c9b7d684dfba9b6c632fec89fada6543\": container with ID starting with 20a8b537e7dac12b21fa49a35154f0d7c9b7d684dfba9b6c632fec89fada6543 not found: ID does not exist" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.144831 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563758-zslfz"] Mar 18 09:18:00 crc kubenswrapper[4778]: E0318 09:18:00.145801 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" containerName="extract-utilities" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.145822 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" containerName="extract-utilities" Mar 18 09:18:00 crc kubenswrapper[4778]: E0318 09:18:00.145842 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" containerName="extract-content" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.145851 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" containerName="extract-content" Mar 18 09:18:00 crc 
kubenswrapper[4778]: E0318 09:18:00.145868 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c0788a7-7426-4545-8426-1170b75287d7" containerName="registry-server" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.145877 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c0788a7-7426-4545-8426-1170b75287d7" containerName="registry-server" Mar 18 09:18:00 crc kubenswrapper[4778]: E0318 09:18:00.145899 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c0788a7-7426-4545-8426-1170b75287d7" containerName="extract-utilities" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.145907 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c0788a7-7426-4545-8426-1170b75287d7" containerName="extract-utilities" Mar 18 09:18:00 crc kubenswrapper[4778]: E0318 09:18:00.145926 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" containerName="registry-server" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.145935 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" containerName="registry-server" Mar 18 09:18:00 crc kubenswrapper[4778]: E0318 09:18:00.145945 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c0788a7-7426-4545-8426-1170b75287d7" containerName="extract-content" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.145953 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c0788a7-7426-4545-8426-1170b75287d7" containerName="extract-content" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.146090 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c0788a7-7426-4545-8426-1170b75287d7" containerName="registry-server" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.146104 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" containerName="registry-server" Mar 18 09:18:00 crc 
kubenswrapper[4778]: I0318 09:18:00.146725 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563758-zslfz" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.148550 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.152874 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.155607 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.161689 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563758-zslfz"] Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.186529 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtggs\" (UniqueName: \"kubernetes.io/projected/c70faab0-9f07-4452-a873-bcb59d28b7a8-kube-api-access-jtggs\") pod \"auto-csr-approver-29563758-zslfz\" (UID: \"c70faab0-9f07-4452-a873-bcb59d28b7a8\") " pod="openshift-infra/auto-csr-approver-29563758-zslfz" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.196115 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8418cdea-ff67-4e52-acf8-39176b7f0cb6" path="/var/lib/kubelet/pods/8418cdea-ff67-4e52-acf8-39176b7f0cb6/volumes" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.288514 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtggs\" (UniqueName: \"kubernetes.io/projected/c70faab0-9f07-4452-a873-bcb59d28b7a8-kube-api-access-jtggs\") pod \"auto-csr-approver-29563758-zslfz\" (UID: \"c70faab0-9f07-4452-a873-bcb59d28b7a8\") " pod="openshift-infra/auto-csr-approver-29563758-zslfz" Mar 18 09:18:00 
crc kubenswrapper[4778]: I0318 09:18:00.314532 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtggs\" (UniqueName: \"kubernetes.io/projected/c70faab0-9f07-4452-a873-bcb59d28b7a8-kube-api-access-jtggs\") pod \"auto-csr-approver-29563758-zslfz\" (UID: \"c70faab0-9f07-4452-a873-bcb59d28b7a8\") " pod="openshift-infra/auto-csr-approver-29563758-zslfz" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.468683 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563758-zslfz" Mar 18 09:18:00 crc kubenswrapper[4778]: I0318 09:18:00.819917 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563758-zslfz"] Mar 18 09:18:01 crc kubenswrapper[4778]: I0318 09:18:01.446415 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563758-zslfz" event={"ID":"c70faab0-9f07-4452-a873-bcb59d28b7a8","Type":"ContainerStarted","Data":"43a583205e8901d1305cfd0347ba2c715870319698b6b5c1fcb4dff75dd395ea"} Mar 18 09:18:02 crc kubenswrapper[4778]: I0318 09:18:02.454431 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563758-zslfz" event={"ID":"c70faab0-9f07-4452-a873-bcb59d28b7a8","Type":"ContainerStarted","Data":"392df7bab826882632f84664710c42da5399ea661a1dbfcd15aec0ad5d248553"} Mar 18 09:18:02 crc kubenswrapper[4778]: I0318 09:18:02.477940 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563758-zslfz" podStartSLOduration=1.406547748 podStartE2EDuration="2.47791519s" podCreationTimestamp="2026-03-18 09:18:00 +0000 UTC" firstStartedPulling="2026-03-18 09:18:00.826367699 +0000 UTC m=+947.401112539" lastFinishedPulling="2026-03-18 09:18:01.897735131 +0000 UTC m=+948.472479981" observedRunningTime="2026-03-18 09:18:02.471600898 +0000 UTC m=+949.046345748" watchObservedRunningTime="2026-03-18 
09:18:02.47791519 +0000 UTC m=+949.052660020" Mar 18 09:18:03 crc kubenswrapper[4778]: I0318 09:18:03.463710 4778 generic.go:334] "Generic (PLEG): container finished" podID="c70faab0-9f07-4452-a873-bcb59d28b7a8" containerID="392df7bab826882632f84664710c42da5399ea661a1dbfcd15aec0ad5d248553" exitCode=0 Mar 18 09:18:03 crc kubenswrapper[4778]: I0318 09:18:03.463840 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563758-zslfz" event={"ID":"c70faab0-9f07-4452-a873-bcb59d28b7a8","Type":"ContainerDied","Data":"392df7bab826882632f84664710c42da5399ea661a1dbfcd15aec0ad5d248553"} Mar 18 09:18:04 crc kubenswrapper[4778]: I0318 09:18:04.764620 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563758-zslfz" Mar 18 09:18:04 crc kubenswrapper[4778]: I0318 09:18:04.851083 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtggs\" (UniqueName: \"kubernetes.io/projected/c70faab0-9f07-4452-a873-bcb59d28b7a8-kube-api-access-jtggs\") pod \"c70faab0-9f07-4452-a873-bcb59d28b7a8\" (UID: \"c70faab0-9f07-4452-a873-bcb59d28b7a8\") " Mar 18 09:18:04 crc kubenswrapper[4778]: I0318 09:18:04.875480 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c70faab0-9f07-4452-a873-bcb59d28b7a8-kube-api-access-jtggs" (OuterVolumeSpecName: "kube-api-access-jtggs") pod "c70faab0-9f07-4452-a873-bcb59d28b7a8" (UID: "c70faab0-9f07-4452-a873-bcb59d28b7a8"). InnerVolumeSpecName "kube-api-access-jtggs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:18:04 crc kubenswrapper[4778]: I0318 09:18:04.953102 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtggs\" (UniqueName: \"kubernetes.io/projected/c70faab0-9f07-4452-a873-bcb59d28b7a8-kube-api-access-jtggs\") on node \"crc\" DevicePath \"\"" Mar 18 09:18:05 crc kubenswrapper[4778]: I0318 09:18:05.478531 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563758-zslfz" event={"ID":"c70faab0-9f07-4452-a873-bcb59d28b7a8","Type":"ContainerDied","Data":"43a583205e8901d1305cfd0347ba2c715870319698b6b5c1fcb4dff75dd395ea"} Mar 18 09:18:05 crc kubenswrapper[4778]: I0318 09:18:05.478575 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43a583205e8901d1305cfd0347ba2c715870319698b6b5c1fcb4dff75dd395ea" Mar 18 09:18:05 crc kubenswrapper[4778]: I0318 09:18:05.478629 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563758-zslfz" Mar 18 09:18:05 crc kubenswrapper[4778]: I0318 09:18:05.528672 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563752-h72kq"] Mar 18 09:18:05 crc kubenswrapper[4778]: I0318 09:18:05.531961 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563752-h72kq"] Mar 18 09:18:06 crc kubenswrapper[4778]: I0318 09:18:06.197268 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57e614a6-a447-41bc-b7c8-034610af7d59" path="/var/lib/kubelet/pods/57e614a6-a447-41bc-b7c8-034610af7d59/volumes" Mar 18 09:18:08 crc kubenswrapper[4778]: I0318 09:18:08.299618 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5b499db45c-c5tcr" Mar 18 09:18:19 crc kubenswrapper[4778]: I0318 09:18:19.849317 4778 scope.go:117] "RemoveContainer" 
containerID="6614d11a5de4463d54d3a021b1144b715f14eddffa1ef95f83bb20fa8f58ca90" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.025846 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-78856dcdc4-9cltx" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.880266 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-g2q8m"] Mar 18 09:18:28 crc kubenswrapper[4778]: E0318 09:18:28.880551 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c70faab0-9f07-4452-a873-bcb59d28b7a8" containerName="oc" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.880570 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c70faab0-9f07-4452-a873-bcb59d28b7a8" containerName="oc" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.880743 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c70faab0-9f07-4452-a873-bcb59d28b7a8" containerName="oc" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.883474 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.886607 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.886955 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.887080 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-25ntf" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.888621 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv"] Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.889489 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.895392 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.900948 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv"] Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.992280 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-wd69x"] Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.993184 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-wd69x" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.997565 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.997681 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.997763 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-rs6jl" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.997824 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.998453 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-reloader\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.998500 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" 
(UniqueName: \"kubernetes.io/empty-dir/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-metrics\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.998518 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f18e9f0-b3eb-440a-b035-ed8256df5ed9-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jrtjv\" (UID: \"0f18e9f0-b3eb-440a-b035-ed8256df5ed9\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.998545 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-metrics-certs\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.998575 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-frr-sockets\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.998590 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsdtt\" (UniqueName: \"kubernetes.io/projected/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-kube-api-access-zsdtt\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.998617 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-frr-conf\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.998643 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-frr-startup\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:28 crc kubenswrapper[4778]: I0318 09:18:28.998658 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k8jr\" (UniqueName: \"kubernetes.io/projected/0f18e9f0-b3eb-440a-b035-ed8256df5ed9-kube-api-access-7k8jr\") pod \"frr-k8s-webhook-server-bcc4b6f68-jrtjv\" (UID: \"0f18e9f0-b3eb-440a-b035-ed8256df5ed9\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.027162 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-sv9kd"] Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.028005 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.033438 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.041014 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-sv9kd"] Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.099896 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1c97662e-d673-42c1-a6ad-75865ba2b8b6-metallb-excludel2\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.099973 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dn57\" (UniqueName: \"kubernetes.io/projected/1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c-kube-api-access-5dn57\") pod \"controller-7bb4cc7c98-sv9kd\" (UID: \"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c\") " pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.100000 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-metrics\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.100022 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f18e9f0-b3eb-440a-b035-ed8256df5ed9-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jrtjv\" (UID: \"0f18e9f0-b3eb-440a-b035-ed8256df5ed9\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" Mar 18 09:18:29 crc 
kubenswrapper[4778]: I0318 09:18:29.100245 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-metrics-certs\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.100290 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-metrics-certs\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.100387 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-frr-sockets\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.100405 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsdtt\" (UniqueName: \"kubernetes.io/projected/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-kube-api-access-zsdtt\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.100500 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-frr-conf\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.100592 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-memberlist\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.100622 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xj2p\" (UniqueName: \"kubernetes.io/projected/1c97662e-d673-42c1-a6ad-75865ba2b8b6-kube-api-access-2xj2p\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.100900 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-frr-startup\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.100924 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k8jr\" (UniqueName: \"kubernetes.io/projected/0f18e9f0-b3eb-440a-b035-ed8256df5ed9-kube-api-access-7k8jr\") pod \"frr-k8s-webhook-server-bcc4b6f68-jrtjv\" (UID: \"0f18e9f0-b3eb-440a-b035-ed8256df5ed9\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.101716 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c-metrics-certs\") pod \"controller-7bb4cc7c98-sv9kd\" (UID: \"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c\") " pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.100859 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-frr-conf\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.101378 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-frr-sockets\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.101674 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-frr-startup\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.101819 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-reloader\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.101836 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c-cert\") pod \"controller-7bb4cc7c98-sv9kd\" (UID: \"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c\") " pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.102030 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-reloader\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc 
kubenswrapper[4778]: I0318 09:18:29.100704 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-metrics\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.105466 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f18e9f0-b3eb-440a-b035-ed8256df5ed9-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jrtjv\" (UID: \"0f18e9f0-b3eb-440a-b035-ed8256df5ed9\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.118887 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-metrics-certs\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.122941 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsdtt\" (UniqueName: \"kubernetes.io/projected/5efed87b-ad9c-4703-b3c4-2d6ab8d0883b-kube-api-access-zsdtt\") pod \"frr-k8s-g2q8m\" (UID: \"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b\") " pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.130952 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k8jr\" (UniqueName: \"kubernetes.io/projected/0f18e9f0-b3eb-440a-b035-ed8256df5ed9-kube-api-access-7k8jr\") pod \"frr-k8s-webhook-server-bcc4b6f68-jrtjv\" (UID: \"0f18e9f0-b3eb-440a-b035-ed8256df5ed9\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.203101 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"memberlist\" (UniqueName: \"kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-memberlist\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.203162 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xj2p\" (UniqueName: \"kubernetes.io/projected/1c97662e-d673-42c1-a6ad-75865ba2b8b6-kube-api-access-2xj2p\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.203189 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c-metrics-certs\") pod \"controller-7bb4cc7c98-sv9kd\" (UID: \"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c\") " pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.203247 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c-cert\") pod \"controller-7bb4cc7c98-sv9kd\" (UID: \"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c\") " pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:29 crc kubenswrapper[4778]: E0318 09:18:29.203273 4778 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 18 09:18:29 crc kubenswrapper[4778]: E0318 09:18:29.203324 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-memberlist podName:1c97662e-d673-42c1-a6ad-75865ba2b8b6 nodeName:}" failed. No retries permitted until 2026-03-18 09:18:29.703309531 +0000 UTC m=+976.278054371 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-memberlist") pod "speaker-wd69x" (UID: "1c97662e-d673-42c1-a6ad-75865ba2b8b6") : secret "metallb-memberlist" not found Mar 18 09:18:29 crc kubenswrapper[4778]: E0318 09:18:29.203415 4778 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 18 09:18:29 crc kubenswrapper[4778]: E0318 09:18:29.203501 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c-metrics-certs podName:1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c nodeName:}" failed. No retries permitted until 2026-03-18 09:18:29.703479235 +0000 UTC m=+976.278224155 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c-metrics-certs") pod "controller-7bb4cc7c98-sv9kd" (UID: "1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c") : secret "controller-certs-secret" not found Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.203275 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1c97662e-d673-42c1-a6ad-75865ba2b8b6-metallb-excludel2\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.203594 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dn57\" (UniqueName: \"kubernetes.io/projected/1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c-kube-api-access-5dn57\") pod \"controller-7bb4cc7c98-sv9kd\" (UID: \"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c\") " pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.203668 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-metrics-certs\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: E0318 09:18:29.203879 4778 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 18 09:18:29 crc kubenswrapper[4778]: E0318 09:18:29.203918 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-metrics-certs podName:1c97662e-d673-42c1-a6ad-75865ba2b8b6 nodeName:}" failed. No retries permitted until 2026-03-18 09:18:29.703907647 +0000 UTC m=+976.278652497 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-metrics-certs") pod "speaker-wd69x" (UID: "1c97662e-d673-42c1-a6ad-75865ba2b8b6") : secret "speaker-certs-secret" not found Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.204027 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1c97662e-d673-42c1-a6ad-75865ba2b8b6-metallb-excludel2\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.205459 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.217760 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c-cert\") pod \"controller-7bb4cc7c98-sv9kd\" (UID: \"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c\") " pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.221738 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xj2p\" (UniqueName: \"kubernetes.io/projected/1c97662e-d673-42c1-a6ad-75865ba2b8b6-kube-api-access-2xj2p\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.222337 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dn57\" (UniqueName: \"kubernetes.io/projected/1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c-kube-api-access-5dn57\") pod \"controller-7bb4cc7c98-sv9kd\" (UID: \"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c\") " pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.264748 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.286921 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.669758 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g2q8m" event={"ID":"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b","Type":"ContainerStarted","Data":"1656963afb085398b95b20b2c8ad4561a38a374e4b575785195a5cb57a6b4b19"} Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.706913 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv"] Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.709966 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-memberlist\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.710009 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c-metrics-certs\") pod \"controller-7bb4cc7c98-sv9kd\" (UID: \"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c\") " pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.710060 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-metrics-certs\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: E0318 09:18:29.710328 4778 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 18 09:18:29 crc kubenswrapper[4778]: E0318 09:18:29.710511 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-memberlist podName:1c97662e-d673-42c1-a6ad-75865ba2b8b6 nodeName:}" failed. No retries permitted until 2026-03-18 09:18:30.71049002 +0000 UTC m=+977.285234880 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-memberlist") pod "speaker-wd69x" (UID: "1c97662e-d673-42c1-a6ad-75865ba2b8b6") : secret "metallb-memberlist" not found Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.716127 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-metrics-certs\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.716180 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c-metrics-certs\") pod \"controller-7bb4cc7c98-sv9kd\" (UID: \"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c\") " pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:29 crc kubenswrapper[4778]: W0318 09:18:29.717776 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f18e9f0_b3eb_440a_b035_ed8256df5ed9.slice/crio-fb0879a587988f0d70b160d9cd7a1d7dfe060ccfd641ea41bdaae0f43d8649fa WatchSource:0}: Error finding container fb0879a587988f0d70b160d9cd7a1d7dfe060ccfd641ea41bdaae0f43d8649fa: Status 404 returned error can't find the container with id fb0879a587988f0d70b160d9cd7a1d7dfe060ccfd641ea41bdaae0f43d8649fa Mar 18 09:18:29 crc kubenswrapper[4778]: I0318 09:18:29.943856 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:30 crc kubenswrapper[4778]: I0318 09:18:30.242171 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-sv9kd"] Mar 18 09:18:30 crc kubenswrapper[4778]: W0318 09:18:30.269728 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ddcd9d2_a5d0_4773_93f5_8eb9c0fff72c.slice/crio-5083ad7f107da57ef1b66367d5702cd70cf0a3452dafb15d7b844ccbd61a363b WatchSource:0}: Error finding container 5083ad7f107da57ef1b66367d5702cd70cf0a3452dafb15d7b844ccbd61a363b: Status 404 returned error can't find the container with id 5083ad7f107da57ef1b66367d5702cd70cf0a3452dafb15d7b844ccbd61a363b Mar 18 09:18:30 crc kubenswrapper[4778]: I0318 09:18:30.676011 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-sv9kd" event={"ID":"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c","Type":"ContainerStarted","Data":"e7e1976f9564b551d2caf4ad3d741907575b2639672bc6fb228e75a6af818ec4"} Mar 18 09:18:30 crc kubenswrapper[4778]: I0318 09:18:30.676062 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-sv9kd" event={"ID":"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c","Type":"ContainerStarted","Data":"6272aa5dc6d96b92cfb8bbcc1298d1985e685e5cd8a966efb1b4fc4defb883ca"} Mar 18 09:18:30 crc kubenswrapper[4778]: I0318 09:18:30.676076 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-sv9kd" event={"ID":"1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c","Type":"ContainerStarted","Data":"5083ad7f107da57ef1b66367d5702cd70cf0a3452dafb15d7b844ccbd61a363b"} Mar 18 09:18:30 crc kubenswrapper[4778]: I0318 09:18:30.676836 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:30 crc kubenswrapper[4778]: I0318 09:18:30.677888 4778 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" event={"ID":"0f18e9f0-b3eb-440a-b035-ed8256df5ed9","Type":"ContainerStarted","Data":"fb0879a587988f0d70b160d9cd7a1d7dfe060ccfd641ea41bdaae0f43d8649fa"} Mar 18 09:18:30 crc kubenswrapper[4778]: I0318 09:18:30.696006 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-sv9kd" podStartSLOduration=1.6959874080000001 podStartE2EDuration="1.695987408s" podCreationTimestamp="2026-03-18 09:18:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:18:30.694666662 +0000 UTC m=+977.269411512" watchObservedRunningTime="2026-03-18 09:18:30.695987408 +0000 UTC m=+977.270732248" Mar 18 09:18:30 crc kubenswrapper[4778]: I0318 09:18:30.723025 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-memberlist\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:30 crc kubenswrapper[4778]: I0318 09:18:30.741595 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1c97662e-d673-42c1-a6ad-75865ba2b8b6-memberlist\") pod \"speaker-wd69x\" (UID: \"1c97662e-d673-42c1-a6ad-75865ba2b8b6\") " pod="metallb-system/speaker-wd69x" Mar 18 09:18:30 crc kubenswrapper[4778]: I0318 09:18:30.812992 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-wd69x" Mar 18 09:18:31 crc kubenswrapper[4778]: I0318 09:18:31.694491 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wd69x" event={"ID":"1c97662e-d673-42c1-a6ad-75865ba2b8b6","Type":"ContainerStarted","Data":"32ec92ba8d5cf30c9d775a8c04df502ef02fc21a6beae70d88527f6e68468e89"} Mar 18 09:18:31 crc kubenswrapper[4778]: I0318 09:18:31.694562 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wd69x" event={"ID":"1c97662e-d673-42c1-a6ad-75865ba2b8b6","Type":"ContainerStarted","Data":"281c99793b07759f67e5e3a4adb10852e12a93e9ff5202a70926999ecff20cc6"} Mar 18 09:18:31 crc kubenswrapper[4778]: I0318 09:18:31.694573 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wd69x" event={"ID":"1c97662e-d673-42c1-a6ad-75865ba2b8b6","Type":"ContainerStarted","Data":"41bbda6b48a7fd0c4eaede34ee7f01b13522e0aa9640c24765b5205f525f20df"} Mar 18 09:18:31 crc kubenswrapper[4778]: I0318 09:18:31.694816 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-wd69x" Mar 18 09:18:31 crc kubenswrapper[4778]: I0318 09:18:31.715758 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-wd69x" podStartSLOduration=3.715735354 podStartE2EDuration="3.715735354s" podCreationTimestamp="2026-03-18 09:18:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:18:31.711884118 +0000 UTC m=+978.286628958" watchObservedRunningTime="2026-03-18 09:18:31.715735354 +0000 UTC m=+978.290480194" Mar 18 09:18:37 crc kubenswrapper[4778]: I0318 09:18:37.740882 4778 generic.go:334] "Generic (PLEG): container finished" podID="5efed87b-ad9c-4703-b3c4-2d6ab8d0883b" containerID="4028e6c25bc19e614400f65d49c2b1ededcd6352f700b801dc89da8753b3257e" exitCode=0 Mar 18 09:18:37 crc kubenswrapper[4778]: 
I0318 09:18:37.740979 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g2q8m" event={"ID":"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b","Type":"ContainerDied","Data":"4028e6c25bc19e614400f65d49c2b1ededcd6352f700b801dc89da8753b3257e"} Mar 18 09:18:37 crc kubenswrapper[4778]: I0318 09:18:37.751004 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" event={"ID":"0f18e9f0-b3eb-440a-b035-ed8256df5ed9","Type":"ContainerStarted","Data":"33d80fcbe768658eea05319a17b88b75262d3e15c089b226a8d01e293f508f6d"} Mar 18 09:18:37 crc kubenswrapper[4778]: I0318 09:18:37.751501 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" Mar 18 09:18:37 crc kubenswrapper[4778]: I0318 09:18:37.791433 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" podStartSLOduration=2.5618729609999997 podStartE2EDuration="9.791406685s" podCreationTimestamp="2026-03-18 09:18:28 +0000 UTC" firstStartedPulling="2026-03-18 09:18:29.719466565 +0000 UTC m=+976.294211415" lastFinishedPulling="2026-03-18 09:18:36.949000289 +0000 UTC m=+983.523745139" observedRunningTime="2026-03-18 09:18:37.786949273 +0000 UTC m=+984.361694153" watchObservedRunningTime="2026-03-18 09:18:37.791406685 +0000 UTC m=+984.366151525" Mar 18 09:18:38 crc kubenswrapper[4778]: I0318 09:18:38.761961 4778 generic.go:334] "Generic (PLEG): container finished" podID="5efed87b-ad9c-4703-b3c4-2d6ab8d0883b" containerID="884d2e9957c49e96ac45cd1886b7ca9e7fcaa2bd9fbd23ff2ecc70afad84b846" exitCode=0 Mar 18 09:18:38 crc kubenswrapper[4778]: I0318 09:18:38.762059 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g2q8m" event={"ID":"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b","Type":"ContainerDied","Data":"884d2e9957c49e96ac45cd1886b7ca9e7fcaa2bd9fbd23ff2ecc70afad84b846"} Mar 18 
09:18:39 crc kubenswrapper[4778]: I0318 09:18:39.772753 4778 generic.go:334] "Generic (PLEG): container finished" podID="5efed87b-ad9c-4703-b3c4-2d6ab8d0883b" containerID="c10bc7206eb6bd274631b482b8196498e8e6de8ffa09c8a6daef2871d80ae65f" exitCode=0 Mar 18 09:18:39 crc kubenswrapper[4778]: I0318 09:18:39.772817 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g2q8m" event={"ID":"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b","Type":"ContainerDied","Data":"c10bc7206eb6bd274631b482b8196498e8e6de8ffa09c8a6daef2871d80ae65f"} Mar 18 09:18:40 crc kubenswrapper[4778]: I0318 09:18:40.788752 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g2q8m" event={"ID":"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b","Type":"ContainerStarted","Data":"3a0199f34a6190d90fb2b310b4578905a6ed7c36423fc454e725ea8982a1a481"} Mar 18 09:18:40 crc kubenswrapper[4778]: I0318 09:18:40.789148 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g2q8m" event={"ID":"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b","Type":"ContainerStarted","Data":"3a3d3ca8de6b6bd608a39bbc12994bd285b221f443de2025e2710e1cade996f4"} Mar 18 09:18:40 crc kubenswrapper[4778]: I0318 09:18:40.789160 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g2q8m" event={"ID":"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b","Type":"ContainerStarted","Data":"bfcf62580963ed5c78918d581c18981bc84a2ee2d1212aa05aafe76a6db44187"} Mar 18 09:18:40 crc kubenswrapper[4778]: I0318 09:18:40.789171 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g2q8m" event={"ID":"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b","Type":"ContainerStarted","Data":"e6717b05aba2987915fb79fbe80e7310de063f714f06090157916d206897bae9"} Mar 18 09:18:40 crc kubenswrapper[4778]: I0318 09:18:40.789180 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g2q8m" 
event={"ID":"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b","Type":"ContainerStarted","Data":"2584a5734ad79e77a7315cd8fff0259394f0fa965c0a8f63ac4ec5e7c5c410a4"} Mar 18 09:18:41 crc kubenswrapper[4778]: I0318 09:18:41.803481 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g2q8m" event={"ID":"5efed87b-ad9c-4703-b3c4-2d6ab8d0883b","Type":"ContainerStarted","Data":"89df37fc150fde5e7b3406ece065011b4e08e3652093c7c2be1ec0b316ca5010"} Mar 18 09:18:41 crc kubenswrapper[4778]: I0318 09:18:41.803699 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:41 crc kubenswrapper[4778]: I0318 09:18:41.831302 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-g2q8m" podStartSLOduration=6.246856879 podStartE2EDuration="13.831276126s" podCreationTimestamp="2026-03-18 09:18:28 +0000 UTC" firstStartedPulling="2026-03-18 09:18:29.392878035 +0000 UTC m=+975.967622875" lastFinishedPulling="2026-03-18 09:18:36.977297272 +0000 UTC m=+983.552042122" observedRunningTime="2026-03-18 09:18:41.830806013 +0000 UTC m=+988.405550913" watchObservedRunningTime="2026-03-18 09:18:41.831276126 +0000 UTC m=+988.406021006" Mar 18 09:18:44 crc kubenswrapper[4778]: I0318 09:18:44.265854 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:44 crc kubenswrapper[4778]: I0318 09:18:44.336142 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:49 crc kubenswrapper[4778]: I0318 09:18:49.268887 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-g2q8m" Mar 18 09:18:49 crc kubenswrapper[4778]: I0318 09:18:49.291635 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jrtjv" Mar 18 09:18:49 crc kubenswrapper[4778]: I0318 
09:18:49.949274 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-sv9kd" Mar 18 09:18:50 crc kubenswrapper[4778]: I0318 09:18:50.816426 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-wd69x" Mar 18 09:18:53 crc kubenswrapper[4778]: I0318 09:18:53.681826 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-wq58v"] Mar 18 09:18:53 crc kubenswrapper[4778]: I0318 09:18:53.683547 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wq58v" Mar 18 09:18:53 crc kubenswrapper[4778]: I0318 09:18:53.688342 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-bj7kp" Mar 18 09:18:53 crc kubenswrapper[4778]: I0318 09:18:53.688864 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 18 09:18:53 crc kubenswrapper[4778]: I0318 09:18:53.689240 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 18 09:18:53 crc kubenswrapper[4778]: I0318 09:18:53.729980 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wq58v"] Mar 18 09:18:53 crc kubenswrapper[4778]: I0318 09:18:53.817190 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rsch\" (UniqueName: \"kubernetes.io/projected/af88af30-254e-4fc3-a29c-a27a6c5fc237-kube-api-access-2rsch\") pod \"openstack-operator-index-wq58v\" (UID: \"af88af30-254e-4fc3-a29c-a27a6c5fc237\") " pod="openstack-operators/openstack-operator-index-wq58v" Mar 18 09:18:53 crc kubenswrapper[4778]: I0318 09:18:53.919033 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2rsch\" (UniqueName: \"kubernetes.io/projected/af88af30-254e-4fc3-a29c-a27a6c5fc237-kube-api-access-2rsch\") pod \"openstack-operator-index-wq58v\" (UID: \"af88af30-254e-4fc3-a29c-a27a6c5fc237\") " pod="openstack-operators/openstack-operator-index-wq58v" Mar 18 09:18:53 crc kubenswrapper[4778]: I0318 09:18:53.947477 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rsch\" (UniqueName: \"kubernetes.io/projected/af88af30-254e-4fc3-a29c-a27a6c5fc237-kube-api-access-2rsch\") pod \"openstack-operator-index-wq58v\" (UID: \"af88af30-254e-4fc3-a29c-a27a6c5fc237\") " pod="openstack-operators/openstack-operator-index-wq58v" Mar 18 09:18:54 crc kubenswrapper[4778]: I0318 09:18:54.008322 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wq58v" Mar 18 09:18:54 crc kubenswrapper[4778]: I0318 09:18:54.441974 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wq58v"] Mar 18 09:18:54 crc kubenswrapper[4778]: W0318 09:18:54.446576 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf88af30_254e_4fc3_a29c_a27a6c5fc237.slice/crio-00c9aec972ccc889c267a8674b92b068b7f18f6f77e302ded3904a531ffc6a65 WatchSource:0}: Error finding container 00c9aec972ccc889c267a8674b92b068b7f18f6f77e302ded3904a531ffc6a65: Status 404 returned error can't find the container with id 00c9aec972ccc889c267a8674b92b068b7f18f6f77e302ded3904a531ffc6a65 Mar 18 09:18:54 crc kubenswrapper[4778]: I0318 09:18:54.898615 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wq58v" event={"ID":"af88af30-254e-4fc3-a29c-a27a6c5fc237","Type":"ContainerStarted","Data":"00c9aec972ccc889c267a8674b92b068b7f18f6f77e302ded3904a531ffc6a65"} Mar 18 09:18:56 crc kubenswrapper[4778]: I0318 09:18:56.456324 4778 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wq58v"] Mar 18 09:18:57 crc kubenswrapper[4778]: I0318 09:18:57.064121 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-v7qxm"] Mar 18 09:18:57 crc kubenswrapper[4778]: I0318 09:18:57.065169 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-v7qxm" Mar 18 09:18:57 crc kubenswrapper[4778]: I0318 09:18:57.079579 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-v7qxm"] Mar 18 09:18:57 crc kubenswrapper[4778]: I0318 09:18:57.175742 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d28wd\" (UniqueName: \"kubernetes.io/projected/c508c810-232f-48c1-8d15-bbbb118d2948-kube-api-access-d28wd\") pod \"openstack-operator-index-v7qxm\" (UID: \"c508c810-232f-48c1-8d15-bbbb118d2948\") " pod="openstack-operators/openstack-operator-index-v7qxm" Mar 18 09:18:57 crc kubenswrapper[4778]: I0318 09:18:57.277592 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d28wd\" (UniqueName: \"kubernetes.io/projected/c508c810-232f-48c1-8d15-bbbb118d2948-kube-api-access-d28wd\") pod \"openstack-operator-index-v7qxm\" (UID: \"c508c810-232f-48c1-8d15-bbbb118d2948\") " pod="openstack-operators/openstack-operator-index-v7qxm" Mar 18 09:18:57 crc kubenswrapper[4778]: I0318 09:18:57.300417 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d28wd\" (UniqueName: \"kubernetes.io/projected/c508c810-232f-48c1-8d15-bbbb118d2948-kube-api-access-d28wd\") pod \"openstack-operator-index-v7qxm\" (UID: \"c508c810-232f-48c1-8d15-bbbb118d2948\") " pod="openstack-operators/openstack-operator-index-v7qxm" Mar 18 09:18:57 crc kubenswrapper[4778]: I0318 09:18:57.389910 4778 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-v7qxm" Mar 18 09:18:57 crc kubenswrapper[4778]: I0318 09:18:57.849703 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-v7qxm"] Mar 18 09:18:57 crc kubenswrapper[4778]: I0318 09:18:57.920260 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v7qxm" event={"ID":"c508c810-232f-48c1-8d15-bbbb118d2948","Type":"ContainerStarted","Data":"fde3b5949c32154e6d1b4e565f49e7dea0a53d0b4d82148511eacb0596e02c86"} Mar 18 09:18:57 crc kubenswrapper[4778]: I0318 09:18:57.923316 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wq58v" event={"ID":"af88af30-254e-4fc3-a29c-a27a6c5fc237","Type":"ContainerStarted","Data":"ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0"} Mar 18 09:18:57 crc kubenswrapper[4778]: I0318 09:18:57.923634 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-wq58v" podUID="af88af30-254e-4fc3-a29c-a27a6c5fc237" containerName="registry-server" containerID="cri-o://ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0" gracePeriod=2 Mar 18 09:18:57 crc kubenswrapper[4778]: I0318 09:18:57.945017 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-wq58v" podStartSLOduration=2.648930152 podStartE2EDuration="4.944991118s" podCreationTimestamp="2026-03-18 09:18:53 +0000 UTC" firstStartedPulling="2026-03-18 09:18:54.448361222 +0000 UTC m=+1001.023106062" lastFinishedPulling="2026-03-18 09:18:56.744422178 +0000 UTC m=+1003.319167028" observedRunningTime="2026-03-18 09:18:57.942403207 +0000 UTC m=+1004.517148047" watchObservedRunningTime="2026-03-18 09:18:57.944991118 +0000 UTC m=+1004.519735968" Mar 18 09:18:58 crc kubenswrapper[4778]: 
I0318 09:18:58.320758 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wq58v" Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.397274 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rsch\" (UniqueName: \"kubernetes.io/projected/af88af30-254e-4fc3-a29c-a27a6c5fc237-kube-api-access-2rsch\") pod \"af88af30-254e-4fc3-a29c-a27a6c5fc237\" (UID: \"af88af30-254e-4fc3-a29c-a27a6c5fc237\") " Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.404106 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af88af30-254e-4fc3-a29c-a27a6c5fc237-kube-api-access-2rsch" (OuterVolumeSpecName: "kube-api-access-2rsch") pod "af88af30-254e-4fc3-a29c-a27a6c5fc237" (UID: "af88af30-254e-4fc3-a29c-a27a6c5fc237"). InnerVolumeSpecName "kube-api-access-2rsch". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.501363 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rsch\" (UniqueName: \"kubernetes.io/projected/af88af30-254e-4fc3-a29c-a27a6c5fc237-kube-api-access-2rsch\") on node \"crc\" DevicePath \"\"" Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.932229 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v7qxm" event={"ID":"c508c810-232f-48c1-8d15-bbbb118d2948","Type":"ContainerStarted","Data":"b07fb00176255682c76c04b0680ea7bbe25fa81d6391ca7ba238ec1a72cb8051"} Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.933911 4778 generic.go:334] "Generic (PLEG): container finished" podID="af88af30-254e-4fc3-a29c-a27a6c5fc237" containerID="ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0" exitCode=0 Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.933997 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wq58v" Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.933993 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wq58v" event={"ID":"af88af30-254e-4fc3-a29c-a27a6c5fc237","Type":"ContainerDied","Data":"ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0"} Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.934120 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wq58v" event={"ID":"af88af30-254e-4fc3-a29c-a27a6c5fc237","Type":"ContainerDied","Data":"00c9aec972ccc889c267a8674b92b068b7f18f6f77e302ded3904a531ffc6a65"} Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.934150 4778 scope.go:117] "RemoveContainer" containerID="ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0" Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.953633 4778 scope.go:117] "RemoveContainer" containerID="ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0" Mar 18 09:18:58 crc kubenswrapper[4778]: E0318 09:18:58.954173 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0\": container with ID starting with ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0 not found: ID does not exist" containerID="ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0" Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.954243 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0"} err="failed to get container status \"ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0\": rpc error: code = NotFound desc = could not find container 
\"ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0\": container with ID starting with ae9056bca4b830275b3e82276253569d108e0e3659d27ddb30902c7fa1ceb0f0 not found: ID does not exist" Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.961857 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-v7qxm" podStartSLOduration=1.905758721 podStartE2EDuration="1.961839554s" podCreationTimestamp="2026-03-18 09:18:57 +0000 UTC" firstStartedPulling="2026-03-18 09:18:57.868754394 +0000 UTC m=+1004.443499244" lastFinishedPulling="2026-03-18 09:18:57.924835237 +0000 UTC m=+1004.499580077" observedRunningTime="2026-03-18 09:18:58.957629118 +0000 UTC m=+1005.532373958" watchObservedRunningTime="2026-03-18 09:18:58.961839554 +0000 UTC m=+1005.536584394" Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.978413 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wq58v"] Mar 18 09:18:58 crc kubenswrapper[4778]: I0318 09:18:58.982076 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-wq58v"] Mar 18 09:19:00 crc kubenswrapper[4778]: I0318 09:19:00.147449 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:19:00 crc kubenswrapper[4778]: I0318 09:19:00.148032 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:19:00 crc kubenswrapper[4778]: I0318 09:19:00.197427 
4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af88af30-254e-4fc3-a29c-a27a6c5fc237" path="/var/lib/kubelet/pods/af88af30-254e-4fc3-a29c-a27a6c5fc237/volumes" Mar 18 09:19:07 crc kubenswrapper[4778]: I0318 09:19:07.390900 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-v7qxm" Mar 18 09:19:07 crc kubenswrapper[4778]: I0318 09:19:07.391859 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-v7qxm" Mar 18 09:19:07 crc kubenswrapper[4778]: I0318 09:19:07.483627 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-v7qxm" Mar 18 09:19:08 crc kubenswrapper[4778]: I0318 09:19:08.050983 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-v7qxm" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.634788 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb"] Mar 18 09:19:13 crc kubenswrapper[4778]: E0318 09:19:13.635650 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af88af30-254e-4fc3-a29c-a27a6c5fc237" containerName="registry-server" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.635669 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="af88af30-254e-4fc3-a29c-a27a6c5fc237" containerName="registry-server" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.635843 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="af88af30-254e-4fc3-a29c-a27a6c5fc237" containerName="registry-server" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.636857 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.639808 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-hqfxz" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.655005 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb"] Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.736599 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjnq5\" (UniqueName: \"kubernetes.io/projected/bf055ff8-8bbd-4628-a5ad-c765775e8f16-kube-api-access-jjnq5\") pod \"f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb\" (UID: \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\") " pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.736684 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf055ff8-8bbd-4628-a5ad-c765775e8f16-bundle\") pod \"f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb\" (UID: \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\") " pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.736787 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf055ff8-8bbd-4628-a5ad-c765775e8f16-util\") pod \"f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb\" (UID: \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\") " pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 
09:19:13.838412 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf055ff8-8bbd-4628-a5ad-c765775e8f16-util\") pod \"f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb\" (UID: \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\") " pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.838494 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjnq5\" (UniqueName: \"kubernetes.io/projected/bf055ff8-8bbd-4628-a5ad-c765775e8f16-kube-api-access-jjnq5\") pod \"f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb\" (UID: \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\") " pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.838537 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf055ff8-8bbd-4628-a5ad-c765775e8f16-bundle\") pod \"f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb\" (UID: \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\") " pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.839354 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf055ff8-8bbd-4628-a5ad-c765775e8f16-bundle\") pod \"f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb\" (UID: \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\") " pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.839582 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/bf055ff8-8bbd-4628-a5ad-c765775e8f16-util\") pod \"f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb\" (UID: \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\") " pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.862362 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjnq5\" (UniqueName: \"kubernetes.io/projected/bf055ff8-8bbd-4628-a5ad-c765775e8f16-kube-api-access-jjnq5\") pod \"f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb\" (UID: \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\") " pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:13 crc kubenswrapper[4778]: I0318 09:19:13.955615 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:14 crc kubenswrapper[4778]: I0318 09:19:14.227932 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb"] Mar 18 09:19:15 crc kubenswrapper[4778]: I0318 09:19:15.061148 4778 generic.go:334] "Generic (PLEG): container finished" podID="bf055ff8-8bbd-4628-a5ad-c765775e8f16" containerID="47836f9ce3326802fb91a0906c882b64a2ee8f615265698a7da9aa2e7af057a3" exitCode=0 Mar 18 09:19:15 crc kubenswrapper[4778]: I0318 09:19:15.061261 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" event={"ID":"bf055ff8-8bbd-4628-a5ad-c765775e8f16","Type":"ContainerDied","Data":"47836f9ce3326802fb91a0906c882b64a2ee8f615265698a7da9aa2e7af057a3"} Mar 18 09:19:15 crc kubenswrapper[4778]: I0318 09:19:15.061587 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" event={"ID":"bf055ff8-8bbd-4628-a5ad-c765775e8f16","Type":"ContainerStarted","Data":"8529cd59e47eae09ff5610bf217b94cdbf4dedff1d056cb9d0c8ab700e2b08b9"} Mar 18 09:19:16 crc kubenswrapper[4778]: I0318 09:19:16.072470 4778 generic.go:334] "Generic (PLEG): container finished" podID="bf055ff8-8bbd-4628-a5ad-c765775e8f16" containerID="84728ae18eda0b7e5585f55947b66e6ed6fb4b9198db7f6cb581530b764fd135" exitCode=0 Mar 18 09:19:16 crc kubenswrapper[4778]: I0318 09:19:16.072528 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" event={"ID":"bf055ff8-8bbd-4628-a5ad-c765775e8f16","Type":"ContainerDied","Data":"84728ae18eda0b7e5585f55947b66e6ed6fb4b9198db7f6cb581530b764fd135"} Mar 18 09:19:17 crc kubenswrapper[4778]: I0318 09:19:17.080737 4778 generic.go:334] "Generic (PLEG): container finished" podID="bf055ff8-8bbd-4628-a5ad-c765775e8f16" containerID="5b08df49541db16461b6000e0757b929059d014d5332c54dd24d345ac2134d55" exitCode=0 Mar 18 09:19:17 crc kubenswrapper[4778]: I0318 09:19:17.080821 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" event={"ID":"bf055ff8-8bbd-4628-a5ad-c765775e8f16","Type":"ContainerDied","Data":"5b08df49541db16461b6000e0757b929059d014d5332c54dd24d345ac2134d55"} Mar 18 09:19:18 crc kubenswrapper[4778]: I0318 09:19:18.400070 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:18 crc kubenswrapper[4778]: I0318 09:19:18.508410 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf055ff8-8bbd-4628-a5ad-c765775e8f16-util\") pod \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\" (UID: \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\") " Mar 18 09:19:18 crc kubenswrapper[4778]: I0318 09:19:18.508777 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf055ff8-8bbd-4628-a5ad-c765775e8f16-bundle\") pod \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\" (UID: \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\") " Mar 18 09:19:18 crc kubenswrapper[4778]: I0318 09:19:18.510083 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjnq5\" (UniqueName: \"kubernetes.io/projected/bf055ff8-8bbd-4628-a5ad-c765775e8f16-kube-api-access-jjnq5\") pod \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\" (UID: \"bf055ff8-8bbd-4628-a5ad-c765775e8f16\") " Mar 18 09:19:18 crc kubenswrapper[4778]: I0318 09:19:18.510683 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf055ff8-8bbd-4628-a5ad-c765775e8f16-bundle" (OuterVolumeSpecName: "bundle") pod "bf055ff8-8bbd-4628-a5ad-c765775e8f16" (UID: "bf055ff8-8bbd-4628-a5ad-c765775e8f16"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:19:18 crc kubenswrapper[4778]: I0318 09:19:18.518164 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf055ff8-8bbd-4628-a5ad-c765775e8f16-kube-api-access-jjnq5" (OuterVolumeSpecName: "kube-api-access-jjnq5") pod "bf055ff8-8bbd-4628-a5ad-c765775e8f16" (UID: "bf055ff8-8bbd-4628-a5ad-c765775e8f16"). InnerVolumeSpecName "kube-api-access-jjnq5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:19:18 crc kubenswrapper[4778]: I0318 09:19:18.529598 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf055ff8-8bbd-4628-a5ad-c765775e8f16-util" (OuterVolumeSpecName: "util") pod "bf055ff8-8bbd-4628-a5ad-c765775e8f16" (UID: "bf055ff8-8bbd-4628-a5ad-c765775e8f16"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:19:18 crc kubenswrapper[4778]: I0318 09:19:18.611949 4778 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf055ff8-8bbd-4628-a5ad-c765775e8f16-util\") on node \"crc\" DevicePath \"\"" Mar 18 09:19:18 crc kubenswrapper[4778]: I0318 09:19:18.612013 4778 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf055ff8-8bbd-4628-a5ad-c765775e8f16-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:19:18 crc kubenswrapper[4778]: I0318 09:19:18.612028 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjnq5\" (UniqueName: \"kubernetes.io/projected/bf055ff8-8bbd-4628-a5ad-c765775e8f16-kube-api-access-jjnq5\") on node \"crc\" DevicePath \"\"" Mar 18 09:19:19 crc kubenswrapper[4778]: I0318 09:19:19.113721 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" event={"ID":"bf055ff8-8bbd-4628-a5ad-c765775e8f16","Type":"ContainerDied","Data":"8529cd59e47eae09ff5610bf217b94cdbf4dedff1d056cb9d0c8ab700e2b08b9"} Mar 18 09:19:19 crc kubenswrapper[4778]: I0318 09:19:19.113773 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8529cd59e47eae09ff5610bf217b94cdbf4dedff1d056cb9d0c8ab700e2b08b9" Mar 18 09:19:19 crc kubenswrapper[4778]: I0318 09:19:19.113857 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb" Mar 18 09:19:25 crc kubenswrapper[4778]: I0318 09:19:25.860234 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb"] Mar 18 09:19:25 crc kubenswrapper[4778]: E0318 09:19:25.861418 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf055ff8-8bbd-4628-a5ad-c765775e8f16" containerName="pull" Mar 18 09:19:25 crc kubenswrapper[4778]: I0318 09:19:25.861438 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf055ff8-8bbd-4628-a5ad-c765775e8f16" containerName="pull" Mar 18 09:19:25 crc kubenswrapper[4778]: E0318 09:19:25.861465 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf055ff8-8bbd-4628-a5ad-c765775e8f16" containerName="util" Mar 18 09:19:25 crc kubenswrapper[4778]: I0318 09:19:25.861471 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf055ff8-8bbd-4628-a5ad-c765775e8f16" containerName="util" Mar 18 09:19:25 crc kubenswrapper[4778]: E0318 09:19:25.861477 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf055ff8-8bbd-4628-a5ad-c765775e8f16" containerName="extract" Mar 18 09:19:25 crc kubenswrapper[4778]: I0318 09:19:25.861484 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf055ff8-8bbd-4628-a5ad-c765775e8f16" containerName="extract" Mar 18 09:19:25 crc kubenswrapper[4778]: I0318 09:19:25.861601 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf055ff8-8bbd-4628-a5ad-c765775e8f16" containerName="extract" Mar 18 09:19:25 crc kubenswrapper[4778]: I0318 09:19:25.862152 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb" Mar 18 09:19:25 crc kubenswrapper[4778]: I0318 09:19:25.872689 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-rg2gx" Mar 18 09:19:25 crc kubenswrapper[4778]: I0318 09:19:25.889955 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb"] Mar 18 09:19:25 crc kubenswrapper[4778]: I0318 09:19:25.915467 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvwhv\" (UniqueName: \"kubernetes.io/projected/b8267dff-2541-481e-bc64-13eb8d19300b-kube-api-access-dvwhv\") pod \"openstack-operator-controller-init-654f4fc7f7-9d4pb\" (UID: \"b8267dff-2541-481e-bc64-13eb8d19300b\") " pod="openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb" Mar 18 09:19:26 crc kubenswrapper[4778]: I0318 09:19:26.017609 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvwhv\" (UniqueName: \"kubernetes.io/projected/b8267dff-2541-481e-bc64-13eb8d19300b-kube-api-access-dvwhv\") pod \"openstack-operator-controller-init-654f4fc7f7-9d4pb\" (UID: \"b8267dff-2541-481e-bc64-13eb8d19300b\") " pod="openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb" Mar 18 09:19:26 crc kubenswrapper[4778]: I0318 09:19:26.044666 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvwhv\" (UniqueName: \"kubernetes.io/projected/b8267dff-2541-481e-bc64-13eb8d19300b-kube-api-access-dvwhv\") pod \"openstack-operator-controller-init-654f4fc7f7-9d4pb\" (UID: \"b8267dff-2541-481e-bc64-13eb8d19300b\") " pod="openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb" Mar 18 09:19:26 crc kubenswrapper[4778]: I0318 09:19:26.183917 4778 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb" Mar 18 09:19:26 crc kubenswrapper[4778]: I0318 09:19:26.617833 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb"] Mar 18 09:19:27 crc kubenswrapper[4778]: I0318 09:19:27.260266 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb" event={"ID":"b8267dff-2541-481e-bc64-13eb8d19300b","Type":"ContainerStarted","Data":"a228eaf5ca5631c1c43cfb3184d487c9befddb1cb8e2cf870064bd86f470aabd"} Mar 18 09:19:30 crc kubenswrapper[4778]: I0318 09:19:30.147568 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:19:30 crc kubenswrapper[4778]: I0318 09:19:30.147956 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:19:32 crc kubenswrapper[4778]: I0318 09:19:32.292597 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb" event={"ID":"b8267dff-2541-481e-bc64-13eb8d19300b","Type":"ContainerStarted","Data":"2c37a1d2401dad6db96eaba502ecbd7b30e7afca3591df1484a47815bf556097"} Mar 18 09:19:32 crc kubenswrapper[4778]: I0318 09:19:32.293011 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb" Mar 18 09:19:32 crc 
kubenswrapper[4778]: I0318 09:19:32.344551 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb" podStartSLOduration=1.952699237 podStartE2EDuration="7.344525008s" podCreationTimestamp="2026-03-18 09:19:25 +0000 UTC" firstStartedPulling="2026-03-18 09:19:26.626712694 +0000 UTC m=+1033.201457534" lastFinishedPulling="2026-03-18 09:19:32.018538465 +0000 UTC m=+1038.593283305" observedRunningTime="2026-03-18 09:19:32.334563426 +0000 UTC m=+1038.909308336" watchObservedRunningTime="2026-03-18 09:19:32.344525008 +0000 UTC m=+1038.919269878" Mar 18 09:19:46 crc kubenswrapper[4778]: I0318 09:19:46.197428 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-654f4fc7f7-9d4pb" Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.146037 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563760-nvkp2"] Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.147912 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.147984 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.148056 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:20:00 crc 
kubenswrapper[4778]: I0318 09:20:00.148136 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563760-nvkp2" Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.148949 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ea61f21841c4e8176f6c189293fbb254c950982318bec3d46909c1b0e41c2f38"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.149051 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://ea61f21841c4e8176f6c189293fbb254c950982318bec3d46909c1b0e41c2f38" gracePeriod=600 Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.154097 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.154577 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.154733 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.159099 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563760-nvkp2"] Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.187912 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-766z8\" (UniqueName: \"kubernetes.io/projected/bc3bf93e-1b00-4852-b69b-0c8d701f56e3-kube-api-access-766z8\") pod 
\"auto-csr-approver-29563760-nvkp2\" (UID: \"bc3bf93e-1b00-4852-b69b-0c8d701f56e3\") " pod="openshift-infra/auto-csr-approver-29563760-nvkp2" Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.289459 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-766z8\" (UniqueName: \"kubernetes.io/projected/bc3bf93e-1b00-4852-b69b-0c8d701f56e3-kube-api-access-766z8\") pod \"auto-csr-approver-29563760-nvkp2\" (UID: \"bc3bf93e-1b00-4852-b69b-0c8d701f56e3\") " pod="openshift-infra/auto-csr-approver-29563760-nvkp2" Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.318864 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-766z8\" (UniqueName: \"kubernetes.io/projected/bc3bf93e-1b00-4852-b69b-0c8d701f56e3-kube-api-access-766z8\") pod \"auto-csr-approver-29563760-nvkp2\" (UID: \"bc3bf93e-1b00-4852-b69b-0c8d701f56e3\") " pod="openshift-infra/auto-csr-approver-29563760-nvkp2" Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.468780 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563760-nvkp2" Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.525987 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="ea61f21841c4e8176f6c189293fbb254c950982318bec3d46909c1b0e41c2f38" exitCode=0 Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.526067 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"ea61f21841c4e8176f6c189293fbb254c950982318bec3d46909c1b0e41c2f38"} Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.526339 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"7154b13c7f4cd2402d0304ed7b86d22cac3e7b544f6222d5ad0d8d8ac0463fe2"} Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.526361 4778 scope.go:117] "RemoveContainer" containerID="2bce801303fcc67febe135eadcc45500477ce5c50a4d9315c95b48aad6bc9b1d" Mar 18 09:20:00 crc kubenswrapper[4778]: I0318 09:20:00.965089 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563760-nvkp2"] Mar 18 09:20:01 crc kubenswrapper[4778]: I0318 09:20:01.534544 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563760-nvkp2" event={"ID":"bc3bf93e-1b00-4852-b69b-0c8d701f56e3","Type":"ContainerStarted","Data":"22f8d1716c9a755ffbb22364f776ec4335946e6e03b2e1fa7f170bfeb4ef8f31"} Mar 18 09:20:02 crc kubenswrapper[4778]: I0318 09:20:02.546253 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563760-nvkp2" 
event={"ID":"bc3bf93e-1b00-4852-b69b-0c8d701f56e3","Type":"ContainerStarted","Data":"3c3567d850d5fbfcade4077c9139b7f651174e9261a4f7a1ab2f40e22fce3000"} Mar 18 09:20:02 crc kubenswrapper[4778]: I0318 09:20:02.571763 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563760-nvkp2" podStartSLOduration=1.374263493 podStartE2EDuration="2.571710558s" podCreationTimestamp="2026-03-18 09:20:00 +0000 UTC" firstStartedPulling="2026-03-18 09:20:00.974938754 +0000 UTC m=+1067.549683594" lastFinishedPulling="2026-03-18 09:20:02.172385819 +0000 UTC m=+1068.747130659" observedRunningTime="2026-03-18 09:20:02.569963681 +0000 UTC m=+1069.144708521" watchObservedRunningTime="2026-03-18 09:20:02.571710558 +0000 UTC m=+1069.146455408" Mar 18 09:20:03 crc kubenswrapper[4778]: I0318 09:20:03.555457 4778 generic.go:334] "Generic (PLEG): container finished" podID="bc3bf93e-1b00-4852-b69b-0c8d701f56e3" containerID="3c3567d850d5fbfcade4077c9139b7f651174e9261a4f7a1ab2f40e22fce3000" exitCode=0 Mar 18 09:20:03 crc kubenswrapper[4778]: I0318 09:20:03.555506 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563760-nvkp2" event={"ID":"bc3bf93e-1b00-4852-b69b-0c8d701f56e3","Type":"ContainerDied","Data":"3c3567d850d5fbfcade4077c9139b7f651174e9261a4f7a1ab2f40e22fce3000"} Mar 18 09:20:04 crc kubenswrapper[4778]: I0318 09:20:04.924127 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563760-nvkp2" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.072665 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-766z8\" (UniqueName: \"kubernetes.io/projected/bc3bf93e-1b00-4852-b69b-0c8d701f56e3-kube-api-access-766z8\") pod \"bc3bf93e-1b00-4852-b69b-0c8d701f56e3\" (UID: \"bc3bf93e-1b00-4852-b69b-0c8d701f56e3\") " Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.081518 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc3bf93e-1b00-4852-b69b-0c8d701f56e3-kube-api-access-766z8" (OuterVolumeSpecName: "kube-api-access-766z8") pod "bc3bf93e-1b00-4852-b69b-0c8d701f56e3" (UID: "bc3bf93e-1b00-4852-b69b-0c8d701f56e3"). InnerVolumeSpecName "kube-api-access-766z8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.174631 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-766z8\" (UniqueName: \"kubernetes.io/projected/bc3bf93e-1b00-4852-b69b-0c8d701f56e3-kube-api-access-766z8\") on node \"crc\" DevicePath \"\"" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.347174 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt"] Mar 18 09:20:05 crc kubenswrapper[4778]: E0318 09:20:05.347794 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc3bf93e-1b00-4852-b69b-0c8d701f56e3" containerName="oc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.347811 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc3bf93e-1b00-4852-b69b-0c8d701f56e3" containerName="oc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.347967 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc3bf93e-1b00-4852-b69b-0c8d701f56e3" containerName="oc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 
09:20:05.348531 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.352026 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-5zcs7" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.361038 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.390259 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.391258 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.395830 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-dp9kk" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.408739 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.409513 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.412475 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.416607 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-mm4tx" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.425458 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.434539 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.435665 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.436618 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.437167 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.445655 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-8qxnf" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.445858 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-tngp4" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.455982 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.471238 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.472077 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.474836 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-x2sk4" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.479536 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.480547 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.484775 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.484965 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-dpc7h" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.486271 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq7zm\" (UniqueName: \"kubernetes.io/projected/124dc549-cb2a-4b1c-a610-093cf9b8c05d-kube-api-access-qq7zm\") pod \"horizon-operator-controller-manager-8464cc45fb-x7rnp\" (UID: \"124dc549-cb2a-4b1c-a610-093cf9b8c05d\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.486540 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thdfx\" (UniqueName: \"kubernetes.io/projected/aceb2f7b-585f-451a-83b8-e673965ada87-kube-api-access-thdfx\") pod \"heat-operator-controller-manager-67dd5f86f5-t5c4w\" (UID: \"aceb2f7b-585f-451a-83b8-e673965ada87\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.486578 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vb8t\" (UniqueName: \"kubernetes.io/projected/0526f654-9ddc-4495-bb04-be13e53b6a1b-kube-api-access-6vb8t\") pod \"cinder-operator-controller-manager-8d58dc466-wxftc\" (UID: \"0526f654-9ddc-4495-bb04-be13e53b6a1b\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.486609 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj4mr\" (UniqueName: \"kubernetes.io/projected/3390909b-6271-40dd-9662-0710f6866143-kube-api-access-hj4mr\") pod \"barbican-operator-controller-manager-59bc569d95-fsxlt\" (UID: \"3390909b-6271-40dd-9662-0710f6866143\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.486629 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert\") pod \"infra-operator-controller-manager-7b9c774f96-64c4x\" (UID: \"66d3bf3a-086c-4340-ba73-209f526fc33c\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.486683 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxwjf\" (UniqueName: \"kubernetes.io/projected/710ababb-0bee-441d-8dd0-e6a72ea2b2e3-kube-api-access-kxwjf\") pod \"designate-operator-controller-manager-588d4d986b-7mbx2\" (UID: \"710ababb-0bee-441d-8dd0-e6a72ea2b2e3\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.486925 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.498024 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.507061 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.513728 4778 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.514463 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.525579 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.537104 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-6zddf" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.546323 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.547393 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.550881 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-wfl7r" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.566306 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.571334 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563760-nvkp2" event={"ID":"bc3bf93e-1b00-4852-b69b-0c8d701f56e3","Type":"ContainerDied","Data":"22f8d1716c9a755ffbb22364f776ec4335946e6e03b2e1fa7f170bfeb4ef8f31"} Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.571377 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22f8d1716c9a755ffbb22364f776ec4335946e6e03b2e1fa7f170bfeb4ef8f31" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.571422 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563760-nvkp2" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.594284 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thdfx\" (UniqueName: \"kubernetes.io/projected/aceb2f7b-585f-451a-83b8-e673965ada87-kube-api-access-thdfx\") pod \"heat-operator-controller-manager-67dd5f86f5-t5c4w\" (UID: \"aceb2f7b-585f-451a-83b8-e673965ada87\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.594357 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vb8t\" (UniqueName: \"kubernetes.io/projected/0526f654-9ddc-4495-bb04-be13e53b6a1b-kube-api-access-6vb8t\") pod \"cinder-operator-controller-manager-8d58dc466-wxftc\" (UID: \"0526f654-9ddc-4495-bb04-be13e53b6a1b\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.594405 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwpvg\" (UniqueName: \"kubernetes.io/projected/b41dbd4a-33dd-4dca-9356-34c740e8063f-kube-api-access-jwpvg\") pod \"glance-operator-controller-manager-79df6bcc97-wb4pc\" (UID: \"b41dbd4a-33dd-4dca-9356-34c740e8063f\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.594433 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj4mr\" (UniqueName: \"kubernetes.io/projected/3390909b-6271-40dd-9662-0710f6866143-kube-api-access-hj4mr\") pod \"barbican-operator-controller-manager-59bc569d95-fsxlt\" (UID: \"3390909b-6271-40dd-9662-0710f6866143\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.594465 
4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert\") pod \"infra-operator-controller-manager-7b9c774f96-64c4x\" (UID: \"66d3bf3a-086c-4340-ba73-209f526fc33c\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.594500 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxwjf\" (UniqueName: \"kubernetes.io/projected/710ababb-0bee-441d-8dd0-e6a72ea2b2e3-kube-api-access-kxwjf\") pod \"designate-operator-controller-manager-588d4d986b-7mbx2\" (UID: \"710ababb-0bee-441d-8dd0-e6a72ea2b2e3\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.594593 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lkm5\" (UniqueName: \"kubernetes.io/projected/66d3bf3a-086c-4340-ba73-209f526fc33c-kube-api-access-5lkm5\") pod \"infra-operator-controller-manager-7b9c774f96-64c4x\" (UID: \"66d3bf3a-086c-4340-ba73-209f526fc33c\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.594644 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq7zm\" (UniqueName: \"kubernetes.io/projected/124dc549-cb2a-4b1c-a610-093cf9b8c05d-kube-api-access-qq7zm\") pod \"horizon-operator-controller-manager-8464cc45fb-x7rnp\" (UID: \"124dc549-cb2a-4b1c-a610-093cf9b8c05d\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp" Mar 18 09:20:05 crc kubenswrapper[4778]: E0318 09:20:05.595517 4778 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 09:20:05 crc 
kubenswrapper[4778]: E0318 09:20:05.595563 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert podName:66d3bf3a-086c-4340-ba73-209f526fc33c nodeName:}" failed. No retries permitted until 2026-03-18 09:20:06.095547236 +0000 UTC m=+1072.670292076 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert") pod "infra-operator-controller-manager-7b9c774f96-64c4x" (UID: "66d3bf3a-086c-4340-ba73-209f526fc33c") : secret "infra-operator-webhook-server-cert" not found Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.598955 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.603635 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.619247 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-qfjfv" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.619453 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-zpc92"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.621111 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpc92" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.626353 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thdfx\" (UniqueName: \"kubernetes.io/projected/aceb2f7b-585f-451a-83b8-e673965ada87-kube-api-access-thdfx\") pod \"heat-operator-controller-manager-67dd5f86f5-t5c4w\" (UID: \"aceb2f7b-585f-451a-83b8-e673965ada87\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.626538 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-2xnxj" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.662077 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vb8t\" (UniqueName: \"kubernetes.io/projected/0526f654-9ddc-4495-bb04-be13e53b6a1b-kube-api-access-6vb8t\") pod \"cinder-operator-controller-manager-8d58dc466-wxftc\" (UID: \"0526f654-9ddc-4495-bb04-be13e53b6a1b\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.662901 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq7zm\" (UniqueName: \"kubernetes.io/projected/124dc549-cb2a-4b1c-a610-093cf9b8c05d-kube-api-access-qq7zm\") pod \"horizon-operator-controller-manager-8464cc45fb-x7rnp\" (UID: \"124dc549-cb2a-4b1c-a610-093cf9b8c05d\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.663647 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.664515 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.666042 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj4mr\" (UniqueName: \"kubernetes.io/projected/3390909b-6271-40dd-9662-0710f6866143-kube-api-access-hj4mr\") pod \"barbican-operator-controller-manager-59bc569d95-fsxlt\" (UID: \"3390909b-6271-40dd-9662-0710f6866143\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.669689 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxwjf\" (UniqueName: \"kubernetes.io/projected/710ababb-0bee-441d-8dd0-e6a72ea2b2e3-kube-api-access-kxwjf\") pod \"designate-operator-controller-manager-588d4d986b-7mbx2\" (UID: \"710ababb-0bee-441d-8dd0-e6a72ea2b2e3\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.674978 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-7wkdx" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.691775 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.698282 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f65c\" (UniqueName: \"kubernetes.io/projected/211c991a-9406-4360-aa7f-830be3aa55db-kube-api-access-7f65c\") pod \"manila-operator-controller-manager-55f864c847-zpc92\" (UID: \"211c991a-9406-4360-aa7f-830be3aa55db\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpc92" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.698322 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxrp2\" (UniqueName: \"kubernetes.io/projected/37675366-70a8-4e0b-b92b-f7055547d918-kube-api-access-bxrp2\") pod \"mariadb-operator-controller-manager-67ccfc9778-47sbc\" (UID: \"37675366-70a8-4e0b-b92b-f7055547d918\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.698351 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lkm5\" (UniqueName: \"kubernetes.io/projected/66d3bf3a-086c-4340-ba73-209f526fc33c-kube-api-access-5lkm5\") pod \"infra-operator-controller-manager-7b9c774f96-64c4x\" (UID: \"66d3bf3a-086c-4340-ba73-209f526fc33c\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.698390 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js847\" (UniqueName: \"kubernetes.io/projected/3c86f76c-1617-45e9-9573-f6fd51803b45-kube-api-access-js847\") pod \"ironic-operator-controller-manager-6f787dddc9-fjjvl\" (UID: \"3c86f76c-1617-45e9-9573-f6fd51803b45\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl" 
Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.698427 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl9j2\" (UniqueName: \"kubernetes.io/projected/e1ec7bae-8e15-4844-84d2-ff5951d0be31-kube-api-access-hl9j2\") pod \"keystone-operator-controller-manager-768b96df4c-5xvtc\" (UID: \"e1ec7bae-8e15-4844-84d2-ff5951d0be31\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.698448 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwpvg\" (UniqueName: \"kubernetes.io/projected/b41dbd4a-33dd-4dca-9356-34c740e8063f-kube-api-access-jwpvg\") pod \"glance-operator-controller-manager-79df6bcc97-wb4pc\" (UID: \"b41dbd4a-33dd-4dca-9356-34c740e8063f\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.698485 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brz8f\" (UniqueName: \"kubernetes.io/projected/ae690990-eeb1-4871-8c51-dd3b547e1193-kube-api-access-brz8f\") pod \"neutron-operator-controller-manager-767865f676-k4r2p\" (UID: \"ae690990-eeb1-4871-8c51-dd3b547e1193\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.705002 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.706951 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.734744 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lkm5\" (UniqueName: \"kubernetes.io/projected/66d3bf3a-086c-4340-ba73-209f526fc33c-kube-api-access-5lkm5\") pod \"infra-operator-controller-manager-7b9c774f96-64c4x\" (UID: \"66d3bf3a-086c-4340-ba73-209f526fc33c\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.736005 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-zpc92"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.746533 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwpvg\" (UniqueName: \"kubernetes.io/projected/b41dbd4a-33dd-4dca-9356-34c740e8063f-kube-api-access-jwpvg\") pod \"glance-operator-controller-manager-79df6bcc97-wb4pc\" (UID: \"b41dbd4a-33dd-4dca-9356-34c740e8063f\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.747240 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.767341 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.778912 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.785140 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.799867 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.803737 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js847\" (UniqueName: \"kubernetes.io/projected/3c86f76c-1617-45e9-9573-f6fd51803b45-kube-api-access-js847\") pod \"ironic-operator-controller-manager-6f787dddc9-fjjvl\" (UID: \"3c86f76c-1617-45e9-9573-f6fd51803b45\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.803792 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl9j2\" (UniqueName: \"kubernetes.io/projected/e1ec7bae-8e15-4844-84d2-ff5951d0be31-kube-api-access-hl9j2\") pod \"keystone-operator-controller-manager-768b96df4c-5xvtc\" (UID: \"e1ec7bae-8e15-4844-84d2-ff5951d0be31\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.803840 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brz8f\" (UniqueName: \"kubernetes.io/projected/ae690990-eeb1-4871-8c51-dd3b547e1193-kube-api-access-brz8f\") pod \"neutron-operator-controller-manager-767865f676-k4r2p\" (UID: \"ae690990-eeb1-4871-8c51-dd3b547e1193\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.804043 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f65c\" (UniqueName: \"kubernetes.io/projected/211c991a-9406-4360-aa7f-830be3aa55db-kube-api-access-7f65c\") pod 
\"manila-operator-controller-manager-55f864c847-zpc92\" (UID: \"211c991a-9406-4360-aa7f-830be3aa55db\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpc92" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.804101 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxrp2\" (UniqueName: \"kubernetes.io/projected/37675366-70a8-4e0b-b92b-f7055547d918-kube-api-access-bxrp2\") pod \"mariadb-operator-controller-manager-67ccfc9778-47sbc\" (UID: \"37675366-70a8-4e0b-b92b-f7055547d918\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.818569 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.821684 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.829313 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js847\" (UniqueName: \"kubernetes.io/projected/3c86f76c-1617-45e9-9573-f6fd51803b45-kube-api-access-js847\") pod \"ironic-operator-controller-manager-6f787dddc9-fjjvl\" (UID: \"3c86f76c-1617-45e9-9573-f6fd51803b45\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.829990 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxrp2\" (UniqueName: \"kubernetes.io/projected/37675366-70a8-4e0b-b92b-f7055547d918-kube-api-access-bxrp2\") pod \"mariadb-operator-controller-manager-67ccfc9778-47sbc\" (UID: \"37675366-70a8-4e0b-b92b-f7055547d918\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc" Mar 18 09:20:05 crc kubenswrapper[4778]: 
I0318 09:20:05.831528 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-jn5mq" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.837564 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.838553 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brz8f\" (UniqueName: \"kubernetes.io/projected/ae690990-eeb1-4871-8c51-dd3b547e1193-kube-api-access-brz8f\") pod \"neutron-operator-controller-manager-767865f676-k4r2p\" (UID: \"ae690990-eeb1-4871-8c51-dd3b547e1193\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.850275 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f65c\" (UniqueName: \"kubernetes.io/projected/211c991a-9406-4360-aa7f-830be3aa55db-kube-api-access-7f65c\") pod \"manila-operator-controller-manager-55f864c847-zpc92\" (UID: \"211c991a-9406-4360-aa7f-830be3aa55db\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpc92" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.855062 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.858599 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl9j2\" (UniqueName: \"kubernetes.io/projected/e1ec7bae-8e15-4844-84d2-ff5951d0be31-kube-api-access-hl9j2\") pod \"keystone-operator-controller-manager-768b96df4c-5xvtc\" (UID: \"e1ec7bae-8e15-4844-84d2-ff5951d0be31\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.865819 
4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.900029 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.901364 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.908166 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-9j75f" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.921034 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.922245 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.926593 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-gd4mk" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.927029 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.953546 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt"] Mar 18 09:20:05 crc kubenswrapper[4778]: I0318 09:20:05.987889 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563754-8p782"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:05.991988 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:05.995691 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563754-8p782"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.001067 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.007318 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpc92" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.009370 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.010178 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.011065 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhg5h\" (UniqueName: \"kubernetes.io/projected/c776af1e-ad54-40fe-9bed-a0a09ce0eea7-kube-api-access-mhg5h\") pod \"octavia-operator-controller-manager-5b9f45d989-pzjdt\" (UID: \"c776af1e-ad54-40fe-9bed-a0a09ce0eea7\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.011127 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhg2s\" (UniqueName: \"kubernetes.io/projected/e245908e-e35e-403c-93f6-48371904ae42-kube-api-access-nhg2s\") pod \"nova-operator-controller-manager-5d488d59fb-h6whs\" (UID: \"e245908e-e35e-403c-93f6-48371904ae42\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.013520 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-qvdrw" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.022282 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.030304 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.031251 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.044951 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.047467 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-nz525" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.082295 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.099372 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.100248 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.105944 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-5qm2h" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.117557 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-xdgmv\" (UID: \"80822932-2943-4f81-9436-1553ed031359\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.117621 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmkvs\" (UniqueName: \"kubernetes.io/projected/208b26f2-3c91-4966-9d01-8fe73e4a7d87-kube-api-access-xmkvs\") pod \"ovn-operator-controller-manager-884679f54-fgfk9\" (UID: \"208b26f2-3c91-4966-9d01-8fe73e4a7d87\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.117651 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvrdr\" (UniqueName: \"kubernetes.io/projected/80822932-2943-4f81-9436-1553ed031359-kube-api-access-rvrdr\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-xdgmv\" (UID: \"80822932-2943-4f81-9436-1553ed031359\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.117674 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhg5h\" (UniqueName: 
\"kubernetes.io/projected/c776af1e-ad54-40fe-9bed-a0a09ce0eea7-kube-api-access-mhg5h\") pod \"octavia-operator-controller-manager-5b9f45d989-pzjdt\" (UID: \"c776af1e-ad54-40fe-9bed-a0a09ce0eea7\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.117697 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert\") pod \"infra-operator-controller-manager-7b9c774f96-64c4x\" (UID: \"66d3bf3a-086c-4340-ba73-209f526fc33c\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.117719 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhg2s\" (UniqueName: \"kubernetes.io/projected/e245908e-e35e-403c-93f6-48371904ae42-kube-api-access-nhg2s\") pod \"nova-operator-controller-manager-5d488d59fb-h6whs\" (UID: \"e245908e-e35e-403c-93f6-48371904ae42\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs" Mar 18 09:20:06 crc kubenswrapper[4778]: E0318 09:20:06.118183 4778 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 09:20:06 crc kubenswrapper[4778]: E0318 09:20:06.118242 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert podName:66d3bf3a-086c-4340-ba73-209f526fc33c nodeName:}" failed. No retries permitted until 2026-03-18 09:20:07.118229648 +0000 UTC m=+1073.692974488 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert") pod "infra-operator-controller-manager-7b9c774f96-64c4x" (UID: "66d3bf3a-086c-4340-ba73-209f526fc33c") : secret "infra-operator-webhook-server-cert" not found Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.123956 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.153267 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.154090 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.157559 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-9pmgz" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.169901 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhg2s\" (UniqueName: \"kubernetes.io/projected/e245908e-e35e-403c-93f6-48371904ae42-kube-api-access-nhg2s\") pod \"nova-operator-controller-manager-5d488d59fb-h6whs\" (UID: \"e245908e-e35e-403c-93f6-48371904ae42\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.172156 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhg5h\" (UniqueName: \"kubernetes.io/projected/c776af1e-ad54-40fe-9bed-a0a09ce0eea7-kube-api-access-mhg5h\") pod \"octavia-operator-controller-manager-5b9f45d989-pzjdt\" (UID: \"c776af1e-ad54-40fe-9bed-a0a09ce0eea7\") " 
pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.219413 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="913fd7d5-c271-4918-992c-95e6048faa85" path="/var/lib/kubelet/pods/913fd7d5-c271-4918-992c-95e6048faa85/volumes" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.220050 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.221265 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-xdgmv\" (UID: \"80822932-2943-4f81-9436-1553ed031359\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.221317 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmkvs\" (UniqueName: \"kubernetes.io/projected/208b26f2-3c91-4966-9d01-8fe73e4a7d87-kube-api-access-xmkvs\") pod \"ovn-operator-controller-manager-884679f54-fgfk9\" (UID: \"208b26f2-3c91-4966-9d01-8fe73e4a7d87\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.221347 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvrdr\" (UniqueName: \"kubernetes.io/projected/80822932-2943-4f81-9436-1553ed031359-kube-api-access-rvrdr\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-xdgmv\" (UID: \"80822932-2943-4f81-9436-1553ed031359\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.221369 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qfp8\" (UniqueName: \"kubernetes.io/projected/8ccabb3b-da59-4ab0-89c8-99094a939f0d-kube-api-access-7qfp8\") pod \"swift-operator-controller-manager-c674c5965-c6l5k\" (UID: \"8ccabb3b-da59-4ab0-89c8-99094a939f0d\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.221407 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hglbk\" (UniqueName: \"kubernetes.io/projected/2f8e8860-00a1-43fc-9776-c617f270cc50-kube-api-access-hglbk\") pod \"placement-operator-controller-manager-5784578c99-d5w9q\" (UID: \"2f8e8860-00a1-43fc-9776-c617f270cc50\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q" Mar 18 09:20:06 crc kubenswrapper[4778]: E0318 09:20:06.221523 4778 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 09:20:06 crc kubenswrapper[4778]: E0318 09:20:06.221559 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert podName:80822932-2943-4f81-9436-1553ed031359 nodeName:}" failed. No retries permitted until 2026-03-18 09:20:06.721546574 +0000 UTC m=+1073.296291414 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" (UID: "80822932-2943-4f81-9436-1553ed031359") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.245218 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmkvs\" (UniqueName: \"kubernetes.io/projected/208b26f2-3c91-4966-9d01-8fe73e4a7d87-kube-api-access-xmkvs\") pod \"ovn-operator-controller-manager-884679f54-fgfk9\" (UID: \"208b26f2-3c91-4966-9d01-8fe73e4a7d87\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.278931 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.282264 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.320582 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvrdr\" (UniqueName: \"kubernetes.io/projected/80822932-2943-4f81-9436-1553ed031359-kube-api-access-rvrdr\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-xdgmv\" (UID: \"80822932-2943-4f81-9436-1553ed031359\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.371067 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qfp8\" (UniqueName: \"kubernetes.io/projected/8ccabb3b-da59-4ab0-89c8-99094a939f0d-kube-api-access-7qfp8\") pod \"swift-operator-controller-manager-c674c5965-c6l5k\" (UID: 
\"8ccabb3b-da59-4ab0-89c8-99094a939f0d\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.371513 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hglbk\" (UniqueName: \"kubernetes.io/projected/2f8e8860-00a1-43fc-9776-c617f270cc50-kube-api-access-hglbk\") pod \"placement-operator-controller-manager-5784578c99-d5w9q\" (UID: \"2f8e8860-00a1-43fc-9776-c617f270cc50\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.372446 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8vhl\" (UniqueName: \"kubernetes.io/projected/9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77-kube-api-access-n8vhl\") pod \"telemetry-operator-controller-manager-d6b694c5-tx9zq\" (UID: \"9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.385033 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.388969 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-2d5qf" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.392798 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.411837 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hglbk\" (UniqueName: \"kubernetes.io/projected/2f8e8860-00a1-43fc-9776-c617f270cc50-kube-api-access-hglbk\") pod \"placement-operator-controller-manager-5784578c99-d5w9q\" (UID: \"2f8e8860-00a1-43fc-9776-c617f270cc50\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.423064 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qfp8\" (UniqueName: \"kubernetes.io/projected/8ccabb3b-da59-4ab0-89c8-99094a939f0d-kube-api-access-7qfp8\") pod \"swift-operator-controller-manager-c674c5965-c6l5k\" (UID: \"8ccabb3b-da59-4ab0-89c8-99094a939f0d\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.447685 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.449544 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.456638 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.460988 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.473833 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8vhl\" (UniqueName: \"kubernetes.io/projected/9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77-kube-api-access-n8vhl\") pod \"telemetry-operator-controller-manager-d6b694c5-tx9zq\" (UID: \"9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.475387 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.481845 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-nfjkg" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.500584 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8vhl\" (UniqueName: \"kubernetes.io/projected/9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77-kube-api-access-n8vhl\") pod \"telemetry-operator-controller-manager-d6b694c5-tx9zq\" (UID: \"9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.506055 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.507050 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.509391 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-26qtx" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.511285 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.511400 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.523862 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.534345 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.535257 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.538673 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-w2tqv" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.564142 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.577652 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9wb2\" (UniqueName: \"kubernetes.io/projected/57277339-c9be-4de1-8e35-72ae98d33905-kube-api-access-v9wb2\") pod \"watcher-operator-controller-manager-6c4d75f7f9-sgs49\" (UID: \"57277339-c9be-4de1-8e35-72ae98d33905\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.578135 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v86w\" (UniqueName: \"kubernetes.io/projected/99adb6be-2a3e-4148-8074-9258222ebd60-kube-api-access-9v86w\") pod \"test-operator-controller-manager-54c5f5bc8-jsm76\" (UID: \"99adb6be-2a3e-4148-8074-9258222ebd60\") " pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.581770 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc"] Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.583642 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc" 
event={"ID":"0526f654-9ddc-4495-bb04-be13e53b6a1b","Type":"ContainerStarted","Data":"eb8d809482f9c01c7a6535ee3ca7de1c4aa288bf8b0d49f0ffaf77bde9f11ffc"} Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.662881 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.680603 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hljh\" (UniqueName: \"kubernetes.io/projected/3c7e3158-5139-467d-b33c-808747f0d9be-kube-api-access-7hljh\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.680647 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8qp9\" (UniqueName: \"kubernetes.io/projected/b837636e-8c09-42b7-9a81-e7875df68344-kube-api-access-g8qp9\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5jrv8\" (UID: \"b837636e-8c09-42b7-9a81-e7875df68344\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.680914 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9wb2\" (UniqueName: \"kubernetes.io/projected/57277339-c9be-4de1-8e35-72ae98d33905-kube-api-access-v9wb2\") pod \"watcher-operator-controller-manager-6c4d75f7f9-sgs49\" (UID: \"57277339-c9be-4de1-8e35-72ae98d33905\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.680934 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v86w\" (UniqueName: 
\"kubernetes.io/projected/99adb6be-2a3e-4148-8074-9258222ebd60-kube-api-access-9v86w\") pod \"test-operator-controller-manager-54c5f5bc8-jsm76\" (UID: \"99adb6be-2a3e-4148-8074-9258222ebd60\") " pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.680954 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.680980 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.700623 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v86w\" (UniqueName: \"kubernetes.io/projected/99adb6be-2a3e-4148-8074-9258222ebd60-kube-api-access-9v86w\") pod \"test-operator-controller-manager-54c5f5bc8-jsm76\" (UID: \"99adb6be-2a3e-4148-8074-9258222ebd60\") " pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.702468 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9wb2\" (UniqueName: \"kubernetes.io/projected/57277339-c9be-4de1-8e35-72ae98d33905-kube-api-access-v9wb2\") pod \"watcher-operator-controller-manager-6c4d75f7f9-sgs49\" (UID: 
\"57277339-c9be-4de1-8e35-72ae98d33905\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.711897 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.741077 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.781654 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.782240 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.782508 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hljh\" (UniqueName: \"kubernetes.io/projected/3c7e3158-5139-467d-b33c-808747f0d9be-kube-api-access-7hljh\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.782542 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8qp9\" (UniqueName: \"kubernetes.io/projected/b837636e-8c09-42b7-9a81-e7875df68344-kube-api-access-g8qp9\") pod 
\"rabbitmq-cluster-operator-manager-668c99d594-5jrv8\" (UID: \"b837636e-8c09-42b7-9a81-e7875df68344\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.782620 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-xdgmv\" (UID: \"80822932-2943-4f81-9436-1553ed031359\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.782782 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:06 crc kubenswrapper[4778]: E0318 09:20:06.782971 4778 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 09:20:06 crc kubenswrapper[4778]: E0318 09:20:06.783041 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs podName:3c7e3158-5139-467d-b33c-808747f0d9be nodeName:}" failed. No retries permitted until 2026-03-18 09:20:07.283018328 +0000 UTC m=+1073.857763168 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs") pod "openstack-operator-controller-manager-f5c7df4d7-m4kvr" (UID: "3c7e3158-5139-467d-b33c-808747f0d9be") : secret "webhook-server-cert" not found Mar 18 09:20:06 crc kubenswrapper[4778]: E0318 09:20:06.783438 4778 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 09:20:06 crc kubenswrapper[4778]: E0318 09:20:06.783512 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs podName:3c7e3158-5139-467d-b33c-808747f0d9be nodeName:}" failed. No retries permitted until 2026-03-18 09:20:07.28348592 +0000 UTC m=+1073.858230770 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs") pod "openstack-operator-controller-manager-f5c7df4d7-m4kvr" (UID: "3c7e3158-5139-467d-b33c-808747f0d9be") : secret "metrics-server-cert" not found Mar 18 09:20:06 crc kubenswrapper[4778]: E0318 09:20:06.783688 4778 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 09:20:06 crc kubenswrapper[4778]: E0318 09:20:06.783760 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert podName:80822932-2943-4f81-9436-1553ed031359 nodeName:}" failed. No retries permitted until 2026-03-18 09:20:07.783736877 +0000 UTC m=+1074.358481897 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" (UID: "80822932-2943-4f81-9436-1553ed031359") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.811722 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8qp9\" (UniqueName: \"kubernetes.io/projected/b837636e-8c09-42b7-9a81-e7875df68344-kube-api-access-g8qp9\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5jrv8\" (UID: \"b837636e-8c09-42b7-9a81-e7875df68344\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.811981 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hljh\" (UniqueName: \"kubernetes.io/projected/3c7e3158-5139-467d-b33c-808747f0d9be-kube-api-access-7hljh\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.824223 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49" Mar 18 09:20:06 crc kubenswrapper[4778]: I0318 09:20:06.926578 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8" Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.034294 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt"] Mar 18 09:20:07 crc kubenswrapper[4778]: W0318 09:20:07.048336 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3390909b_6271_40dd_9662_0710f6866143.slice/crio-086c9a7df1f7230c0b32e2266231d5ffe8dcd8589f29376e627e50b57e82371b WatchSource:0}: Error finding container 086c9a7df1f7230c0b32e2266231d5ffe8dcd8589f29376e627e50b57e82371b: Status 404 returned error can't find the container with id 086c9a7df1f7230c0b32e2266231d5ffe8dcd8589f29376e627e50b57e82371b Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.137438 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p"] Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.146768 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w"] Mar 18 09:20:07 crc kubenswrapper[4778]: W0318 09:20:07.150781 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae690990_eeb1_4871_8c51_dd3b547e1193.slice/crio-b58a71c985266ccba8ccd3164eb13041ddf54915bc6689a49db9964daf1899b2 WatchSource:0}: Error finding container b58a71c985266ccba8ccd3164eb13041ddf54915bc6689a49db9964daf1899b2: Status 404 returned error can't find the container with id b58a71c985266ccba8ccd3164eb13041ddf54915bc6689a49db9964daf1899b2 Mar 18 09:20:07 crc kubenswrapper[4778]: W0318 09:20:07.152364 4778 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaceb2f7b_585f_451a_83b8_e673965ada87.slice/crio-f80fdaa618cf362fa439c3bc2438737726c2fbec6e449966c37f37aac4ddbbd5 WatchSource:0}: Error finding container f80fdaa618cf362fa439c3bc2438737726c2fbec6e449966c37f37aac4ddbbd5: Status 404 returned error can't find the container with id f80fdaa618cf362fa439c3bc2438737726c2fbec6e449966c37f37aac4ddbbd5 Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.169614 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2"] Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.194096 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert\") pod \"infra-operator-controller-manager-7b9c774f96-64c4x\" (UID: \"66d3bf3a-086c-4340-ba73-209f526fc33c\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:07 crc kubenswrapper[4778]: E0318 09:20:07.194434 4778 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 09:20:07 crc kubenswrapper[4778]: E0318 09:20:07.194792 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert podName:66d3bf3a-086c-4340-ba73-209f526fc33c nodeName:}" failed. No retries permitted until 2026-03-18 09:20:09.194771557 +0000 UTC m=+1075.769516397 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert") pod "infra-operator-controller-manager-7b9c774f96-64c4x" (UID: "66d3bf3a-086c-4340-ba73-209f526fc33c") : secret "infra-operator-webhook-server-cert" not found Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.295329 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.295387 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:07 crc kubenswrapper[4778]: E0318 09:20:07.295553 4778 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 09:20:07 crc kubenswrapper[4778]: E0318 09:20:07.295552 4778 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 09:20:07 crc kubenswrapper[4778]: E0318 09:20:07.295610 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs podName:3c7e3158-5139-467d-b33c-808747f0d9be nodeName:}" failed. No retries permitted until 2026-03-18 09:20:08.295593404 +0000 UTC m=+1074.870338244 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs") pod "openstack-operator-controller-manager-f5c7df4d7-m4kvr" (UID: "3c7e3158-5139-467d-b33c-808747f0d9be") : secret "metrics-server-cert" not found Mar 18 09:20:07 crc kubenswrapper[4778]: E0318 09:20:07.295638 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs podName:3c7e3158-5139-467d-b33c-808747f0d9be nodeName:}" failed. No retries permitted until 2026-03-18 09:20:08.295620835 +0000 UTC m=+1074.870365675 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs") pod "openstack-operator-controller-manager-f5c7df4d7-m4kvr" (UID: "3c7e3158-5139-467d-b33c-808747f0d9be") : secret "webhook-server-cert" not found Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.564477 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9"] Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.583102 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp"] Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.606383 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc"] Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.644221 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl"] Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.645581 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt" 
event={"ID":"3390909b-6271-40dd-9662-0710f6866143","Type":"ContainerStarted","Data":"086c9a7df1f7230c0b32e2266231d5ffe8dcd8589f29376e627e50b57e82371b"} Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.666372 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2" event={"ID":"710ababb-0bee-441d-8dd0-e6a72ea2b2e3","Type":"ContainerStarted","Data":"701f288d7559807f8c005c2e53b1c0bc2db223255c618f71e329c0fb88c12c9f"} Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.677452 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p" event={"ID":"ae690990-eeb1-4871-8c51-dd3b547e1193","Type":"ContainerStarted","Data":"b58a71c985266ccba8ccd3164eb13041ddf54915bc6689a49db9964daf1899b2"} Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.683599 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-zpc92"] Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.690743 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc"] Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.691055 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w" event={"ID":"aceb2f7b-585f-451a-83b8-e673965ada87","Type":"ContainerStarted","Data":"f80fdaa618cf362fa439c3bc2438737726c2fbec6e449966c37f37aac4ddbbd5"} Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.692229 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9" event={"ID":"208b26f2-3c91-4966-9d01-8fe73e4a7d87","Type":"ContainerStarted","Data":"c03f94117048ef2701ab7fc3d7574fab74dac0b36c93a529221c3fad573c76ae"} Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.703226 
4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc"] Mar 18 09:20:07 crc kubenswrapper[4778]: I0318 09:20:07.816790 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-xdgmv\" (UID: \"80822932-2943-4f81-9436-1553ed031359\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:07 crc kubenswrapper[4778]: E0318 09:20:07.817075 4778 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 09:20:07 crc kubenswrapper[4778]: E0318 09:20:07.817127 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert podName:80822932-2943-4f81-9436-1553ed031359 nodeName:}" failed. No retries permitted until 2026-03-18 09:20:09.817110705 +0000 UTC m=+1076.391855545 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" (UID: "80822932-2943-4f81-9436-1553ed031359") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.033293 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q"] Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.049558 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8"] Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.056067 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt"] Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.105577 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76"] Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.113553 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.94:5001/openstack-k8s-operators/test-operator:9967a65233f8c87751fea24bb23667f563a71e91,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9v86w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-54c5f5bc8-jsm76_openstack-operators(99adb6be-2a3e-4148-8074-9258222ebd60): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.114942 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" podUID="99adb6be-2a3e-4148-8074-9258222ebd60" Mar 18 09:20:08 crc 
kubenswrapper[4778]: I0318 09:20:08.120129 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49"] Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.127875 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k"] Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.136046 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq"] Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.137115 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7qfp8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-c6l5k_openstack-operators(8ccabb3b-da59-4ab0-89c8-99094a939f0d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.138242 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" podUID="8ccabb3b-da59-4ab0-89c8-99094a939f0d" Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.171070 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n8vhl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-tx9zq_openstack-operators(9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.172668 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" podUID="9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77" Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.298127 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs"] Mar 18 09:20:08 crc kubenswrapper[4778]: W0318 09:20:08.306899 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode245908e_e35e_403c_93f6_48371904ae42.slice/crio-01d8b950e77b0b6197f6cd86545e3fa36bc1bd06f7bf5183be587ebc228ab213 WatchSource:0}: Error finding container 01d8b950e77b0b6197f6cd86545e3fa36bc1bd06f7bf5183be587ebc228ab213: Status 404 returned error can't find the container with id 
01d8b950e77b0b6197f6cd86545e3fa36bc1bd06f7bf5183be587ebc228ab213 Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.364294 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.364441 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.364463 4778 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.364535 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs podName:3c7e3158-5139-467d-b33c-808747f0d9be nodeName:}" failed. No retries permitted until 2026-03-18 09:20:10.364514903 +0000 UTC m=+1076.939259743 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs") pod "openstack-operator-controller-manager-f5c7df4d7-m4kvr" (UID: "3c7e3158-5139-467d-b33c-808747f0d9be") : secret "webhook-server-cert" not found Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.364613 4778 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.364696 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs podName:3c7e3158-5139-467d-b33c-808747f0d9be nodeName:}" failed. No retries permitted until 2026-03-18 09:20:10.364677918 +0000 UTC m=+1076.939422838 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs") pod "openstack-operator-controller-manager-f5c7df4d7-m4kvr" (UID: "3c7e3158-5139-467d-b33c-808747f0d9be") : secret "metrics-server-cert" not found Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.734025 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" event={"ID":"9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77","Type":"ContainerStarted","Data":"795c63e5cf86fe706d352e5445e86b98a2b4c82f8f2ea1e6362fcb144e9e73b1"} Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.739928 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49" event={"ID":"57277339-c9be-4de1-8e35-72ae98d33905","Type":"ContainerStarted","Data":"51c0c3efe70fa092841fcb028a10ddaf784e0b252d86815df04fbbbf37aa2629"} Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.740061 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" podUID="9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77" Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.747474 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpc92" event={"ID":"211c991a-9406-4360-aa7f-830be3aa55db","Type":"ContainerStarted","Data":"01d059d0646cde57bf5523de3e51f528caaa9b9bc794c54a4ca930e45012c328"} Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.763292 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" event={"ID":"99adb6be-2a3e-4148-8074-9258222ebd60","Type":"ContainerStarted","Data":"fc35a36a6691c170ce0569d7b5df285f4f48fea713f2d75bf7afe3371adac118"} Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.768582 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.94:5001/openstack-k8s-operators/test-operator:9967a65233f8c87751fea24bb23667f563a71e91\\\"\"" pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" podUID="99adb6be-2a3e-4148-8074-9258222ebd60" Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.801409 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc" event={"ID":"e1ec7bae-8e15-4844-84d2-ff5951d0be31","Type":"ContainerStarted","Data":"fc19af69754f5f4ca9176ab7c8b953e0ef52b3a725ae68203cbb3f2c37e5c075"} Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.813619 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" event={"ID":"8ccabb3b-da59-4ab0-89c8-99094a939f0d","Type":"ContainerStarted","Data":"1c88c87f30c9adb675e35054d8a0ccb0acf8edfdc742e0e34ca13af1fdc0a714"} Mar 18 09:20:08 crc kubenswrapper[4778]: E0318 09:20:08.817805 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" podUID="8ccabb3b-da59-4ab0-89c8-99094a939f0d" Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.831683 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs" event={"ID":"e245908e-e35e-403c-93f6-48371904ae42","Type":"ContainerStarted","Data":"01d8b950e77b0b6197f6cd86545e3fa36bc1bd06f7bf5183be587ebc228ab213"} Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.863367 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc" event={"ID":"37675366-70a8-4e0b-b92b-f7055547d918","Type":"ContainerStarted","Data":"5855588df5d3c9957d6cdf1268c1f0e433a65c63dc5c72a9001b1e42ffdc55aa"} Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.884326 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl" event={"ID":"3c86f76c-1617-45e9-9573-f6fd51803b45","Type":"ContainerStarted","Data":"d0e1e9d9222e9ad294c135a60aa2d7652c89b85238929a2ae3e6aff5506a4e6c"} Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.887056 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp" 
event={"ID":"124dc549-cb2a-4b1c-a610-093cf9b8c05d","Type":"ContainerStarted","Data":"f486ef9a0d7196edfde13693bef43f9bccd606f873a790f26e62a45e9993c955"} Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.889599 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q" event={"ID":"2f8e8860-00a1-43fc-9776-c617f270cc50","Type":"ContainerStarted","Data":"6ec80f52a1552e15b2b0561d37e4314c55f4023073d7e65b7c70b2200590abb6"} Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.891636 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt" event={"ID":"c776af1e-ad54-40fe-9bed-a0a09ce0eea7","Type":"ContainerStarted","Data":"fec3675798b6b01b76c3f155f0804049d1c805f3939460649722db5890f0911d"} Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.902034 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc" event={"ID":"b41dbd4a-33dd-4dca-9356-34c740e8063f","Type":"ContainerStarted","Data":"80c11d26efe778bd105fd3254ee2e04c03127cb3a8e73e18271fa73d7b5e6dac"} Mar 18 09:20:08 crc kubenswrapper[4778]: I0318 09:20:08.910490 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8" event={"ID":"b837636e-8c09-42b7-9a81-e7875df68344","Type":"ContainerStarted","Data":"4bea5c9de09771619869377e6c319249ef6faa2b9bb923b1dd81aeed42b5484a"} Mar 18 09:20:09 crc kubenswrapper[4778]: I0318 09:20:09.280500 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert\") pod \"infra-operator-controller-manager-7b9c774f96-64c4x\" (UID: \"66d3bf3a-086c-4340-ba73-209f526fc33c\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:09 crc 
kubenswrapper[4778]: E0318 09:20:09.280707 4778 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 09:20:09 crc kubenswrapper[4778]: E0318 09:20:09.280786 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert podName:66d3bf3a-086c-4340-ba73-209f526fc33c nodeName:}" failed. No retries permitted until 2026-03-18 09:20:13.280767258 +0000 UTC m=+1079.855512098 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert") pod "infra-operator-controller-manager-7b9c774f96-64c4x" (UID: "66d3bf3a-086c-4340-ba73-209f526fc33c") : secret "infra-operator-webhook-server-cert" not found Mar 18 09:20:09 crc kubenswrapper[4778]: I0318 09:20:09.894510 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-xdgmv\" (UID: \"80822932-2943-4f81-9436-1553ed031359\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:09 crc kubenswrapper[4778]: E0318 09:20:09.894877 4778 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 09:20:09 crc kubenswrapper[4778]: E0318 09:20:09.895049 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert podName:80822932-2943-4f81-9436-1553ed031359 nodeName:}" failed. No retries permitted until 2026-03-18 09:20:13.894968214 +0000 UTC m=+1080.469713054 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" (UID: "80822932-2943-4f81-9436-1553ed031359") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 09:20:09 crc kubenswrapper[4778]: E0318 09:20:09.946325 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" podUID="8ccabb3b-da59-4ab0-89c8-99094a939f0d" Mar 18 09:20:09 crc kubenswrapper[4778]: E0318 09:20:09.946436 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.94:5001/openstack-k8s-operators/test-operator:9967a65233f8c87751fea24bb23667f563a71e91\\\"\"" pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" podUID="99adb6be-2a3e-4148-8074-9258222ebd60" Mar 18 09:20:09 crc kubenswrapper[4778]: E0318 09:20:09.946491 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" podUID="9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77" Mar 18 09:20:10 crc kubenswrapper[4778]: I0318 09:20:10.402691 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs\") pod 
\"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:10 crc kubenswrapper[4778]: I0318 09:20:10.402750 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:10 crc kubenswrapper[4778]: E0318 09:20:10.402862 4778 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 09:20:10 crc kubenswrapper[4778]: E0318 09:20:10.402914 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs podName:3c7e3158-5139-467d-b33c-808747f0d9be nodeName:}" failed. No retries permitted until 2026-03-18 09:20:14.402899964 +0000 UTC m=+1080.977644804 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs") pod "openstack-operator-controller-manager-f5c7df4d7-m4kvr" (UID: "3c7e3158-5139-467d-b33c-808747f0d9be") : secret "metrics-server-cert" not found Mar 18 09:20:10 crc kubenswrapper[4778]: E0318 09:20:10.402919 4778 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 09:20:10 crc kubenswrapper[4778]: E0318 09:20:10.402993 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs podName:3c7e3158-5139-467d-b33c-808747f0d9be nodeName:}" failed. 
No retries permitted until 2026-03-18 09:20:14.402976366 +0000 UTC m=+1080.977721206 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs") pod "openstack-operator-controller-manager-f5c7df4d7-m4kvr" (UID: "3c7e3158-5139-467d-b33c-808747f0d9be") : secret "webhook-server-cert" not found Mar 18 09:20:13 crc kubenswrapper[4778]: I0318 09:20:13.355178 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert\") pod \"infra-operator-controller-manager-7b9c774f96-64c4x\" (UID: \"66d3bf3a-086c-4340-ba73-209f526fc33c\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:13 crc kubenswrapper[4778]: E0318 09:20:13.355977 4778 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 09:20:13 crc kubenswrapper[4778]: E0318 09:20:13.357701 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert podName:66d3bf3a-086c-4340-ba73-209f526fc33c nodeName:}" failed. No retries permitted until 2026-03-18 09:20:21.357666562 +0000 UTC m=+1087.932411402 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert") pod "infra-operator-controller-manager-7b9c774f96-64c4x" (UID: "66d3bf3a-086c-4340-ba73-209f526fc33c") : secret "infra-operator-webhook-server-cert" not found Mar 18 09:20:13 crc kubenswrapper[4778]: I0318 09:20:13.966895 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-xdgmv\" (UID: \"80822932-2943-4f81-9436-1553ed031359\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:13 crc kubenswrapper[4778]: E0318 09:20:13.967011 4778 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 09:20:13 crc kubenswrapper[4778]: E0318 09:20:13.967067 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert podName:80822932-2943-4f81-9436-1553ed031359 nodeName:}" failed. No retries permitted until 2026-03-18 09:20:21.967051116 +0000 UTC m=+1088.541795956 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" (UID: "80822932-2943-4f81-9436-1553ed031359") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 09:20:14 crc kubenswrapper[4778]: I0318 09:20:14.493405 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:14 crc kubenswrapper[4778]: E0318 09:20:14.493635 4778 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 09:20:14 crc kubenswrapper[4778]: E0318 09:20:14.494185 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs podName:3c7e3158-5139-467d-b33c-808747f0d9be nodeName:}" failed. No retries permitted until 2026-03-18 09:20:22.494152389 +0000 UTC m=+1089.068897439 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs") pod "openstack-operator-controller-manager-f5c7df4d7-m4kvr" (UID: "3c7e3158-5139-467d-b33c-808747f0d9be") : secret "webhook-server-cert" not found Mar 18 09:20:14 crc kubenswrapper[4778]: E0318 09:20:14.494328 4778 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 09:20:14 crc kubenswrapper[4778]: E0318 09:20:14.494434 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs podName:3c7e3158-5139-467d-b33c-808747f0d9be nodeName:}" failed. No retries permitted until 2026-03-18 09:20:22.494411166 +0000 UTC m=+1089.069156006 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs") pod "openstack-operator-controller-manager-f5c7df4d7-m4kvr" (UID: "3c7e3158-5139-467d-b33c-808747f0d9be") : secret "metrics-server-cert" not found Mar 18 09:20:14 crc kubenswrapper[4778]: I0318 09:20:14.494070 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:19 crc kubenswrapper[4778]: I0318 09:20:19.990948 4778 scope.go:117] "RemoveContainer" containerID="623e10ff390eb7e19703ecca951ddfb9b57c997906e5eae908a2c7aedc17d0d1" Mar 18 09:20:20 crc kubenswrapper[4778]: E0318 09:20:20.569978 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8" Mar 18 09:20:20 crc kubenswrapper[4778]: E0318 09:20:20.570521 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-js847,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6f787dddc9-fjjvl_openstack-operators(3c86f76c-1617-45e9-9573-f6fd51803b45): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:20:20 crc kubenswrapper[4778]: E0318 09:20:20.571773 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl" podUID="3c86f76c-1617-45e9-9573-f6fd51803b45" Mar 18 09:20:21 crc kubenswrapper[4778]: E0318 09:20:21.059054 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl" podUID="3c86f76c-1617-45e9-9573-f6fd51803b45" Mar 18 09:20:21 crc kubenswrapper[4778]: E0318 09:20:21.236441 4778 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a" Mar 18 09:20:21 crc kubenswrapper[4778]: E0318 09:20:21.236652 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-brz8f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-767865f676-k4r2p_openstack-operators(ae690990-eeb1-4871-8c51-dd3b547e1193): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:20:21 crc kubenswrapper[4778]: E0318 09:20:21.238691 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p" podUID="ae690990-eeb1-4871-8c51-dd3b547e1193" Mar 18 09:20:21 crc kubenswrapper[4778]: I0318 09:20:21.412349 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert\") pod \"infra-operator-controller-manager-7b9c774f96-64c4x\" (UID: \"66d3bf3a-086c-4340-ba73-209f526fc33c\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:21 crc kubenswrapper[4778]: I0318 09:20:21.418087 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/66d3bf3a-086c-4340-ba73-209f526fc33c-cert\") pod \"infra-operator-controller-manager-7b9c774f96-64c4x\" (UID: \"66d3bf3a-086c-4340-ba73-209f526fc33c\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:21 crc kubenswrapper[4778]: I0318 09:20:21.710957 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-dpc7h" Mar 18 09:20:21 crc kubenswrapper[4778]: I0318 09:20:21.719889 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:22 crc kubenswrapper[4778]: I0318 09:20:22.021266 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-xdgmv\" (UID: \"80822932-2943-4f81-9436-1553ed031359\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:22 crc kubenswrapper[4778]: I0318 09:20:22.026143 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80822932-2943-4f81-9436-1553ed031359-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-xdgmv\" (UID: \"80822932-2943-4f81-9436-1553ed031359\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:22 crc kubenswrapper[4778]: E0318 09:20:22.065488 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p" 
podUID="ae690990-eeb1-4871-8c51-dd3b547e1193" Mar 18 09:20:22 crc kubenswrapper[4778]: I0318 09:20:22.117546 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-gd4mk" Mar 18 09:20:22 crc kubenswrapper[4778]: I0318 09:20:22.125942 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:22 crc kubenswrapper[4778]: E0318 09:20:22.388010 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a" Mar 18 09:20:22 crc kubenswrapper[4778]: E0318 09:20:22.388278 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nhg2s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-h6whs_openstack-operators(e245908e-e35e-403c-93f6-48371904ae42): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:20:22 crc kubenswrapper[4778]: E0318 09:20:22.390514 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs" podUID="e245908e-e35e-403c-93f6-48371904ae42" Mar 18 09:20:22 crc kubenswrapper[4778]: I0318 09:20:22.529459 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:22 crc kubenswrapper[4778]: I0318 09:20:22.529813 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:22 crc kubenswrapper[4778]: I0318 09:20:22.534667 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-webhook-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:22 crc kubenswrapper[4778]: I0318 09:20:22.537881 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c7e3158-5139-467d-b33c-808747f0d9be-metrics-certs\") pod \"openstack-operator-controller-manager-f5c7df4d7-m4kvr\" (UID: \"3c7e3158-5139-467d-b33c-808747f0d9be\") " pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:22 crc kubenswrapper[4778]: I0318 09:20:22.760138 4778 reflector.go:368] Caches populated for *v1.Secret 
from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-26qtx" Mar 18 09:20:22 crc kubenswrapper[4778]: I0318 09:20:22.769400 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:23 crc kubenswrapper[4778]: E0318 09:20:23.074437 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs" podUID="e245908e-e35e-403c-93f6-48371904ae42" Mar 18 09:20:24 crc kubenswrapper[4778]: E0318 09:20:24.064906 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56" Mar 18 09:20:24 crc kubenswrapper[4778]: E0318 09:20:24.065178 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hl9j2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-5xvtc_openstack-operators(e1ec7bae-8e15-4844-84d2-ff5951d0be31): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:20:24 crc kubenswrapper[4778]: E0318 09:20:24.066406 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc" podUID="e1ec7bae-8e15-4844-84d2-ff5951d0be31" Mar 18 09:20:24 crc kubenswrapper[4778]: E0318 09:20:24.079372 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc" podUID="e1ec7bae-8e15-4844-84d2-ff5951d0be31" Mar 18 09:20:24 crc kubenswrapper[4778]: E0318 09:20:24.468508 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 18 09:20:24 crc kubenswrapper[4778]: E0318 09:20:24.468722 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: 
{{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g8qp9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-5jrv8_openstack-operators(b837636e-8c09-42b7-9a81-e7875df68344): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:20:24 crc kubenswrapper[4778]: E0318 09:20:24.469975 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8" podUID="b837636e-8c09-42b7-9a81-e7875df68344" Mar 18 09:20:25 crc kubenswrapper[4778]: E0318 09:20:25.087337 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8" 
podUID="b837636e-8c09-42b7-9a81-e7875df68344" Mar 18 09:20:26 crc kubenswrapper[4778]: I0318 09:20:26.106265 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc" event={"ID":"0526f654-9ddc-4495-bb04-be13e53b6a1b","Type":"ContainerStarted","Data":"10bf04d12f3856f94bc165a7fd6d46930e83ea30fa0da8c1e312fe336a1fe54b"} Mar 18 09:20:26 crc kubenswrapper[4778]: I0318 09:20:26.107488 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc" Mar 18 09:20:26 crc kubenswrapper[4778]: I0318 09:20:26.111742 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpc92" event={"ID":"211c991a-9406-4360-aa7f-830be3aa55db","Type":"ContainerStarted","Data":"bf98bbdf61a4bc691bfbbddd3b880e6bd38fb98138fdb140997663f96b0910b3"} Mar 18 09:20:26 crc kubenswrapper[4778]: I0318 09:20:26.112027 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpc92" Mar 18 09:20:26 crc kubenswrapper[4778]: I0318 09:20:26.125284 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc" podStartSLOduration=4.25927371 podStartE2EDuration="21.125268154s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:06.564279955 +0000 UTC m=+1073.139024795" lastFinishedPulling="2026-03-18 09:20:23.430274389 +0000 UTC m=+1090.005019239" observedRunningTime="2026-03-18 09:20:26.1243838 +0000 UTC m=+1092.699128640" watchObservedRunningTime="2026-03-18 09:20:26.125268154 +0000 UTC m=+1092.700012994" Mar 18 09:20:26 crc kubenswrapper[4778]: I0318 09:20:26.149367 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpc92" podStartSLOduration=5.374169649 podStartE2EDuration="21.149340603s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:07.655069604 +0000 UTC m=+1074.229814444" lastFinishedPulling="2026-03-18 09:20:23.430240558 +0000 UTC m=+1090.004985398" observedRunningTime="2026-03-18 09:20:26.142025813 +0000 UTC m=+1092.716770653" watchObservedRunningTime="2026-03-18 09:20:26.149340603 +0000 UTC m=+1092.724085463" Mar 18 09:20:26 crc kubenswrapper[4778]: I0318 09:20:26.198582 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv"] Mar 18 09:20:26 crc kubenswrapper[4778]: I0318 09:20:26.204731 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x"] Mar 18 09:20:26 crc kubenswrapper[4778]: W0318 09:20:26.221251 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66d3bf3a_086c_4340_ba73_209f526fc33c.slice/crio-cb6071b57d3c52aa80cda85c852323bf1b1a367f19786fc37b5f27315ce3b67c WatchSource:0}: Error finding container cb6071b57d3c52aa80cda85c852323bf1b1a367f19786fc37b5f27315ce3b67c: Status 404 returned error can't find the container with id cb6071b57d3c52aa80cda85c852323bf1b1a367f19786fc37b5f27315ce3b67c Mar 18 09:20:26 crc kubenswrapper[4778]: I0318 09:20:26.315389 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr"] Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.131056 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9" 
event={"ID":"208b26f2-3c91-4966-9d01-8fe73e4a7d87","Type":"ContainerStarted","Data":"7a45d06efb24496c3b891685b78f23ada48de52d50c828ce049b9e7165130516"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.132726 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.140007 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" event={"ID":"66d3bf3a-086c-4340-ba73-209f526fc33c","Type":"ContainerStarted","Data":"cb6071b57d3c52aa80cda85c852323bf1b1a367f19786fc37b5f27315ce3b67c"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.156076 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" event={"ID":"8ccabb3b-da59-4ab0-89c8-99094a939f0d","Type":"ContainerStarted","Data":"678ceaef4518db718dfe9de9d0268dda8a53fdd582caeb325e04a0a58249c4fb"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.157362 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.171290 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" event={"ID":"99adb6be-2a3e-4148-8074-9258222ebd60","Type":"ContainerStarted","Data":"42833b94d7023c6cf9fde2bddf9b46176c05e8b6acb5cf31fc6ccbde134e302d"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.171804 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9" podStartSLOduration=5.317619714 podStartE2EDuration="22.171770772s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:07.601369946 +0000 UTC 
m=+1074.176114776" lastFinishedPulling="2026-03-18 09:20:24.455520994 +0000 UTC m=+1091.030265834" observedRunningTime="2026-03-18 09:20:27.160649608 +0000 UTC m=+1093.735394448" watchObservedRunningTime="2026-03-18 09:20:27.171770772 +0000 UTC m=+1093.746515622" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.171924 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.191477 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q" event={"ID":"2f8e8860-00a1-43fc-9776-c617f270cc50","Type":"ContainerStarted","Data":"f79ab034e094322537d45810b2420051cb2dde668bd2d4935e9dc51eb4f1d22f"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.192002 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.208861 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" podStartSLOduration=4.645056499 podStartE2EDuration="22.208840865s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:08.137003152 +0000 UTC m=+1074.711747992" lastFinishedPulling="2026-03-18 09:20:25.700787488 +0000 UTC m=+1092.275532358" observedRunningTime="2026-03-18 09:20:27.195314625 +0000 UTC m=+1093.770059495" watchObservedRunningTime="2026-03-18 09:20:27.208840865 +0000 UTC m=+1093.783585705" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.231997 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q" podStartSLOduration=5.839454491 podStartE2EDuration="22.231970307s" 
podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:08.078556354 +0000 UTC m=+1074.653301184" lastFinishedPulling="2026-03-18 09:20:24.47107213 +0000 UTC m=+1091.045817000" observedRunningTime="2026-03-18 09:20:27.229566132 +0000 UTC m=+1093.804310982" watchObservedRunningTime="2026-03-18 09:20:27.231970307 +0000 UTC m=+1093.806715157" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.234253 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt" event={"ID":"c776af1e-ad54-40fe-9bed-a0a09ce0eea7","Type":"ContainerStarted","Data":"c03f2e759ed3cbf30fc2fa8a252e5497da1d91b431f59aa059d1958afe02ef1a"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.234922 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.246921 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp" event={"ID":"124dc549-cb2a-4b1c-a610-093cf9b8c05d","Type":"ContainerStarted","Data":"7c873f952c4bff30d1419439579f8ff651e5b4f9ff778ec6e1ae794557fe1526"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.247876 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.252232 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc" event={"ID":"37675366-70a8-4e0b-b92b-f7055547d918","Type":"ContainerStarted","Data":"27fa6dfa89e1761c5b611170d6c156f6819bf15f781bd8c4a987cb3a232b83b0"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.252949 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.265279 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49" event={"ID":"57277339-c9be-4de1-8e35-72ae98d33905","Type":"ContainerStarted","Data":"1a7359941866285e9344a5e7853a9371e2f69381f723ff5130bbfff9e657184d"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.265317 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.270933 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" podStartSLOduration=4.417172707 podStartE2EDuration="22.270916512s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:08.113395636 +0000 UTC m=+1074.688140476" lastFinishedPulling="2026-03-18 09:20:25.967139441 +0000 UTC m=+1092.541884281" observedRunningTime="2026-03-18 09:20:27.27007373 +0000 UTC m=+1093.844818580" watchObservedRunningTime="2026-03-18 09:20:27.270916512 +0000 UTC m=+1093.845661352" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.273634 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" event={"ID":"3c7e3158-5139-467d-b33c-808747f0d9be","Type":"ContainerStarted","Data":"9b2f692199c8fce2bd61b2d59ec71db9b2cf3dda9463e173099bb31fd1f733e1"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.273670 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" event={"ID":"3c7e3158-5139-467d-b33c-808747f0d9be","Type":"ContainerStarted","Data":"055f1a64f06bc32c86f16846259baa73a25844ecbe122df218cea6767810641f"} Mar 
18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.275403 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.285345 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2" event={"ID":"710ababb-0bee-441d-8dd0-e6a72ea2b2e3","Type":"ContainerStarted","Data":"66b7dc36df17775e00bc5d8587cda6ec4a0e1a6948129872c69dcc4efdf2c3d5"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.285731 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.287150 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" event={"ID":"80822932-2943-4f81-9436-1553ed031359","Type":"ContainerStarted","Data":"5fb40edc8c76a7399e800953259f175cf2429abe682dff519cc6ca4906398e7b"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.302520 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc" event={"ID":"b41dbd4a-33dd-4dca-9356-34c740e8063f","Type":"ContainerStarted","Data":"ec124d7e4c28ea2b93a76b9518ca4879cc506d3a3dd8e671b0b2b0dda809e9bf"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.302753 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.315035 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt" 
event={"ID":"3390909b-6271-40dd-9662-0710f6866143","Type":"ContainerStarted","Data":"4933e2c29e48f2445c253a18602ca70a7c13376e97dd6eb9fa8b1c5059569f9e"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.315674 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.323136 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt" podStartSLOduration=5.955624559 podStartE2EDuration="22.32310058s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:08.089694409 +0000 UTC m=+1074.664439259" lastFinishedPulling="2026-03-18 09:20:24.45717041 +0000 UTC m=+1091.031915280" observedRunningTime="2026-03-18 09:20:27.307910574 +0000 UTC m=+1093.882655414" watchObservedRunningTime="2026-03-18 09:20:27.32310058 +0000 UTC m=+1093.897845430" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.337184 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w" event={"ID":"aceb2f7b-585f-451a-83b8-e673965ada87","Type":"ContainerStarted","Data":"9fd491aa6d7d1a94922fc9bacf35404b80a5b1f87c84d423d42b5e68bfe9d905"} Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.337241 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.355064 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc" podStartSLOduration=5.574947249 podStartE2EDuration="22.355047973s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:07.675646256 +0000 UTC m=+1074.250391096" 
lastFinishedPulling="2026-03-18 09:20:24.45574695 +0000 UTC m=+1091.030491820" observedRunningTime="2026-03-18 09:20:27.350656453 +0000 UTC m=+1093.925401293" watchObservedRunningTime="2026-03-18 09:20:27.355047973 +0000 UTC m=+1093.929792813" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.386460 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp" podStartSLOduration=5.553530612 podStartE2EDuration="22.386445471s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:07.622334448 +0000 UTC m=+1074.197079288" lastFinishedPulling="2026-03-18 09:20:24.455249307 +0000 UTC m=+1091.029994147" observedRunningTime="2026-03-18 09:20:27.381877757 +0000 UTC m=+1093.956622597" watchObservedRunningTime="2026-03-18 09:20:27.386445471 +0000 UTC m=+1093.961190311" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.417110 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49" podStartSLOduration=6.011904237 podStartE2EDuration="22.417091039s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:08.101719397 +0000 UTC m=+1074.676464237" lastFinishedPulling="2026-03-18 09:20:24.506906209 +0000 UTC m=+1091.081651039" observedRunningTime="2026-03-18 09:20:27.413232224 +0000 UTC m=+1093.987977074" watchObservedRunningTime="2026-03-18 09:20:27.417091039 +0000 UTC m=+1093.991835879" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.441557 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w" podStartSLOduration=5.146831552 podStartE2EDuration="22.441525668s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:07.160663324 +0000 UTC m=+1073.735408164" 
lastFinishedPulling="2026-03-18 09:20:24.45535743 +0000 UTC m=+1091.030102280" observedRunningTime="2026-03-18 09:20:27.438405232 +0000 UTC m=+1094.013150102" watchObservedRunningTime="2026-03-18 09:20:27.441525668 +0000 UTC m=+1094.016270508" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.471567 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc" podStartSLOduration=6.106378612 podStartE2EDuration="22.471547039s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:07.67576792 +0000 UTC m=+1074.250512760" lastFinishedPulling="2026-03-18 09:20:24.040936347 +0000 UTC m=+1090.615681187" observedRunningTime="2026-03-18 09:20:27.467525078 +0000 UTC m=+1094.042269928" watchObservedRunningTime="2026-03-18 09:20:27.471547039 +0000 UTC m=+1094.046291879" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.512132 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2" podStartSLOduration=5.258149227 podStartE2EDuration="22.512111108s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:07.202671843 +0000 UTC m=+1073.777416683" lastFinishedPulling="2026-03-18 09:20:24.456633724 +0000 UTC m=+1091.031378564" observedRunningTime="2026-03-18 09:20:27.499335149 +0000 UTC m=+1094.074079999" watchObservedRunningTime="2026-03-18 09:20:27.512111108 +0000 UTC m=+1094.086855948" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.535513 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt" podStartSLOduration=5.143072519 podStartE2EDuration="22.535492097s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:07.063258361 +0000 UTC m=+1073.638003201" 
lastFinishedPulling="2026-03-18 09:20:24.455677939 +0000 UTC m=+1091.030422779" observedRunningTime="2026-03-18 09:20:27.532457144 +0000 UTC m=+1094.107201994" watchObservedRunningTime="2026-03-18 09:20:27.535492097 +0000 UTC m=+1094.110236937" Mar 18 09:20:27 crc kubenswrapper[4778]: I0318 09:20:27.578002 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" podStartSLOduration=21.577986969 podStartE2EDuration="21.577986969s" podCreationTimestamp="2026-03-18 09:20:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:20:27.572960462 +0000 UTC m=+1094.147705312" watchObservedRunningTime="2026-03-18 09:20:27.577986969 +0000 UTC m=+1094.152731809" Mar 18 09:20:31 crc kubenswrapper[4778]: I0318 09:20:31.374719 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" event={"ID":"9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77","Type":"ContainerStarted","Data":"dce6359d66344d37b58bb6fff39a43da4941a5d8668915d38bbb48dcbe686103"} Mar 18 09:20:31 crc kubenswrapper[4778]: I0318 09:20:31.375558 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" Mar 18 09:20:31 crc kubenswrapper[4778]: I0318 09:20:31.376698 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" event={"ID":"66d3bf3a-086c-4340-ba73-209f526fc33c","Type":"ContainerStarted","Data":"3242e3b292a8b48b95f430fe27b293a4ff6d9ed3fa810741531fceee5082a9b4"} Mar 18 09:20:31 crc kubenswrapper[4778]: I0318 09:20:31.376850 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:31 crc 
kubenswrapper[4778]: I0318 09:20:31.378880 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" event={"ID":"80822932-2943-4f81-9436-1553ed031359","Type":"ContainerStarted","Data":"6f8410dee7c497f7a0937afe00f1c0314e7925466d7664ef8e7c6c42c9a1005f"} Mar 18 09:20:31 crc kubenswrapper[4778]: I0318 09:20:31.379135 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:31 crc kubenswrapper[4778]: I0318 09:20:31.397990 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" podStartSLOduration=3.674537561 podStartE2EDuration="26.397947786s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:08.170654873 +0000 UTC m=+1074.745399713" lastFinishedPulling="2026-03-18 09:20:30.894065098 +0000 UTC m=+1097.468809938" observedRunningTime="2026-03-18 09:20:31.395680544 +0000 UTC m=+1097.970425424" watchObservedRunningTime="2026-03-18 09:20:31.397947786 +0000 UTC m=+1097.972692666" Mar 18 09:20:31 crc kubenswrapper[4778]: I0318 09:20:31.424229 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" podStartSLOduration=21.731163253 podStartE2EDuration="26.424178034s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:26.227620564 +0000 UTC m=+1092.802365404" lastFinishedPulling="2026-03-18 09:20:30.920635315 +0000 UTC m=+1097.495380185" observedRunningTime="2026-03-18 09:20:31.422055626 +0000 UTC m=+1097.996800496" watchObservedRunningTime="2026-03-18 09:20:31.424178034 +0000 UTC m=+1097.998922894" Mar 18 09:20:31 crc kubenswrapper[4778]: I0318 09:20:31.460553 4778 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" podStartSLOduration=21.769118062 podStartE2EDuration="26.460524068s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:26.230439231 +0000 UTC m=+1092.805184071" lastFinishedPulling="2026-03-18 09:20:30.921845217 +0000 UTC m=+1097.496590077" observedRunningTime="2026-03-18 09:20:31.449709022 +0000 UTC m=+1098.024453872" watchObservedRunningTime="2026-03-18 09:20:31.460524068 +0000 UTC m=+1098.035268918" Mar 18 09:20:32 crc kubenswrapper[4778]: I0318 09:20:32.777257 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-f5c7df4d7-m4kvr" Mar 18 09:20:35 crc kubenswrapper[4778]: I0318 09:20:35.695723 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-fsxlt" Mar 18 09:20:35 crc kubenswrapper[4778]: I0318 09:20:35.716103 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wxftc" Mar 18 09:20:35 crc kubenswrapper[4778]: I0318 09:20:35.750732 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-7mbx2" Mar 18 09:20:35 crc kubenswrapper[4778]: I0318 09:20:35.779753 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-t5c4w" Mar 18 09:20:35 crc kubenswrapper[4778]: I0318 09:20:35.784884 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wb4pc" Mar 18 09:20:35 crc kubenswrapper[4778]: I0318 09:20:35.803904 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x7rnp" Mar 18 09:20:35 crc kubenswrapper[4778]: I0318 09:20:35.996620 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-47sbc" Mar 18 09:20:36 crc kubenswrapper[4778]: I0318 09:20:36.010404 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpc92" Mar 18 09:20:36 crc kubenswrapper[4778]: I0318 09:20:36.282056 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9" Mar 18 09:20:36 crc kubenswrapper[4778]: I0318 09:20:36.485761 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pzjdt" Mar 18 09:20:36 crc kubenswrapper[4778]: I0318 09:20:36.667000 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d5w9q" Mar 18 09:20:36 crc kubenswrapper[4778]: I0318 09:20:36.716529 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-c6l5k" Mar 18 09:20:36 crc kubenswrapper[4778]: I0318 09:20:36.747369 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-tx9zq" Mar 18 09:20:36 crc kubenswrapper[4778]: I0318 09:20:36.787046 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-54c5f5bc8-jsm76" Mar 18 09:20:36 crc kubenswrapper[4778]: I0318 09:20:36.827935 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sgs49" Mar 18 09:20:41 crc kubenswrapper[4778]: I0318 09:20:41.730165 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-64c4x" Mar 18 09:20:42 crc kubenswrapper[4778]: I0318 09:20:42.136814 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-xdgmv" Mar 18 09:20:43 crc kubenswrapper[4778]: I0318 09:20:43.483988 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs" event={"ID":"e245908e-e35e-403c-93f6-48371904ae42","Type":"ContainerStarted","Data":"288761a7e4ffe40fe1649b5ee737a2e3be3b06ba9708ec90175fbe22bdcb1bb7"} Mar 18 09:20:43 crc kubenswrapper[4778]: I0318 09:20:43.484786 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs" Mar 18 09:20:43 crc kubenswrapper[4778]: I0318 09:20:43.521018 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs" podStartSLOduration=4.403711626 podStartE2EDuration="38.520976609s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:08.309668434 +0000 UTC m=+1074.884413274" lastFinishedPulling="2026-03-18 09:20:42.426933387 +0000 UTC m=+1109.001678257" observedRunningTime="2026-03-18 09:20:43.518886832 +0000 UTC m=+1110.093631692" watchObservedRunningTime="2026-03-18 09:20:43.520976609 +0000 UTC m=+1110.095721489" Mar 18 09:20:45 crc kubenswrapper[4778]: I0318 09:20:45.514716 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8" 
event={"ID":"b837636e-8c09-42b7-9a81-e7875df68344","Type":"ContainerStarted","Data":"c7761531b0126c6fab9efbb77ef73a19be290edf5217e7c0567d44c31a1ec345"} Mar 18 09:20:45 crc kubenswrapper[4778]: I0318 09:20:45.518572 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl" event={"ID":"3c86f76c-1617-45e9-9573-f6fd51803b45","Type":"ContainerStarted","Data":"3588c7cb07a2b0d12d1d41bfb28021419fd61be8c63ed51a50e5827b5369e309"} Mar 18 09:20:45 crc kubenswrapper[4778]: I0318 09:20:45.519622 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl" Mar 18 09:20:45 crc kubenswrapper[4778]: I0318 09:20:45.521805 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p" event={"ID":"ae690990-eeb1-4871-8c51-dd3b547e1193","Type":"ContainerStarted","Data":"884302ecd2a0d3fb98a1f175a3d1326060ba1dd8333398b9adbdd0e566bd3d2d"} Mar 18 09:20:45 crc kubenswrapper[4778]: I0318 09:20:45.522013 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p" Mar 18 09:20:45 crc kubenswrapper[4778]: I0318 09:20:45.524702 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc" event={"ID":"e1ec7bae-8e15-4844-84d2-ff5951d0be31","Type":"ContainerStarted","Data":"06333c67b257e873769a57d0d5c9681a935d00536f9890a79c7f46156b018f50"} Mar 18 09:20:45 crc kubenswrapper[4778]: I0318 09:20:45.524997 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc" Mar 18 09:20:45 crc kubenswrapper[4778]: I0318 09:20:45.548824 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jrv8" podStartSLOduration=2.736765689 podStartE2EDuration="39.548794492s" podCreationTimestamp="2026-03-18 09:20:06 +0000 UTC" firstStartedPulling="2026-03-18 09:20:08.078574385 +0000 UTC m=+1074.653319235" lastFinishedPulling="2026-03-18 09:20:44.890603198 +0000 UTC m=+1111.465348038" observedRunningTime="2026-03-18 09:20:45.534822231 +0000 UTC m=+1112.109567151" watchObservedRunningTime="2026-03-18 09:20:45.548794492 +0000 UTC m=+1112.123539362" Mar 18 09:20:45 crc kubenswrapper[4778]: I0318 09:20:45.561908 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc" podStartSLOduration=4.189084211 podStartE2EDuration="40.561878978s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:08.010809312 +0000 UTC m=+1074.585554152" lastFinishedPulling="2026-03-18 09:20:44.383604079 +0000 UTC m=+1110.958348919" observedRunningTime="2026-03-18 09:20:45.558385083 +0000 UTC m=+1112.133129993" watchObservedRunningTime="2026-03-18 09:20:45.561878978 +0000 UTC m=+1112.136623818" Mar 18 09:20:45 crc kubenswrapper[4778]: I0318 09:20:45.578389 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p" podStartSLOduration=3.3602219939999998 podStartE2EDuration="40.578356818s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:07.163564783 +0000 UTC m=+1073.738309623" lastFinishedPulling="2026-03-18 09:20:44.381699597 +0000 UTC m=+1110.956444447" observedRunningTime="2026-03-18 09:20:45.575095709 +0000 UTC m=+1112.149840559" watchObservedRunningTime="2026-03-18 09:20:45.578356818 +0000 UTC m=+1112.153101698" Mar 18 09:20:45 crc kubenswrapper[4778]: I0318 09:20:45.599147 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl" podStartSLOduration=3.851720334 podStartE2EDuration="40.599116745s" podCreationTimestamp="2026-03-18 09:20:05 +0000 UTC" firstStartedPulling="2026-03-18 09:20:07.637001809 +0000 UTC m=+1074.211746649" lastFinishedPulling="2026-03-18 09:20:44.38439819 +0000 UTC m=+1110.959143060" observedRunningTime="2026-03-18 09:20:45.58982054 +0000 UTC m=+1112.164565440" watchObservedRunningTime="2026-03-18 09:20:45.599116745 +0000 UTC m=+1112.173861625" Mar 18 09:20:55 crc kubenswrapper[4778]: I0318 09:20:55.843329 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fjjvl" Mar 18 09:20:55 crc kubenswrapper[4778]: I0318 09:20:55.872647 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-5xvtc" Mar 18 09:20:56 crc kubenswrapper[4778]: I0318 09:20:56.026046 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-k4r2p" Mar 18 09:20:56 crc kubenswrapper[4778]: I0318 09:20:56.460820 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-h6whs" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.366060 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-csxwm"] Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.373513 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.375802 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-7szsm" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.376240 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.376654 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.376902 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.393337 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-csxwm"] Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.460728 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wbpnt"] Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.461830 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.467537 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.502587 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wbpnt"] Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.517803 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef690ca0-3568-4334-bddc-956b11424d40-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-wbpnt\" (UID: \"ef690ca0-3568-4334-bddc-956b11424d40\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.518127 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef690ca0-3568-4334-bddc-956b11424d40-config\") pod \"dnsmasq-dns-78dd6ddcc-wbpnt\" (UID: \"ef690ca0-3568-4334-bddc-956b11424d40\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.518371 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n4kw\" (UniqueName: \"kubernetes.io/projected/4c0c899c-e724-4486-bfba-42c7f089cfa7-kube-api-access-5n4kw\") pod \"dnsmasq-dns-675f4bcbfc-csxwm\" (UID: \"4c0c899c-e724-4486-bfba-42c7f089cfa7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.518540 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c0c899c-e724-4486-bfba-42c7f089cfa7-config\") pod \"dnsmasq-dns-675f4bcbfc-csxwm\" (UID: \"4c0c899c-e724-4486-bfba-42c7f089cfa7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" Mar 18 09:21:15 
crc kubenswrapper[4778]: I0318 09:21:15.518695 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvcvh\" (UniqueName: \"kubernetes.io/projected/ef690ca0-3568-4334-bddc-956b11424d40-kube-api-access-fvcvh\") pod \"dnsmasq-dns-78dd6ddcc-wbpnt\" (UID: \"ef690ca0-3568-4334-bddc-956b11424d40\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.620622 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef690ca0-3568-4334-bddc-956b11424d40-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-wbpnt\" (UID: \"ef690ca0-3568-4334-bddc-956b11424d40\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.620956 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef690ca0-3568-4334-bddc-956b11424d40-config\") pod \"dnsmasq-dns-78dd6ddcc-wbpnt\" (UID: \"ef690ca0-3568-4334-bddc-956b11424d40\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.621100 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n4kw\" (UniqueName: \"kubernetes.io/projected/4c0c899c-e724-4486-bfba-42c7f089cfa7-kube-api-access-5n4kw\") pod \"dnsmasq-dns-675f4bcbfc-csxwm\" (UID: \"4c0c899c-e724-4486-bfba-42c7f089cfa7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.621374 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c0c899c-e724-4486-bfba-42c7f089cfa7-config\") pod \"dnsmasq-dns-675f4bcbfc-csxwm\" (UID: \"4c0c899c-e724-4486-bfba-42c7f089cfa7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.622383 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef690ca0-3568-4334-bddc-956b11424d40-config\") pod \"dnsmasq-dns-78dd6ddcc-wbpnt\" (UID: \"ef690ca0-3568-4334-bddc-956b11424d40\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.622365 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c0c899c-e724-4486-bfba-42c7f089cfa7-config\") pod \"dnsmasq-dns-675f4bcbfc-csxwm\" (UID: \"4c0c899c-e724-4486-bfba-42c7f089cfa7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.621941 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef690ca0-3568-4334-bddc-956b11424d40-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-wbpnt\" (UID: \"ef690ca0-3568-4334-bddc-956b11424d40\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.622640 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvcvh\" (UniqueName: \"kubernetes.io/projected/ef690ca0-3568-4334-bddc-956b11424d40-kube-api-access-fvcvh\") pod \"dnsmasq-dns-78dd6ddcc-wbpnt\" (UID: \"ef690ca0-3568-4334-bddc-956b11424d40\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.657457 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n4kw\" (UniqueName: \"kubernetes.io/projected/4c0c899c-e724-4486-bfba-42c7f089cfa7-kube-api-access-5n4kw\") pod \"dnsmasq-dns-675f4bcbfc-csxwm\" (UID: \"4c0c899c-e724-4486-bfba-42c7f089cfa7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.658968 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvcvh\" (UniqueName: 
\"kubernetes.io/projected/ef690ca0-3568-4334-bddc-956b11424d40-kube-api-access-fvcvh\") pod \"dnsmasq-dns-78dd6ddcc-wbpnt\" (UID: \"ef690ca0-3568-4334-bddc-956b11424d40\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.693318 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" Mar 18 09:21:15 crc kubenswrapper[4778]: I0318 09:21:15.801219 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:16 crc kubenswrapper[4778]: I0318 09:21:16.023272 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wbpnt"] Mar 18 09:21:16 crc kubenswrapper[4778]: W0318 09:21:16.028590 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef690ca0_3568_4334_bddc_956b11424d40.slice/crio-56d81e5e06ecb4723728e490b7892ca9d5f51a511bc3931f818bfe5aaf2487d9 WatchSource:0}: Error finding container 56d81e5e06ecb4723728e490b7892ca9d5f51a511bc3931f818bfe5aaf2487d9: Status 404 returned error can't find the container with id 56d81e5e06ecb4723728e490b7892ca9d5f51a511bc3931f818bfe5aaf2487d9 Mar 18 09:21:16 crc kubenswrapper[4778]: I0318 09:21:16.131419 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-csxwm"] Mar 18 09:21:16 crc kubenswrapper[4778]: W0318 09:21:16.136161 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c0c899c_e724_4486_bfba_42c7f089cfa7.slice/crio-4c2bd12d0d178885863f2b67ec45e234d90e570362b4cc7fec292b577d9c4441 WatchSource:0}: Error finding container 4c2bd12d0d178885863f2b67ec45e234d90e570362b4cc7fec292b577d9c4441: Status 404 returned error can't find the container with id 4c2bd12d0d178885863f2b67ec45e234d90e570362b4cc7fec292b577d9c4441 Mar 
18 09:21:16 crc kubenswrapper[4778]: I0318 09:21:16.808265 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" event={"ID":"ef690ca0-3568-4334-bddc-956b11424d40","Type":"ContainerStarted","Data":"56d81e5e06ecb4723728e490b7892ca9d5f51a511bc3931f818bfe5aaf2487d9"} Mar 18 09:21:16 crc kubenswrapper[4778]: I0318 09:21:16.810857 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" event={"ID":"4c0c899c-e724-4486-bfba-42c7f089cfa7","Type":"ContainerStarted","Data":"4c2bd12d0d178885863f2b67ec45e234d90e570362b4cc7fec292b577d9c4441"} Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.211997 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-csxwm"] Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.244446 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-ffznk"] Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.245950 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.255333 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-ffznk"] Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.375112 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b849baae-7043-48dd-be08-0edde88c7c69-config\") pod \"dnsmasq-dns-5ccc8479f9-ffznk\" (UID: \"b849baae-7043-48dd-be08-0edde88c7c69\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.375175 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b849baae-7043-48dd-be08-0edde88c7c69-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-ffznk\" (UID: \"b849baae-7043-48dd-be08-0edde88c7c69\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.375248 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7prc\" (UniqueName: \"kubernetes.io/projected/b849baae-7043-48dd-be08-0edde88c7c69-kube-api-access-v7prc\") pod \"dnsmasq-dns-5ccc8479f9-ffznk\" (UID: \"b849baae-7043-48dd-be08-0edde88c7c69\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.477283 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7prc\" (UniqueName: \"kubernetes.io/projected/b849baae-7043-48dd-be08-0edde88c7c69-kube-api-access-v7prc\") pod \"dnsmasq-dns-5ccc8479f9-ffznk\" (UID: \"b849baae-7043-48dd-be08-0edde88c7c69\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.477412 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/b849baae-7043-48dd-be08-0edde88c7c69-config\") pod \"dnsmasq-dns-5ccc8479f9-ffznk\" (UID: \"b849baae-7043-48dd-be08-0edde88c7c69\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.477438 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b849baae-7043-48dd-be08-0edde88c7c69-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-ffznk\" (UID: \"b849baae-7043-48dd-be08-0edde88c7c69\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.478432 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b849baae-7043-48dd-be08-0edde88c7c69-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-ffznk\" (UID: \"b849baae-7043-48dd-be08-0edde88c7c69\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.478742 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b849baae-7043-48dd-be08-0edde88c7c69-config\") pod \"dnsmasq-dns-5ccc8479f9-ffznk\" (UID: \"b849baae-7043-48dd-be08-0edde88c7c69\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.519018 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7prc\" (UniqueName: \"kubernetes.io/projected/b849baae-7043-48dd-be08-0edde88c7c69-kube-api-access-v7prc\") pod \"dnsmasq-dns-5ccc8479f9-ffznk\" (UID: \"b849baae-7043-48dd-be08-0edde88c7c69\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.539098 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wbpnt"] Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.560737 4778 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-57d769cc4f-xvr5c"] Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.566284 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.605221 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.631744 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xvr5c"] Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.709531 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/124c0069-debd-459c-9d66-f38d9d096996-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-xvr5c\" (UID: \"124c0069-debd-459c-9d66-f38d9d096996\") " pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.709582 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/124c0069-debd-459c-9d66-f38d9d096996-config\") pod \"dnsmasq-dns-57d769cc4f-xvr5c\" (UID: \"124c0069-debd-459c-9d66-f38d9d096996\") " pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.709854 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chwdv\" (UniqueName: \"kubernetes.io/projected/124c0069-debd-459c-9d66-f38d9d096996-kube-api-access-chwdv\") pod \"dnsmasq-dns-57d769cc4f-xvr5c\" (UID: \"124c0069-debd-459c-9d66-f38d9d096996\") " pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.811520 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chwdv\" (UniqueName: 
\"kubernetes.io/projected/124c0069-debd-459c-9d66-f38d9d096996-kube-api-access-chwdv\") pod \"dnsmasq-dns-57d769cc4f-xvr5c\" (UID: \"124c0069-debd-459c-9d66-f38d9d096996\") " pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.811584 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/124c0069-debd-459c-9d66-f38d9d096996-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-xvr5c\" (UID: \"124c0069-debd-459c-9d66-f38d9d096996\") " pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.811612 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/124c0069-debd-459c-9d66-f38d9d096996-config\") pod \"dnsmasq-dns-57d769cc4f-xvr5c\" (UID: \"124c0069-debd-459c-9d66-f38d9d096996\") " pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.812795 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/124c0069-debd-459c-9d66-f38d9d096996-config\") pod \"dnsmasq-dns-57d769cc4f-xvr5c\" (UID: \"124c0069-debd-459c-9d66-f38d9d096996\") " pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.813535 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/124c0069-debd-459c-9d66-f38d9d096996-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-xvr5c\" (UID: \"124c0069-debd-459c-9d66-f38d9d096996\") " pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.838391 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chwdv\" (UniqueName: \"kubernetes.io/projected/124c0069-debd-459c-9d66-f38d9d096996-kube-api-access-chwdv\") pod \"dnsmasq-dns-57d769cc4f-xvr5c\" 
(UID: \"124c0069-debd-459c-9d66-f38d9d096996\") " pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:18 crc kubenswrapper[4778]: I0318 09:21:18.999389 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.170670 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-ffznk"] Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.419019 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.420393 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.426066 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.426085 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.426219 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.426416 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.426490 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.426599 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.426732 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-7f9jg" Mar 
18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.436147 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.521977 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.522626 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.522692 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.522970 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.523131 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/671fe1be-f3dd-475e-8c48-a1d1db510aef-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.523175 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.523219 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/671fe1be-f3dd-475e-8c48-a1d1db510aef-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.523280 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhk6k\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-kube-api-access-mhk6k\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.523426 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.523506 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.523546 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.625880 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.625966 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.626011 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/671fe1be-f3dd-475e-8c48-a1d1db510aef-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.626032 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.626047 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/671fe1be-f3dd-475e-8c48-a1d1db510aef-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.626064 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhk6k\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-kube-api-access-mhk6k\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.626089 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.626105 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.626127 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.626154 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.626173 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.627052 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.627343 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.627904 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 
09:21:19.628146 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.629098 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.629590 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.635796 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/671fe1be-f3dd-475e-8c48-a1d1db510aef-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.637545 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.640763 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/671fe1be-f3dd-475e-8c48-a1d1db510aef-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.656043 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.662925 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhk6k\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-kube-api-access-mhk6k\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.668418 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.737164 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.739508 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.743564 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.743616 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.743564 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.743786 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.744474 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.744587 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.744698 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-b2npt" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.755469 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.759560 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.830254 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.830322 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trc2k\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-kube-api-access-trc2k\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.830352 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.830394 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-config-data\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.830424 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " 
pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.830476 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.830714 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.830746 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.830784 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/57955df9-f0c5-4cfc-91fd-135771be7ed2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.830803 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: 
I0318 09:21:19.830898 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/57955df9-f0c5-4cfc-91fd-135771be7ed2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.932962 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.933046 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.933081 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.933105 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/57955df9-f0c5-4cfc-91fd-135771be7ed2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.933248 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.933332 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/57955df9-f0c5-4cfc-91fd-135771be7ed2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.933391 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.933427 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trc2k\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-kube-api-access-trc2k\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.933455 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.933485 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-config-data\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " 
pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.933526 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.934001 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.934733 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.935320 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.935649 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.935798 4778 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-config-data\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.936887 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.939233 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/57955df9-f0c5-4cfc-91fd-135771be7ed2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.940730 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.941666 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/57955df9-f0c5-4cfc-91fd-135771be7ed2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.950932 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " 
pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.952689 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trc2k\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-kube-api-access-trc2k\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:19 crc kubenswrapper[4778]: I0318 09:21:19.964276 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") " pod="openstack/rabbitmq-server-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.077492 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.731383 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.736111 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.739182 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.739949 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.740093 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-btksr" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.740814 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.753386 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.769016 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.855969 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.856022 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.856062 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.856100 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-config-data-default\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.856133 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.856166 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-kolla-config\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.856311 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsgm9\" (UniqueName: \"kubernetes.io/projected/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-kube-api-access-lsgm9\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.856375 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.957901 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-kolla-config\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.958014 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsgm9\" (UniqueName: \"kubernetes.io/projected/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-kube-api-access-lsgm9\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.958062 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.958090 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.958118 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.958151 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.958188 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-config-data-default\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.958240 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.958409 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.958619 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.958771 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-kolla-config\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.959621 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-config-data-default\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.960067 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.979952 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.980655 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.984297 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsgm9\" (UniqueName: 
\"kubernetes.io/projected/cfadc08e-9e77-4b6f-be89-fc7c726e85b7-kube-api-access-lsgm9\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:20 crc kubenswrapper[4778]: I0318 09:21:20.999764 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"cfadc08e-9e77-4b6f-be89-fc7c726e85b7\") " pod="openstack/openstack-galera-0" Mar 18 09:21:21 crc kubenswrapper[4778]: I0318 09:21:21.080145 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 09:21:21 crc kubenswrapper[4778]: I0318 09:21:21.985269 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 09:21:21 crc kubenswrapper[4778]: I0318 09:21:21.990702 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:21 crc kubenswrapper[4778]: I0318 09:21:21.994096 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-trpxt" Mar 18 09:21:21 crc kubenswrapper[4778]: I0318 09:21:21.994154 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 18 09:21:21 crc kubenswrapper[4778]: I0318 09:21:21.995434 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 18 09:21:21 crc kubenswrapper[4778]: I0318 09:21:21.996451 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 09:21:21 crc kubenswrapper[4778]: I0318 09:21:21.997149 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.074720 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvg6l\" (UniqueName: \"kubernetes.io/projected/49ce9560-3ee2-48d2-b016-a9feefb3a798-kube-api-access-jvg6l\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.074798 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.074849 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/49ce9560-3ee2-48d2-b016-a9feefb3a798-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.074915 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/49ce9560-3ee2-48d2-b016-a9feefb3a798-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.074957 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49ce9560-3ee2-48d2-b016-a9feefb3a798-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.075006 
4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/49ce9560-3ee2-48d2-b016-a9feefb3a798-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.075043 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ce9560-3ee2-48d2-b016-a9feefb3a798-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.075159 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ce9560-3ee2-48d2-b016-a9feefb3a798-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.176821 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49ce9560-3ee2-48d2-b016-a9feefb3a798-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.176866 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/49ce9560-3ee2-48d2-b016-a9feefb3a798-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.176891 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ce9560-3ee2-48d2-b016-a9feefb3a798-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.176930 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ce9560-3ee2-48d2-b016-a9feefb3a798-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.177012 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvg6l\" (UniqueName: \"kubernetes.io/projected/49ce9560-3ee2-48d2-b016-a9feefb3a798-kube-api-access-jvg6l\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.177041 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.177062 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/49ce9560-3ee2-48d2-b016-a9feefb3a798-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.177113 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" 
(UniqueName: \"kubernetes.io/empty-dir/49ce9560-3ee2-48d2-b016-a9feefb3a798-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.177700 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.177992 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/49ce9560-3ee2-48d2-b016-a9feefb3a798-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.178519 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/49ce9560-3ee2-48d2-b016-a9feefb3a798-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.184920 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49ce9560-3ee2-48d2-b016-a9feefb3a798-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.193255 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/49ce9560-3ee2-48d2-b016-a9feefb3a798-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.206162 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ce9560-3ee2-48d2-b016-a9feefb3a798-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.225923 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ce9560-3ee2-48d2-b016-a9feefb3a798-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.237963 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvg6l\" (UniqueName: \"kubernetes.io/projected/49ce9560-3ee2-48d2-b016-a9feefb3a798-kube-api-access-jvg6l\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.253451 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"49ce9560-3ee2-48d2-b016-a9feefb3a798\") " pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.351306 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.402877 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.405524 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.407774 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-6dj2t" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.414455 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.414711 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.415587 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.482377 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc50d224-cd65-4a46-b3d0-b40acdbda53d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.482432 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc50d224-cd65-4a46-b3d0-b40acdbda53d-config-data\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.482484 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fc50d224-cd65-4a46-b3d0-b40acdbda53d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.482577 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrhhv\" (UniqueName: \"kubernetes.io/projected/fc50d224-cd65-4a46-b3d0-b40acdbda53d-kube-api-access-hrhhv\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.482854 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fc50d224-cd65-4a46-b3d0-b40acdbda53d-kolla-config\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.584780 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc50d224-cd65-4a46-b3d0-b40acdbda53d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.584826 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc50d224-cd65-4a46-b3d0-b40acdbda53d-config-data\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.584855 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc50d224-cd65-4a46-b3d0-b40acdbda53d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " 
pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.584909 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrhhv\" (UniqueName: \"kubernetes.io/projected/fc50d224-cd65-4a46-b3d0-b40acdbda53d-kube-api-access-hrhhv\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.584948 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fc50d224-cd65-4a46-b3d0-b40acdbda53d-kolla-config\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.585927 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc50d224-cd65-4a46-b3d0-b40acdbda53d-config-data\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.586301 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fc50d224-cd65-4a46-b3d0-b40acdbda53d-kolla-config\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.592716 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc50d224-cd65-4a46-b3d0-b40acdbda53d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.592873 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fc50d224-cd65-4a46-b3d0-b40acdbda53d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.610785 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrhhv\" (UniqueName: \"kubernetes.io/projected/fc50d224-cd65-4a46-b3d0-b40acdbda53d-kube-api-access-hrhhv\") pod \"memcached-0\" (UID: \"fc50d224-cd65-4a46-b3d0-b40acdbda53d\") " pod="openstack/memcached-0" Mar 18 09:21:22 crc kubenswrapper[4778]: I0318 09:21:22.725331 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 09:21:23 crc kubenswrapper[4778]: I0318 09:21:23.336944 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xvr5c"] Mar 18 09:21:23 crc kubenswrapper[4778]: I0318 09:21:23.890000 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" event={"ID":"b849baae-7043-48dd-be08-0edde88c7c69","Type":"ContainerStarted","Data":"32885a5d9fd1876608960a09778b3825b17dc07db2a1dad6becf112b8746266b"} Mar 18 09:21:24 crc kubenswrapper[4778]: I0318 09:21:24.524687 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 09:21:24 crc kubenswrapper[4778]: I0318 09:21:24.525858 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 09:21:24 crc kubenswrapper[4778]: I0318 09:21:24.528150 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-vmscp" Mar 18 09:21:24 crc kubenswrapper[4778]: I0318 09:21:24.548564 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 09:21:24 crc kubenswrapper[4778]: I0318 09:21:24.633125 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xklgw\" (UniqueName: \"kubernetes.io/projected/45babbce-b5d2-4ad5-8bc2-a5047e777e8d-kube-api-access-xklgw\") pod \"kube-state-metrics-0\" (UID: \"45babbce-b5d2-4ad5-8bc2-a5047e777e8d\") " pod="openstack/kube-state-metrics-0" Mar 18 09:21:24 crc kubenswrapper[4778]: I0318 09:21:24.735415 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xklgw\" (UniqueName: \"kubernetes.io/projected/45babbce-b5d2-4ad5-8bc2-a5047e777e8d-kube-api-access-xklgw\") pod \"kube-state-metrics-0\" (UID: \"45babbce-b5d2-4ad5-8bc2-a5047e777e8d\") " pod="openstack/kube-state-metrics-0" Mar 18 09:21:24 crc kubenswrapper[4778]: I0318 09:21:24.764303 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xklgw\" (UniqueName: \"kubernetes.io/projected/45babbce-b5d2-4ad5-8bc2-a5047e777e8d-kube-api-access-xklgw\") pod \"kube-state-metrics-0\" (UID: \"45babbce-b5d2-4ad5-8bc2-a5047e777e8d\") " pod="openstack/kube-state-metrics-0" Mar 18 09:21:24 crc kubenswrapper[4778]: I0318 09:21:24.846095 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 09:21:26 crc kubenswrapper[4778]: W0318 09:21:26.123663 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod124c0069_debd_459c_9d66_f38d9d096996.slice/crio-81379cb4a6ab1c31c5025b2cadc823fb3d4cb38e6f779d5cd8007da3c790b69a WatchSource:0}: Error finding container 81379cb4a6ab1c31c5025b2cadc823fb3d4cb38e6f779d5cd8007da3c790b69a: Status 404 returned error can't find the container with id 81379cb4a6ab1c31c5025b2cadc823fb3d4cb38e6f779d5cd8007da3c790b69a Mar 18 09:21:26 crc kubenswrapper[4778]: I0318 09:21:26.592397 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 09:21:26 crc kubenswrapper[4778]: I0318 09:21:26.921314 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" event={"ID":"124c0069-debd-459c-9d66-f38d9d096996","Type":"ContainerStarted","Data":"81379cb4a6ab1c31c5025b2cadc823fb3d4cb38e6f779d5cd8007da3c790b69a"} Mar 18 09:21:27 crc kubenswrapper[4778]: I0318 09:21:27.961150 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-djmq6"] Mar 18 09:21:27 crc kubenswrapper[4778]: I0318 09:21:27.986854 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-djmq6" Mar 18 09:21:27 crc kubenswrapper[4778]: I0318 09:21:27.997997 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-2qjkd" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.001141 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.001413 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.008685 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-djmq6"] Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.041512 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-zrlnv"] Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.043167 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.051304 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-zrlnv"] Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.104987 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trqpk\" (UniqueName: \"kubernetes.io/projected/f58533cf-4c57-4c3a-b772-e2a488298d7e-kube-api-access-trqpk\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.105058 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89cb0c73-439f-4178-bd96-f50b123bcd8a-scripts\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.105082 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw5lj\" (UniqueName: \"kubernetes.io/projected/89cb0c73-439f-4178-bd96-f50b123bcd8a-kube-api-access-lw5lj\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.105101 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/89cb0c73-439f-4178-bd96-f50b123bcd8a-var-log\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.105144 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f58533cf-4c57-4c3a-b772-e2a488298d7e-ovn-controller-tls-certs\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.105161 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f58533cf-4c57-4c3a-b772-e2a488298d7e-var-run-ovn\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.105223 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f58533cf-4c57-4c3a-b772-e2a488298d7e-var-run\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.105333 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/89cb0c73-439f-4178-bd96-f50b123bcd8a-var-lib\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.105375 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f58533cf-4c57-4c3a-b772-e2a488298d7e-var-log-ovn\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.105482 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/f58533cf-4c57-4c3a-b772-e2a488298d7e-scripts\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.105521 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58533cf-4c57-4c3a-b772-e2a488298d7e-combined-ca-bundle\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.105580 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/89cb0c73-439f-4178-bd96-f50b123bcd8a-etc-ovs\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.105699 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/89cb0c73-439f-4178-bd96-f50b123bcd8a-var-run\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.206835 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/89cb0c73-439f-4178-bd96-f50b123bcd8a-var-run\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.206892 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trqpk\" (UniqueName: \"kubernetes.io/projected/f58533cf-4c57-4c3a-b772-e2a488298d7e-kube-api-access-trqpk\") 
pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.206940 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89cb0c73-439f-4178-bd96-f50b123bcd8a-scripts\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.206959 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw5lj\" (UniqueName: \"kubernetes.io/projected/89cb0c73-439f-4178-bd96-f50b123bcd8a-kube-api-access-lw5lj\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.206979 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/89cb0c73-439f-4178-bd96-f50b123bcd8a-var-log\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.207025 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f58533cf-4c57-4c3a-b772-e2a488298d7e-ovn-controller-tls-certs\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.207042 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f58533cf-4c57-4c3a-b772-e2a488298d7e-var-run-ovn\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " 
pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.207065 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f58533cf-4c57-4c3a-b772-e2a488298d7e-var-run\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.207092 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/89cb0c73-439f-4178-bd96-f50b123bcd8a-var-lib\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.207110 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f58533cf-4c57-4c3a-b772-e2a488298d7e-var-log-ovn\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.207135 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f58533cf-4c57-4c3a-b772-e2a488298d7e-scripts\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.207153 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58533cf-4c57-4c3a-b772-e2a488298d7e-combined-ca-bundle\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.207175 4778 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/89cb0c73-439f-4178-bd96-f50b123bcd8a-etc-ovs\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.207990 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/89cb0c73-439f-4178-bd96-f50b123bcd8a-etc-ovs\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.208141 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f58533cf-4c57-4c3a-b772-e2a488298d7e-var-log-ovn\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.208344 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f58533cf-4c57-4c3a-b772-e2a488298d7e-var-run\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.208460 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/89cb0c73-439f-4178-bd96-f50b123bcd8a-var-lib\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.211102 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/89cb0c73-439f-4178-bd96-f50b123bcd8a-var-run\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " 
pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.211147 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f58533cf-4c57-4c3a-b772-e2a488298d7e-scripts\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.211359 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/89cb0c73-439f-4178-bd96-f50b123bcd8a-var-log\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.211959 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f58533cf-4c57-4c3a-b772-e2a488298d7e-var-run-ovn\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.217098 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58533cf-4c57-4c3a-b772-e2a488298d7e-combined-ca-bundle\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.217848 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89cb0c73-439f-4178-bd96-f50b123bcd8a-scripts\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.218265 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f58533cf-4c57-4c3a-b772-e2a488298d7e-ovn-controller-tls-certs\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.226967 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw5lj\" (UniqueName: \"kubernetes.io/projected/89cb0c73-439f-4178-bd96-f50b123bcd8a-kube-api-access-lw5lj\") pod \"ovn-controller-ovs-zrlnv\" (UID: \"89cb0c73-439f-4178-bd96-f50b123bcd8a\") " pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.227365 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trqpk\" (UniqueName: \"kubernetes.io/projected/f58533cf-4c57-4c3a-b772-e2a488298d7e-kube-api-access-trqpk\") pod \"ovn-controller-djmq6\" (UID: \"f58533cf-4c57-4c3a-b772-e2a488298d7e\") " pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.334575 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-djmq6" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.365332 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.730839 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.732083 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.734671 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-lxb7s" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.735023 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.735218 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.735228 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.735563 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.767773 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.817277 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqr9h\" (UniqueName: \"kubernetes.io/projected/495e34ad-2f4d-46de-95e9-37b34a35f2d2-kube-api-access-cqr9h\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.817352 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.817381 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/495e34ad-2f4d-46de-95e9-37b34a35f2d2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.817405 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/495e34ad-2f4d-46de-95e9-37b34a35f2d2-config\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.817420 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495e34ad-2f4d-46de-95e9-37b34a35f2d2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.817467 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/495e34ad-2f4d-46de-95e9-37b34a35f2d2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.817492 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/495e34ad-2f4d-46de-95e9-37b34a35f2d2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.817515 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/495e34ad-2f4d-46de-95e9-37b34a35f2d2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.918577 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.918630 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/495e34ad-2f4d-46de-95e9-37b34a35f2d2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.918654 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/495e34ad-2f4d-46de-95e9-37b34a35f2d2-config\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.918672 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495e34ad-2f4d-46de-95e9-37b34a35f2d2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.918709 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/495e34ad-2f4d-46de-95e9-37b34a35f2d2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc 
kubenswrapper[4778]: I0318 09:21:28.918740 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/495e34ad-2f4d-46de-95e9-37b34a35f2d2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.918766 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/495e34ad-2f4d-46de-95e9-37b34a35f2d2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.918805 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqr9h\" (UniqueName: \"kubernetes.io/projected/495e34ad-2f4d-46de-95e9-37b34a35f2d2-kube-api-access-cqr9h\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.919012 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.919470 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/495e34ad-2f4d-46de-95e9-37b34a35f2d2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.919589 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/495e34ad-2f4d-46de-95e9-37b34a35f2d2-config\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.920490 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/495e34ad-2f4d-46de-95e9-37b34a35f2d2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.924512 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/495e34ad-2f4d-46de-95e9-37b34a35f2d2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.925364 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/495e34ad-2f4d-46de-95e9-37b34a35f2d2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.925672 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495e34ad-2f4d-46de-95e9-37b34a35f2d2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.952165 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqr9h\" (UniqueName: \"kubernetes.io/projected/495e34ad-2f4d-46de-95e9-37b34a35f2d2-kube-api-access-cqr9h\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " 
pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:28 crc kubenswrapper[4778]: I0318 09:21:28.984694 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"495e34ad-2f4d-46de-95e9-37b34a35f2d2\") " pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:29 crc kubenswrapper[4778]: I0318 09:21:29.112815 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: W0318 09:21:31.420178 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49ce9560_3ee2_48d2_b016_a9feefb3a798.slice/crio-7e2a46cb51b397d310dc7e376a9a7d14cd4c698d1c5c6543d19669ba836b7c87 WatchSource:0}: Error finding container 7e2a46cb51b397d310dc7e376a9a7d14cd4c698d1c5c6543d19669ba836b7c87: Status 404 returned error can't find the container with id 7e2a46cb51b397d310dc7e376a9a7d14cd4c698d1c5c6543d19669ba836b7c87 Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.710682 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.712856 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.726825 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-blthw" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.727061 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.727184 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.727741 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.768633 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.836634 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.887427 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.887472 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-config\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.887630 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.887660 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h4ps\" (UniqueName: \"kubernetes.io/projected/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-kube-api-access-7h4ps\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.887694 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.887721 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.887753 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.887774 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.962923 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"49ce9560-3ee2-48d2-b016-a9feefb3a798","Type":"ContainerStarted","Data":"7e2a46cb51b397d310dc7e376a9a7d14cd4c698d1c5c6543d19669ba836b7c87"} Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.989672 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.989750 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-config\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.989789 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.989814 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h4ps\" (UniqueName: \"kubernetes.io/projected/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-kube-api-access-7h4ps\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc 
kubenswrapper[4778]: I0318 09:21:31.989843 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.989876 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.989912 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.989937 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.990606 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-config\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.990687 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod 
\"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.991450 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.991928 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.999130 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.999188 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:31 crc kubenswrapper[4778]: I0318 09:21:31.999458 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:32 crc kubenswrapper[4778]: I0318 
09:21:32.010565 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:32 crc kubenswrapper[4778]: I0318 09:21:32.014763 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h4ps\" (UniqueName: \"kubernetes.io/projected/113a3fc7-40a1-46f9-b93f-01a34fcaf4aa-kube-api-access-7h4ps\") pod \"ovsdbserver-nb-0\" (UID: \"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa\") " pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:32 crc kubenswrapper[4778]: I0318 09:21:32.055957 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:34 crc kubenswrapper[4778]: W0318 09:21:34.907023 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfadc08e_9e77_4b6f_be89_fc7c726e85b7.slice/crio-fb0fc7f95277ccc99d0867676ec0e6cfd5c0179cd8b716a74e849e369777a77e WatchSource:0}: Error finding container fb0fc7f95277ccc99d0867676ec0e6cfd5c0179cd8b716a74e849e369777a77e: Status 404 returned error can't find the container with id fb0fc7f95277ccc99d0867676ec0e6cfd5c0179cd8b716a74e849e369777a77e Mar 18 09:21:34 crc kubenswrapper[4778]: E0318 09:21:34.943825 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 09:21:34 crc kubenswrapper[4778]: E0318 09:21:34.944142 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts 
--keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5n4kw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-csxwm_openstack(4c0c899c-e724-4486-bfba-42c7f089cfa7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:21:34 crc kubenswrapper[4778]: E0318 09:21:34.945335 4778 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" podUID="4c0c899c-e724-4486-bfba-42c7f089cfa7" Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.017077 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cfadc08e-9e77-4b6f-be89-fc7c726e85b7","Type":"ContainerStarted","Data":"fb0fc7f95277ccc99d0867676ec0e6cfd5c0179cd8b716a74e849e369777a77e"} Mar 18 09:21:35 crc kubenswrapper[4778]: E0318 09:21:35.017373 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 09:21:35 crc kubenswrapper[4778]: E0318 09:21:35.017651 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fvcvh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-wbpnt_openstack(ef690ca0-3568-4334-bddc-956b11424d40): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:21:35 crc kubenswrapper[4778]: E0318 09:21:35.020510 4778 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" podUID="ef690ca0-3568-4334-bddc-956b11424d40" Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.481520 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.482376 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 09:21:35 crc kubenswrapper[4778]: W0318 09:21:35.488997 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod671fe1be_f3dd_475e_8c48_a1d1db510aef.slice/crio-fd11d5ce698fa7411014a038329cc43022813b8f74df72dacf0e69cfc2473b23 WatchSource:0}: Error finding container fd11d5ce698fa7411014a038329cc43022813b8f74df72dacf0e69cfc2473b23: Status 404 returned error can't find the container with id fd11d5ce698fa7411014a038329cc43022813b8f74df72dacf0e69cfc2473b23 Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.654165 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n4kw\" (UniqueName: \"kubernetes.io/projected/4c0c899c-e724-4486-bfba-42c7f089cfa7-kube-api-access-5n4kw\") pod \"4c0c899c-e724-4486-bfba-42c7f089cfa7\" (UID: \"4c0c899c-e724-4486-bfba-42c7f089cfa7\") " Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.655234 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c0c899c-e724-4486-bfba-42c7f089cfa7-config\") pod \"4c0c899c-e724-4486-bfba-42c7f089cfa7\" (UID: \"4c0c899c-e724-4486-bfba-42c7f089cfa7\") " Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.655846 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/4c0c899c-e724-4486-bfba-42c7f089cfa7-config" (OuterVolumeSpecName: "config") pod "4c0c899c-e724-4486-bfba-42c7f089cfa7" (UID: "4c0c899c-e724-4486-bfba-42c7f089cfa7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.660424 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c0c899c-e724-4486-bfba-42c7f089cfa7-kube-api-access-5n4kw" (OuterVolumeSpecName: "kube-api-access-5n4kw") pod "4c0c899c-e724-4486-bfba-42c7f089cfa7" (UID: "4c0c899c-e724-4486-bfba-42c7f089cfa7"). InnerVolumeSpecName "kube-api-access-5n4kw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.683849 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.694099 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 09:21:35 crc kubenswrapper[4778]: W0318 09:21:35.707444 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc50d224_cd65_4a46_b3d0_b40acdbda53d.slice/crio-4563c9b3379100c02eff6ea9b1139f424a6cb1b624eea7483fbfdd047b8da04e WatchSource:0}: Error finding container 4563c9b3379100c02eff6ea9b1139f424a6cb1b624eea7483fbfdd047b8da04e: Status 404 returned error can't find the container with id 4563c9b3379100c02eff6ea9b1139f424a6cb1b624eea7483fbfdd047b8da04e Mar 18 09:21:35 crc kubenswrapper[4778]: W0318 09:21:35.712845 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57955df9_f0c5_4cfc_91fd_135771be7ed2.slice/crio-9908701ec8146f186284a44b30a5e2c02918471c220798a76b04c8de271c7850 WatchSource:0}: Error finding container 9908701ec8146f186284a44b30a5e2c02918471c220798a76b04c8de271c7850: 
Status 404 returned error can't find the container with id 9908701ec8146f186284a44b30a5e2c02918471c220798a76b04c8de271c7850 Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.760815 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n4kw\" (UniqueName: \"kubernetes.io/projected/4c0c899c-e724-4486-bfba-42c7f089cfa7-kube-api-access-5n4kw\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.760882 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c0c899c-e724-4486-bfba-42c7f089cfa7-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.795268 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.831411 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-djmq6"] Mar 18 09:21:35 crc kubenswrapper[4778]: W0318 09:21:35.833404 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf58533cf_4c57_4c3a_b772_e2a488298d7e.slice/crio-d2ceca23ab2c5fec44125091c66355f512309c6254da6f49d10b25b4e90b4077 WatchSource:0}: Error finding container d2ceca23ab2c5fec44125091c66355f512309c6254da6f49d10b25b4e90b4077: Status 404 returned error can't find the container with id d2ceca23ab2c5fec44125091c66355f512309c6254da6f49d10b25b4e90b4077 Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.835111 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 09:21:35 crc kubenswrapper[4778]: W0318 09:21:35.845098 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45babbce_b5d2_4ad5_8bc2_a5047e777e8d.slice/crio-aea881ea048223a1a79ed7ce2d76ae73c7092387d05c9eaeaf63d09ce8b8e125 WatchSource:0}: Error 
finding container aea881ea048223a1a79ed7ce2d76ae73c7092387d05c9eaeaf63d09ce8b8e125: Status 404 returned error can't find the container with id aea881ea048223a1a79ed7ce2d76ae73c7092387d05c9eaeaf63d09ce8b8e125 Mar 18 09:21:35 crc kubenswrapper[4778]: I0318 09:21:35.889536 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.026225 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" event={"ID":"4c0c899c-e724-4486-bfba-42c7f089cfa7","Type":"ContainerDied","Data":"4c2bd12d0d178885863f2b67ec45e234d90e570362b4cc7fec292b577d9c4441"} Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.026321 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-csxwm" Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.028901 4778 generic.go:334] "Generic (PLEG): container finished" podID="b849baae-7043-48dd-be08-0edde88c7c69" containerID="480d7c5317737b3e1aeba0fc6a1727b6f65e4c3e2f9bf9812586687ddd61c9ba" exitCode=0 Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.029264 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" event={"ID":"b849baae-7043-48dd-be08-0edde88c7c69","Type":"ContainerDied","Data":"480d7c5317737b3e1aeba0fc6a1727b6f65e4c3e2f9bf9812586687ddd61c9ba"} Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.030516 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"671fe1be-f3dd-475e-8c48-a1d1db510aef","Type":"ContainerStarted","Data":"fd11d5ce698fa7411014a038329cc43022813b8f74df72dacf0e69cfc2473b23"} Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.036518 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"495e34ad-2f4d-46de-95e9-37b34a35f2d2","Type":"ContainerStarted","Data":"a2731392612b32d712d0c6d5c193713b53b7b3bb945d54484f35a26488fff227"} Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.037703 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"fc50d224-cd65-4a46-b3d0-b40acdbda53d","Type":"ContainerStarted","Data":"4563c9b3379100c02eff6ea9b1139f424a6cb1b624eea7483fbfdd047b8da04e"} Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.041324 4778 generic.go:334] "Generic (PLEG): container finished" podID="124c0069-debd-459c-9d66-f38d9d096996" containerID="5ed662d305e0e956be34aca979fbb334e97403c6f49b4e0abe4d26246e519e1c" exitCode=0 Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.041450 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" event={"ID":"124c0069-debd-459c-9d66-f38d9d096996","Type":"ContainerDied","Data":"5ed662d305e0e956be34aca979fbb334e97403c6f49b4e0abe4d26246e519e1c"} Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.046979 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-djmq6" event={"ID":"f58533cf-4c57-4c3a-b772-e2a488298d7e","Type":"ContainerStarted","Data":"d2ceca23ab2c5fec44125091c66355f512309c6254da6f49d10b25b4e90b4077"} Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.052792 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa","Type":"ContainerStarted","Data":"cf2129c3bfbc296aad0527dd8ce0ba04e342011cc58c1e9bbaefda0cf47445b6"} Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.067819 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"57955df9-f0c5-4cfc-91fd-135771be7ed2","Type":"ContainerStarted","Data":"9908701ec8146f186284a44b30a5e2c02918471c220798a76b04c8de271c7850"} Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.084560 4778 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"45babbce-b5d2-4ad5-8bc2-a5047e777e8d","Type":"ContainerStarted","Data":"aea881ea048223a1a79ed7ce2d76ae73c7092387d05c9eaeaf63d09ce8b8e125"} Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.246947 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-csxwm"] Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.258862 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-csxwm"] Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.545939 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.675646 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvcvh\" (UniqueName: \"kubernetes.io/projected/ef690ca0-3568-4334-bddc-956b11424d40-kube-api-access-fvcvh\") pod \"ef690ca0-3568-4334-bddc-956b11424d40\" (UID: \"ef690ca0-3568-4334-bddc-956b11424d40\") " Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.675717 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef690ca0-3568-4334-bddc-956b11424d40-config\") pod \"ef690ca0-3568-4334-bddc-956b11424d40\" (UID: \"ef690ca0-3568-4334-bddc-956b11424d40\") " Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.675797 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef690ca0-3568-4334-bddc-956b11424d40-dns-svc\") pod \"ef690ca0-3568-4334-bddc-956b11424d40\" (UID: \"ef690ca0-3568-4334-bddc-956b11424d40\") " Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.676691 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ef690ca0-3568-4334-bddc-956b11424d40-config" (OuterVolumeSpecName: "config") pod "ef690ca0-3568-4334-bddc-956b11424d40" (UID: "ef690ca0-3568-4334-bddc-956b11424d40"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.677243 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef690ca0-3568-4334-bddc-956b11424d40-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ef690ca0-3568-4334-bddc-956b11424d40" (UID: "ef690ca0-3568-4334-bddc-956b11424d40"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.686365 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef690ca0-3568-4334-bddc-956b11424d40-kube-api-access-fvcvh" (OuterVolumeSpecName: "kube-api-access-fvcvh") pod "ef690ca0-3568-4334-bddc-956b11424d40" (UID: "ef690ca0-3568-4334-bddc-956b11424d40"). InnerVolumeSpecName "kube-api-access-fvcvh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.750292 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-zrlnv"] Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.777165 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvcvh\" (UniqueName: \"kubernetes.io/projected/ef690ca0-3568-4334-bddc-956b11424d40-kube-api-access-fvcvh\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.777231 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef690ca0-3568-4334-bddc-956b11424d40-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:36 crc kubenswrapper[4778]: I0318 09:21:36.777246 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef690ca0-3568-4334-bddc-956b11424d40-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:37 crc kubenswrapper[4778]: I0318 09:21:37.095908 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zrlnv" event={"ID":"89cb0c73-439f-4178-bd96-f50b123bcd8a","Type":"ContainerStarted","Data":"3253b8eecb6c9436abd551deb3597dc9dea83fbfeb9a51958d9c12c740b797f5"} Mar 18 09:21:37 crc kubenswrapper[4778]: I0318 09:21:37.099031 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" event={"ID":"ef690ca0-3568-4334-bddc-956b11424d40","Type":"ContainerDied","Data":"56d81e5e06ecb4723728e490b7892ca9d5f51a511bc3931f818bfe5aaf2487d9"} Mar 18 09:21:37 crc kubenswrapper[4778]: I0318 09:21:37.099168 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-wbpnt" Mar 18 09:21:37 crc kubenswrapper[4778]: I0318 09:21:37.108482 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" event={"ID":"124c0069-debd-459c-9d66-f38d9d096996","Type":"ContainerStarted","Data":"1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e"} Mar 18 09:21:37 crc kubenswrapper[4778]: I0318 09:21:37.108997 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:37 crc kubenswrapper[4778]: I0318 09:21:37.112647 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" event={"ID":"b849baae-7043-48dd-be08-0edde88c7c69","Type":"ContainerStarted","Data":"f8db45d45042eecaafb866cb991b1dc5586cee69f57e924f2a599715ace70e35"} Mar 18 09:21:37 crc kubenswrapper[4778]: I0318 09:21:37.113072 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:37 crc kubenswrapper[4778]: I0318 09:21:37.130168 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" podStartSLOduration=10.116715644 podStartE2EDuration="19.130149514s" podCreationTimestamp="2026-03-18 09:21:18 +0000 UTC" firstStartedPulling="2026-03-18 09:21:26.126346793 +0000 UTC m=+1152.701091633" lastFinishedPulling="2026-03-18 09:21:35.139780653 +0000 UTC m=+1161.714525503" observedRunningTime="2026-03-18 09:21:37.129853056 +0000 UTC m=+1163.704597906" watchObservedRunningTime="2026-03-18 09:21:37.130149514 +0000 UTC m=+1163.704894344" Mar 18 09:21:37 crc kubenswrapper[4778]: I0318 09:21:37.170245 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wbpnt"] Mar 18 09:21:37 crc kubenswrapper[4778]: I0318 09:21:37.176628 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-78dd6ddcc-wbpnt"] Mar 18 09:21:37 crc kubenswrapper[4778]: I0318 09:21:37.183748 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" podStartSLOduration=6.9876135040000005 podStartE2EDuration="19.183732747s" podCreationTimestamp="2026-03-18 09:21:18 +0000 UTC" firstStartedPulling="2026-03-18 09:21:22.924781775 +0000 UTC m=+1149.499526615" lastFinishedPulling="2026-03-18 09:21:35.120901018 +0000 UTC m=+1161.695645858" observedRunningTime="2026-03-18 09:21:37.179786548 +0000 UTC m=+1163.754531408" watchObservedRunningTime="2026-03-18 09:21:37.183732747 +0000 UTC m=+1163.758477577" Mar 18 09:21:38 crc kubenswrapper[4778]: I0318 09:21:38.199132 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c0c899c-e724-4486-bfba-42c7f089cfa7" path="/var/lib/kubelet/pods/4c0c899c-e724-4486-bfba-42c7f089cfa7/volumes" Mar 18 09:21:38 crc kubenswrapper[4778]: I0318 09:21:38.199808 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef690ca0-3568-4334-bddc-956b11424d40" path="/var/lib/kubelet/pods/ef690ca0-3568-4334-bddc-956b11424d40/volumes" Mar 18 09:21:43 crc kubenswrapper[4778]: I0318 09:21:43.610048 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:44 crc kubenswrapper[4778]: I0318 09:21:44.002466 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:21:44 crc kubenswrapper[4778]: I0318 09:21:44.077940 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-ffznk"] Mar 18 09:21:44 crc kubenswrapper[4778]: I0318 09:21:44.170303 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" podUID="b849baae-7043-48dd-be08-0edde88c7c69" containerName="dnsmasq-dns" 
containerID="cri-o://f8db45d45042eecaafb866cb991b1dc5586cee69f57e924f2a599715ace70e35" gracePeriod=10 Mar 18 09:21:45 crc kubenswrapper[4778]: I0318 09:21:45.182071 4778 generic.go:334] "Generic (PLEG): container finished" podID="b849baae-7043-48dd-be08-0edde88c7c69" containerID="f8db45d45042eecaafb866cb991b1dc5586cee69f57e924f2a599715ace70e35" exitCode=0 Mar 18 09:21:45 crc kubenswrapper[4778]: I0318 09:21:45.182163 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" event={"ID":"b849baae-7043-48dd-be08-0edde88c7c69","Type":"ContainerDied","Data":"f8db45d45042eecaafb866cb991b1dc5586cee69f57e924f2a599715ace70e35"} Mar 18 09:21:46 crc kubenswrapper[4778]: I0318 09:21:46.708136 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:46 crc kubenswrapper[4778]: I0318 09:21:46.888242 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b849baae-7043-48dd-be08-0edde88c7c69-dns-svc\") pod \"b849baae-7043-48dd-be08-0edde88c7c69\" (UID: \"b849baae-7043-48dd-be08-0edde88c7c69\") " Mar 18 09:21:46 crc kubenswrapper[4778]: I0318 09:21:46.888365 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7prc\" (UniqueName: \"kubernetes.io/projected/b849baae-7043-48dd-be08-0edde88c7c69-kube-api-access-v7prc\") pod \"b849baae-7043-48dd-be08-0edde88c7c69\" (UID: \"b849baae-7043-48dd-be08-0edde88c7c69\") " Mar 18 09:21:46 crc kubenswrapper[4778]: I0318 09:21:46.888425 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b849baae-7043-48dd-be08-0edde88c7c69-config\") pod \"b849baae-7043-48dd-be08-0edde88c7c69\" (UID: \"b849baae-7043-48dd-be08-0edde88c7c69\") " Mar 18 09:21:46 crc kubenswrapper[4778]: I0318 09:21:46.893026 4778 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b849baae-7043-48dd-be08-0edde88c7c69-kube-api-access-v7prc" (OuterVolumeSpecName: "kube-api-access-v7prc") pod "b849baae-7043-48dd-be08-0edde88c7c69" (UID: "b849baae-7043-48dd-be08-0edde88c7c69"). InnerVolumeSpecName "kube-api-access-v7prc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:21:46 crc kubenswrapper[4778]: I0318 09:21:46.922509 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b849baae-7043-48dd-be08-0edde88c7c69-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b849baae-7043-48dd-be08-0edde88c7c69" (UID: "b849baae-7043-48dd-be08-0edde88c7c69"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:21:46 crc kubenswrapper[4778]: I0318 09:21:46.926399 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b849baae-7043-48dd-be08-0edde88c7c69-config" (OuterVolumeSpecName: "config") pod "b849baae-7043-48dd-be08-0edde88c7c69" (UID: "b849baae-7043-48dd-be08-0edde88c7c69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:21:46 crc kubenswrapper[4778]: I0318 09:21:46.990499 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7prc\" (UniqueName: \"kubernetes.io/projected/b849baae-7043-48dd-be08-0edde88c7c69-kube-api-access-v7prc\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:46 crc kubenswrapper[4778]: I0318 09:21:46.990559 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b849baae-7043-48dd-be08-0edde88c7c69-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:46 crc kubenswrapper[4778]: I0318 09:21:46.990575 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b849baae-7043-48dd-be08-0edde88c7c69-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:47 crc kubenswrapper[4778]: I0318 09:21:47.202850 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" event={"ID":"b849baae-7043-48dd-be08-0edde88c7c69","Type":"ContainerDied","Data":"32885a5d9fd1876608960a09778b3825b17dc07db2a1dad6becf112b8746266b"} Mar 18 09:21:47 crc kubenswrapper[4778]: I0318 09:21:47.202898 4778 scope.go:117] "RemoveContainer" containerID="f8db45d45042eecaafb866cb991b1dc5586cee69f57e924f2a599715ace70e35" Mar 18 09:21:47 crc kubenswrapper[4778]: I0318 09:21:47.203030 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-ffznk" Mar 18 09:21:47 crc kubenswrapper[4778]: I0318 09:21:47.365661 4778 scope.go:117] "RemoveContainer" containerID="480d7c5317737b3e1aeba0fc6a1727b6f65e4c3e2f9bf9812586687ddd61c9ba" Mar 18 09:21:47 crc kubenswrapper[4778]: I0318 09:21:47.450376 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-ffznk"] Mar 18 09:21:47 crc kubenswrapper[4778]: I0318 09:21:47.462133 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-ffznk"] Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.198511 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b849baae-7043-48dd-be08-0edde88c7c69" path="/var/lib/kubelet/pods/b849baae-7043-48dd-be08-0edde88c7c69/volumes" Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.216568 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"49ce9560-3ee2-48d2-b016-a9feefb3a798","Type":"ContainerStarted","Data":"e466796f0e2910694ab8337ebde4ff7c0ee96e7134f39a9f2c45f915fab1baea"} Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.219078 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"45babbce-b5d2-4ad5-8bc2-a5047e777e8d","Type":"ContainerStarted","Data":"3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed"} Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.219279 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.220704 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"495e34ad-2f4d-46de-95e9-37b34a35f2d2","Type":"ContainerStarted","Data":"22aaa7f3cb8f55c070d8fd5d609fc29b14dabd44a738ffe416442d903d142eb9"} Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.222278 4778 
generic.go:334] "Generic (PLEG): container finished" podID="89cb0c73-439f-4178-bd96-f50b123bcd8a" containerID="83d3eeb91dd4b140efab29c838ac8560e0261dbb0dfbb67f93c2d05a75aff55f" exitCode=0 Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.222430 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zrlnv" event={"ID":"89cb0c73-439f-4178-bd96-f50b123bcd8a","Type":"ContainerDied","Data":"83d3eeb91dd4b140efab29c838ac8560e0261dbb0dfbb67f93c2d05a75aff55f"} Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.226458 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-djmq6" event={"ID":"f58533cf-4c57-4c3a-b772-e2a488298d7e","Type":"ContainerStarted","Data":"916a2309a194229e2e394cdbba50febc7592c5dcb8bd37a5590e8a134958e250"} Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.227090 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-djmq6" Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.235026 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa","Type":"ContainerStarted","Data":"cfb1c6cc4c019330e955211c68e17b78588bcdd17ac1ca2f93074744bc3c1bfb"} Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.236746 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cfadc08e-9e77-4b6f-be89-fc7c726e85b7","Type":"ContainerStarted","Data":"9ce149bc04a6fba1a6cb5e9273cbc389efc89e1e3e6563d1e314e619d97696a8"} Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.241995 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"fc50d224-cd65-4a46-b3d0-b40acdbda53d","Type":"ContainerStarted","Data":"e110d5dc54c472940a9a343eaf5e6004cfb943d1a192a88ab0682ef45891638a"} Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.242267 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/memcached-0" Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.271974 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=12.909123276 podStartE2EDuration="24.271950148s" podCreationTimestamp="2026-03-18 09:21:24 +0000 UTC" firstStartedPulling="2026-03-18 09:21:35.857977853 +0000 UTC m=+1162.432722693" lastFinishedPulling="2026-03-18 09:21:47.220804725 +0000 UTC m=+1173.795549565" observedRunningTime="2026-03-18 09:21:48.262648384 +0000 UTC m=+1174.837393224" watchObservedRunningTime="2026-03-18 09:21:48.271950148 +0000 UTC m=+1174.846694988" Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.289789 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-djmq6" podStartSLOduration=10.562483617 podStartE2EDuration="21.289764484s" podCreationTimestamp="2026-03-18 09:21:27 +0000 UTC" firstStartedPulling="2026-03-18 09:21:35.83695941 +0000 UTC m=+1162.411704250" lastFinishedPulling="2026-03-18 09:21:46.564240287 +0000 UTC m=+1173.138985117" observedRunningTime="2026-03-18 09:21:48.286886865 +0000 UTC m=+1174.861631715" watchObservedRunningTime="2026-03-18 09:21:48.289764484 +0000 UTC m=+1174.864509324" Mar 18 09:21:48 crc kubenswrapper[4778]: I0318 09:21:48.365757 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.909660436 podStartE2EDuration="26.365737966s" podCreationTimestamp="2026-03-18 09:21:22 +0000 UTC" firstStartedPulling="2026-03-18 09:21:35.711847007 +0000 UTC m=+1162.286591847" lastFinishedPulling="2026-03-18 09:21:46.167924537 +0000 UTC m=+1172.742669377" observedRunningTime="2026-03-18 09:21:48.362563889 +0000 UTC m=+1174.937308769" watchObservedRunningTime="2026-03-18 09:21:48.365737966 +0000 UTC m=+1174.940482806" Mar 18 09:21:49 crc kubenswrapper[4778]: I0318 09:21:49.251477 4778 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/rabbitmq-server-0" event={"ID":"57955df9-f0c5-4cfc-91fd-135771be7ed2","Type":"ContainerStarted","Data":"3698e124eba53a15e3f16dfe6346805545443e4b8ce94d12254d508326508979"} Mar 18 09:21:49 crc kubenswrapper[4778]: I0318 09:21:49.254129 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"671fe1be-f3dd-475e-8c48-a1d1db510aef","Type":"ContainerStarted","Data":"fe05be7f27af52f386341c802d98dc00dc2ec8f77ecc656619edc3c4420a9be7"} Mar 18 09:21:49 crc kubenswrapper[4778]: I0318 09:21:49.257550 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zrlnv" event={"ID":"89cb0c73-439f-4178-bd96-f50b123bcd8a","Type":"ContainerStarted","Data":"aa61b34dff12ea16006419adc618717fc2a790c20d7f1936c9ccab067065f522"} Mar 18 09:21:49 crc kubenswrapper[4778]: I0318 09:21:49.257578 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zrlnv" event={"ID":"89cb0c73-439f-4178-bd96-f50b123bcd8a","Type":"ContainerStarted","Data":"18d6161525c304e329e2b376f46994f03d27fde45d318a3922ea711c3d1191a2"} Mar 18 09:21:49 crc kubenswrapper[4778]: I0318 09:21:49.307980 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-zrlnv" podStartSLOduration=12.858162606 podStartE2EDuration="22.307932616s" podCreationTimestamp="2026-03-18 09:21:27 +0000 UTC" firstStartedPulling="2026-03-18 09:21:36.762249409 +0000 UTC m=+1163.336994249" lastFinishedPulling="2026-03-18 09:21:46.212019409 +0000 UTC m=+1172.786764259" observedRunningTime="2026-03-18 09:21:49.307303189 +0000 UTC m=+1175.882048039" watchObservedRunningTime="2026-03-18 09:21:49.307932616 +0000 UTC m=+1175.882677456" Mar 18 09:21:50 crc kubenswrapper[4778]: I0318 09:21:50.266252 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:50 crc kubenswrapper[4778]: I0318 09:21:50.266851 4778 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-zrlnv" Mar 18 09:21:51 crc kubenswrapper[4778]: I0318 09:21:51.277999 4778 generic.go:334] "Generic (PLEG): container finished" podID="49ce9560-3ee2-48d2-b016-a9feefb3a798" containerID="e466796f0e2910694ab8337ebde4ff7c0ee96e7134f39a9f2c45f915fab1baea" exitCode=0 Mar 18 09:21:51 crc kubenswrapper[4778]: I0318 09:21:51.278086 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"49ce9560-3ee2-48d2-b016-a9feefb3a798","Type":"ContainerDied","Data":"e466796f0e2910694ab8337ebde4ff7c0ee96e7134f39a9f2c45f915fab1baea"} Mar 18 09:21:51 crc kubenswrapper[4778]: I0318 09:21:51.283052 4778 generic.go:334] "Generic (PLEG): container finished" podID="cfadc08e-9e77-4b6f-be89-fc7c726e85b7" containerID="9ce149bc04a6fba1a6cb5e9273cbc389efc89e1e3e6563d1e314e619d97696a8" exitCode=0 Mar 18 09:21:51 crc kubenswrapper[4778]: I0318 09:21:51.283146 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cfadc08e-9e77-4b6f-be89-fc7c726e85b7","Type":"ContainerDied","Data":"9ce149bc04a6fba1a6cb5e9273cbc389efc89e1e3e6563d1e314e619d97696a8"} Mar 18 09:21:52 crc kubenswrapper[4778]: I0318 09:21:52.295892 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"49ce9560-3ee2-48d2-b016-a9feefb3a798","Type":"ContainerStarted","Data":"a00f6e5d8b705b53000302e3ddae59a971386417b12cf0ca55d037a9eebb3804"} Mar 18 09:21:52 crc kubenswrapper[4778]: I0318 09:21:52.299840 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cfadc08e-9e77-4b6f-be89-fc7c726e85b7","Type":"ContainerStarted","Data":"09d96c8246f4a209ab57e5f9ba16639880458ea9735ca72008c97464471540af"} Mar 18 09:21:52 crc kubenswrapper[4778]: I0318 09:21:52.302591 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"495e34ad-2f4d-46de-95e9-37b34a35f2d2","Type":"ContainerStarted","Data":"13d9ec5ff76e3f55f69c1fe037a0b107797202b066241284d892f1df1f0a163d"} Mar 18 09:21:52 crc kubenswrapper[4778]: I0318 09:21:52.306887 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"113a3fc7-40a1-46f9-b93f-01a34fcaf4aa","Type":"ContainerStarted","Data":"85ed145be914a5d426246153108571aab2c58751d7c73029061c8ae09d1bd58b"} Mar 18 09:21:52 crc kubenswrapper[4778]: I0318 09:21:52.325497 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=17.186349948 podStartE2EDuration="32.325470646s" podCreationTimestamp="2026-03-18 09:21:20 +0000 UTC" firstStartedPulling="2026-03-18 09:21:31.425409986 +0000 UTC m=+1158.000154826" lastFinishedPulling="2026-03-18 09:21:46.564530684 +0000 UTC m=+1173.139275524" observedRunningTime="2026-03-18 09:21:52.323183554 +0000 UTC m=+1178.897928424" watchObservedRunningTime="2026-03-18 09:21:52.325470646 +0000 UTC m=+1178.900215506" Mar 18 09:21:52 crc kubenswrapper[4778]: I0318 09:21:52.352434 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:52 crc kubenswrapper[4778]: I0318 09:21:52.352508 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:52 crc kubenswrapper[4778]: I0318 09:21:52.365921 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=9.593062114 podStartE2EDuration="25.365883358s" podCreationTimestamp="2026-03-18 09:21:27 +0000 UTC" firstStartedPulling="2026-03-18 09:21:35.845878733 +0000 UTC m=+1162.420623573" lastFinishedPulling="2026-03-18 09:21:51.618699957 +0000 UTC m=+1178.193444817" observedRunningTime="2026-03-18 09:21:52.358820036 +0000 UTC m=+1178.933564936" watchObservedRunningTime="2026-03-18 
09:21:52.365883358 +0000 UTC m=+1178.940628238" Mar 18 09:21:52 crc kubenswrapper[4778]: I0318 09:21:52.395858 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=6.675569015 podStartE2EDuration="22.395830405s" podCreationTimestamp="2026-03-18 09:21:30 +0000 UTC" firstStartedPulling="2026-03-18 09:21:35.898397736 +0000 UTC m=+1162.473142576" lastFinishedPulling="2026-03-18 09:21:51.618659126 +0000 UTC m=+1178.193403966" observedRunningTime="2026-03-18 09:21:52.38170728 +0000 UTC m=+1178.956452130" watchObservedRunningTime="2026-03-18 09:21:52.395830405 +0000 UTC m=+1178.970575275" Mar 18 09:21:52 crc kubenswrapper[4778]: I0318 09:21:52.420000 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=21.200039321 podStartE2EDuration="33.419975573s" podCreationTimestamp="2026-03-18 09:21:19 +0000 UTC" firstStartedPulling="2026-03-18 09:21:34.909182313 +0000 UTC m=+1161.483927193" lastFinishedPulling="2026-03-18 09:21:47.129118605 +0000 UTC m=+1173.703863445" observedRunningTime="2026-03-18 09:21:52.404742388 +0000 UTC m=+1178.979487268" watchObservedRunningTime="2026-03-18 09:21:52.419975573 +0000 UTC m=+1178.994720433" Mar 18 09:21:52 crc kubenswrapper[4778]: I0318 09:21:52.729487 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.056967 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.101514 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.113358 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:53 crc kubenswrapper[4778]: 
I0318 09:21:53.163260 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.316779 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.318088 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.363625 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.389892 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.648699 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-gdjvv"] Mar 18 09:21:53 crc kubenswrapper[4778]: E0318 09:21:53.649179 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b849baae-7043-48dd-be08-0edde88c7c69" containerName="init" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.649233 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b849baae-7043-48dd-be08-0edde88c7c69" containerName="init" Mar 18 09:21:53 crc kubenswrapper[4778]: E0318 09:21:53.649263 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b849baae-7043-48dd-be08-0edde88c7c69" containerName="dnsmasq-dns" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.649271 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b849baae-7043-48dd-be08-0edde88c7c69" containerName="dnsmasq-dns" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.649420 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b849baae-7043-48dd-be08-0edde88c7c69" containerName="dnsmasq-dns" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.650225 4778 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.652450 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.662068 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-gdjvv"] Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.727981 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-gdjvv\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.728034 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vtq4\" (UniqueName: \"kubernetes.io/projected/be59e02d-1aa2-4a26-b261-4f837e555f2d-kube-api-access-7vtq4\") pod \"dnsmasq-dns-6bc7876d45-gdjvv\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.728151 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-gdjvv\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.728183 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-config\") pod \"dnsmasq-dns-6bc7876d45-gdjvv\" (UID: 
\"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.835246 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vtq4\" (UniqueName: \"kubernetes.io/projected/be59e02d-1aa2-4a26-b261-4f837e555f2d-kube-api-access-7vtq4\") pod \"dnsmasq-dns-6bc7876d45-gdjvv\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.835621 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-gdjvv\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.835684 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-config\") pod \"dnsmasq-dns-6bc7876d45-gdjvv\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.835785 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-gdjvv\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.836703 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-gdjvv\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 
18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.836740 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-gdjvv\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.837298 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-config\") pod \"dnsmasq-dns-6bc7876d45-gdjvv\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.870807 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vtq4\" (UniqueName: \"kubernetes.io/projected/be59e02d-1aa2-4a26-b261-4f837e555f2d-kube-api-access-7vtq4\") pod \"dnsmasq-dns-6bc7876d45-gdjvv\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.878806 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-2ldk7"] Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.879887 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.884302 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.899615 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-2ldk7"] Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.936690 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-config\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.936746 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-ovs-rundir\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.936796 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqntm\" (UniqueName: \"kubernetes.io/projected/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-kube-api-access-qqntm\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.936828 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") 
" pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.936850 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-combined-ca-bundle\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.936864 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-ovn-rundir\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.961046 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-gdjvv"] Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.961649 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:53 crc kubenswrapper[4778]: I0318 09:21:53.998031 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-jt8gb"] Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.002447 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.006894 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.030925 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-jt8gb"] Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.043417 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqntm\" (UniqueName: \"kubernetes.io/projected/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-kube-api-access-qqntm\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.043506 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.043541 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-combined-ca-bundle\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.043565 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-ovn-rundir\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc 
kubenswrapper[4778]: I0318 09:21:54.043634 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-config\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.043671 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-ovs-rundir\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.044074 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-ovs-rundir\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.044659 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-ovn-rundir\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.045265 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-config\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.052465 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 18 
09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.053848 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.053919 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-combined-ca-bundle\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.053925 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.063994 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.064208 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.064349 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-q6qgm" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.064456 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.078610 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqntm\" (UniqueName: \"kubernetes.io/projected/2c6e8f7b-9b48-4814-9e73-fc9833c26cc9-kube-api-access-qqntm\") pod \"ovn-controller-metrics-2ldk7\" (UID: \"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9\") " 
pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.097547 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.146746 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.146785 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-scripts\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.146811 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-config\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.146831 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr7b2\" (UniqueName: \"kubernetes.io/projected/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-kube-api-access-xr7b2\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.146869 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-ovsdbserver-nb\") pod 
\"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.146888 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqgnd\" (UniqueName: \"kubernetes.io/projected/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-kube-api-access-qqgnd\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.146912 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.146940 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.146959 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-dns-svc\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.146980 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-config\") pod 
\"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.147012 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.147028 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.230816 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-2ldk7" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.248373 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.248625 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.248668 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.248692 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-scripts\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.248721 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-config\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.248747 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr7b2\" (UniqueName: \"kubernetes.io/projected/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-kube-api-access-xr7b2\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.248794 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.248815 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqgnd\" (UniqueName: \"kubernetes.io/projected/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-kube-api-access-qqgnd\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: 
\"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.248842 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.248872 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.248888 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-dns-svc\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.248913 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-config\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.249885 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-config\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.252004 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.252004 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-scripts\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.252465 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.252621 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-config\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.252997 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.253058 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-dns-svc\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: 
\"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.253708 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.261137 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.261988 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.273635 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr7b2\" (UniqueName: \"kubernetes.io/projected/ac3419bd-88ba-4b83-bd93-ad5638bc7fd0-kube-api-access-xr7b2\") pod \"ovn-northd-0\" (UID: \"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0\") " pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.276591 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqgnd\" (UniqueName: \"kubernetes.io/projected/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-kube-api-access-qqgnd\") pod \"dnsmasq-dns-8554648995-jt8gb\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") " pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 
09:21:54.435773 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.469649 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.494094 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-2ldk7"] Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.538468 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-gdjvv"] Mar 18 09:21:54 crc kubenswrapper[4778]: W0318 09:21:54.570425 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe59e02d_1aa2_4a26_b261_4f837e555f2d.slice/crio-886cae58e9b849d83802ee865131972663fcca4faa247e187de8d3fbac77a4a2 WatchSource:0}: Error finding container 886cae58e9b849d83802ee865131972663fcca4faa247e187de8d3fbac77a4a2: Status 404 returned error can't find the container with id 886cae58e9b849d83802ee865131972663fcca4faa247e187de8d3fbac77a4a2 Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.854031 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.958147 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-jt8gb"] Mar 18 09:21:54 crc kubenswrapper[4778]: W0318 09:21:54.960037 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd539dcf0_c5ce_4c6c_b367_e5c3d7dac5d5.slice/crio-95b556045a6555de0e6526548932d8f77419a56ad86f2e3afbe7000b4b9694c6 WatchSource:0}: Error finding container 95b556045a6555de0e6526548932d8f77419a56ad86f2e3afbe7000b4b9694c6: Status 404 returned error can't find the container with id 
95b556045a6555de0e6526548932d8f77419a56ad86f2e3afbe7000b4b9694c6 Mar 18 09:21:54 crc kubenswrapper[4778]: I0318 09:21:54.967542 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.361664 4778 generic.go:334] "Generic (PLEG): container finished" podID="d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" containerID="67e25fe46d44f377f33f04f4df4c1bf2f96243fad2d8a6135f426851e5ac8c58" exitCode=0 Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.361714 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-jt8gb" event={"ID":"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5","Type":"ContainerDied","Data":"67e25fe46d44f377f33f04f4df4c1bf2f96243fad2d8a6135f426851e5ac8c58"} Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.361944 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-jt8gb" event={"ID":"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5","Type":"ContainerStarted","Data":"95b556045a6555de0e6526548932d8f77419a56ad86f2e3afbe7000b4b9694c6"} Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.363795 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-2ldk7" event={"ID":"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9","Type":"ContainerStarted","Data":"ffb886d6d76f714c8705f070e5e3d961cd18629f5b516482dd69f4f1977b285f"} Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.363843 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-2ldk7" event={"ID":"2c6e8f7b-9b48-4814-9e73-fc9833c26cc9","Type":"ContainerStarted","Data":"9763fefc4f9d620f8f7669863a497795208b6498d97df55c051aa149c9720814"} Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.365319 4778 generic.go:334] "Generic (PLEG): container finished" podID="be59e02d-1aa2-4a26-b261-4f837e555f2d" containerID="de05fcadb2a51526a0dae5a25b8dfc1f6d83df10f0d77a3201f98be36fc9ca83" exitCode=0 Mar 18 09:21:55 
crc kubenswrapper[4778]: I0318 09:21:55.365371 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" event={"ID":"be59e02d-1aa2-4a26-b261-4f837e555f2d","Type":"ContainerDied","Data":"de05fcadb2a51526a0dae5a25b8dfc1f6d83df10f0d77a3201f98be36fc9ca83"} Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.365387 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" event={"ID":"be59e02d-1aa2-4a26-b261-4f837e555f2d","Type":"ContainerStarted","Data":"886cae58e9b849d83802ee865131972663fcca4faa247e187de8d3fbac77a4a2"} Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.366403 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0","Type":"ContainerStarted","Data":"967590a1812cba79ddc734fdd7b654f806eaeb73d07a4f802ae6196fe173eaec"} Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.400782 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-2ldk7" podStartSLOduration=2.400764 podStartE2EDuration="2.400764s" podCreationTimestamp="2026-03-18 09:21:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:21:55.395383275 +0000 UTC m=+1181.970128125" watchObservedRunningTime="2026-03-18 09:21:55.400764 +0000 UTC m=+1181.975508850" Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.690267 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.775841 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-ovsdbserver-sb\") pod \"be59e02d-1aa2-4a26-b261-4f837e555f2d\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.776687 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-config\") pod \"be59e02d-1aa2-4a26-b261-4f837e555f2d\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.776730 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vtq4\" (UniqueName: \"kubernetes.io/projected/be59e02d-1aa2-4a26-b261-4f837e555f2d-kube-api-access-7vtq4\") pod \"be59e02d-1aa2-4a26-b261-4f837e555f2d\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.776845 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-dns-svc\") pod \"be59e02d-1aa2-4a26-b261-4f837e555f2d\" (UID: \"be59e02d-1aa2-4a26-b261-4f837e555f2d\") " Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.782462 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be59e02d-1aa2-4a26-b261-4f837e555f2d-kube-api-access-7vtq4" (OuterVolumeSpecName: "kube-api-access-7vtq4") pod "be59e02d-1aa2-4a26-b261-4f837e555f2d" (UID: "be59e02d-1aa2-4a26-b261-4f837e555f2d"). InnerVolumeSpecName "kube-api-access-7vtq4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.795504 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "be59e02d-1aa2-4a26-b261-4f837e555f2d" (UID: "be59e02d-1aa2-4a26-b261-4f837e555f2d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.797980 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "be59e02d-1aa2-4a26-b261-4f837e555f2d" (UID: "be59e02d-1aa2-4a26-b261-4f837e555f2d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.806274 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-config" (OuterVolumeSpecName: "config") pod "be59e02d-1aa2-4a26-b261-4f837e555f2d" (UID: "be59e02d-1aa2-4a26-b261-4f837e555f2d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.878923 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.878959 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.878969 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vtq4\" (UniqueName: \"kubernetes.io/projected/be59e02d-1aa2-4a26-b261-4f837e555f2d-kube-api-access-7vtq4\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:55 crc kubenswrapper[4778]: I0318 09:21:55.878983 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be59e02d-1aa2-4a26-b261-4f837e555f2d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:56 crc kubenswrapper[4778]: I0318 09:21:56.376140 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0","Type":"ContainerStarted","Data":"80586af0325ebbab3b0342dd081ce20efed8640a270588eda509775f80d4ba92"} Mar 18 09:21:56 crc kubenswrapper[4778]: I0318 09:21:56.378661 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-jt8gb" event={"ID":"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5","Type":"ContainerStarted","Data":"0cb5262a9d3d057ca53ec602e21c523692c8679333829bacc24282be5307d258"} Mar 18 09:21:56 crc kubenswrapper[4778]: I0318 09:21:56.378818 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:21:56 crc kubenswrapper[4778]: I0318 09:21:56.380070 4778 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" Mar 18 09:21:56 crc kubenswrapper[4778]: I0318 09:21:56.380107 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-gdjvv" event={"ID":"be59e02d-1aa2-4a26-b261-4f837e555f2d","Type":"ContainerDied","Data":"886cae58e9b849d83802ee865131972663fcca4faa247e187de8d3fbac77a4a2"} Mar 18 09:21:56 crc kubenswrapper[4778]: I0318 09:21:56.380132 4778 scope.go:117] "RemoveContainer" containerID="de05fcadb2a51526a0dae5a25b8dfc1f6d83df10f0d77a3201f98be36fc9ca83" Mar 18 09:21:56 crc kubenswrapper[4778]: I0318 09:21:56.409556 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-jt8gb" podStartSLOduration=3.409532168 podStartE2EDuration="3.409532168s" podCreationTimestamp="2026-03-18 09:21:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:21:56.404433078 +0000 UTC m=+1182.979177938" watchObservedRunningTime="2026-03-18 09:21:56.409532168 +0000 UTC m=+1182.984277038" Mar 18 09:21:56 crc kubenswrapper[4778]: I0318 09:21:56.466315 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-gdjvv"] Mar 18 09:21:56 crc kubenswrapper[4778]: I0318 09:21:56.470149 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-gdjvv"] Mar 18 09:21:57 crc kubenswrapper[4778]: I0318 09:21:57.389633 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ac3419bd-88ba-4b83-bd93-ad5638bc7fd0","Type":"ContainerStarted","Data":"9cb9f8c938b08681165a7e0ffa0dd54d6c223ddeec4f61e609b3e802c5cd34cc"} Mar 18 09:21:57 crc kubenswrapper[4778]: I0318 09:21:57.389828 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 18 09:21:57 crc kubenswrapper[4778]: I0318 
09:21:57.422618 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.289491122 podStartE2EDuration="3.42260298s" podCreationTimestamp="2026-03-18 09:21:54 +0000 UTC" firstStartedPulling="2026-03-18 09:21:54.971127442 +0000 UTC m=+1181.545872282" lastFinishedPulling="2026-03-18 09:21:56.1042393 +0000 UTC m=+1182.678984140" observedRunningTime="2026-03-18 09:21:57.418732395 +0000 UTC m=+1183.993477245" watchObservedRunningTime="2026-03-18 09:21:57.42260298 +0000 UTC m=+1183.997347820" Mar 18 09:21:58 crc kubenswrapper[4778]: I0318 09:21:58.195737 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be59e02d-1aa2-4a26-b261-4f837e555f2d" path="/var/lib/kubelet/pods/be59e02d-1aa2-4a26-b261-4f837e555f2d/volumes" Mar 18 09:21:58 crc kubenswrapper[4778]: I0318 09:21:58.488045 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 18 09:21:58 crc kubenswrapper[4778]: I0318 09:21:58.602992 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.141712 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563762-nk868"] Mar 18 09:22:00 crc kubenswrapper[4778]: E0318 09:22:00.142414 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be59e02d-1aa2-4a26-b261-4f837e555f2d" containerName="init" Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.142430 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="be59e02d-1aa2-4a26-b261-4f837e555f2d" containerName="init" Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.142605 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="be59e02d-1aa2-4a26-b261-4f837e555f2d" containerName="init" Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.143130 4778 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563762-nk868" Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.146532 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.146746 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.147619 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.147708 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.158737 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.160878 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563762-nk868"] Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.269839 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhg95\" (UniqueName: \"kubernetes.io/projected/7bcf145e-ae6a-4674-9f79-b6486ec2fa9d-kube-api-access-dhg95\") pod \"auto-csr-approver-29563762-nk868\" (UID: \"7bcf145e-ae6a-4674-9f79-b6486ec2fa9d\") " pod="openshift-infra/auto-csr-approver-29563762-nk868" Mar 18 
09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.372289 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhg95\" (UniqueName: \"kubernetes.io/projected/7bcf145e-ae6a-4674-9f79-b6486ec2fa9d-kube-api-access-dhg95\") pod \"auto-csr-approver-29563762-nk868\" (UID: \"7bcf145e-ae6a-4674-9f79-b6486ec2fa9d\") " pod="openshift-infra/auto-csr-approver-29563762-nk868" Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.396734 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhg95\" (UniqueName: \"kubernetes.io/projected/7bcf145e-ae6a-4674-9f79-b6486ec2fa9d-kube-api-access-dhg95\") pod \"auto-csr-approver-29563762-nk868\" (UID: \"7bcf145e-ae6a-4674-9f79-b6486ec2fa9d\") " pod="openshift-infra/auto-csr-approver-29563762-nk868" Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.473012 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563762-nk868" Mar 18 09:22:00 crc kubenswrapper[4778]: I0318 09:22:00.986814 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563762-nk868"] Mar 18 09:22:00 crc kubenswrapper[4778]: W0318 09:22:00.991105 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bcf145e_ae6a_4674_9f79_b6486ec2fa9d.slice/crio-0d4f08f11dfe1809584f3cab0e3d071c5c19f66ebf3a573a7b3edd6c9c3b4163 WatchSource:0}: Error finding container 0d4f08f11dfe1809584f3cab0e3d071c5c19f66ebf3a573a7b3edd6c9c3b4163: Status 404 returned error can't find the container with id 0d4f08f11dfe1809584f3cab0e3d071c5c19f66ebf3a573a7b3edd6c9c3b4163 Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.050054 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-nnzs2"] Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.051559 4778 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/root-account-create-update-nnzs2" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.054148 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.067463 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-nnzs2"] Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.082381 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.082434 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.192512 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tmwp\" (UniqueName: \"kubernetes.io/projected/ae95589d-d3fb-4254-9fd0-f59203e0e927-kube-api-access-8tmwp\") pod \"root-account-create-update-nnzs2\" (UID: \"ae95589d-d3fb-4254-9fd0-f59203e0e927\") " pod="openstack/root-account-create-update-nnzs2" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.192895 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae95589d-d3fb-4254-9fd0-f59203e0e927-operator-scripts\") pod \"root-account-create-update-nnzs2\" (UID: \"ae95589d-d3fb-4254-9fd0-f59203e0e927\") " pod="openstack/root-account-create-update-nnzs2" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.197660 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.294951 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tmwp\" (UniqueName: 
\"kubernetes.io/projected/ae95589d-d3fb-4254-9fd0-f59203e0e927-kube-api-access-8tmwp\") pod \"root-account-create-update-nnzs2\" (UID: \"ae95589d-d3fb-4254-9fd0-f59203e0e927\") " pod="openstack/root-account-create-update-nnzs2" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.295042 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae95589d-d3fb-4254-9fd0-f59203e0e927-operator-scripts\") pod \"root-account-create-update-nnzs2\" (UID: \"ae95589d-d3fb-4254-9fd0-f59203e0e927\") " pod="openstack/root-account-create-update-nnzs2" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.296297 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae95589d-d3fb-4254-9fd0-f59203e0e927-operator-scripts\") pod \"root-account-create-update-nnzs2\" (UID: \"ae95589d-d3fb-4254-9fd0-f59203e0e927\") " pod="openstack/root-account-create-update-nnzs2" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.322697 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tmwp\" (UniqueName: \"kubernetes.io/projected/ae95589d-d3fb-4254-9fd0-f59203e0e927-kube-api-access-8tmwp\") pod \"root-account-create-update-nnzs2\" (UID: \"ae95589d-d3fb-4254-9fd0-f59203e0e927\") " pod="openstack/root-account-create-update-nnzs2" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.394898 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-nnzs2" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.428987 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563762-nk868" event={"ID":"7bcf145e-ae6a-4674-9f79-b6486ec2fa9d","Type":"ContainerStarted","Data":"0d4f08f11dfe1809584f3cab0e3d071c5c19f66ebf3a573a7b3edd6c9c3b4163"} Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.524999 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 18 09:22:01 crc kubenswrapper[4778]: I0318 09:22:01.879899 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-nnzs2"] Mar 18 09:22:02 crc kubenswrapper[4778]: I0318 09:22:02.437862 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nnzs2" event={"ID":"ae95589d-d3fb-4254-9fd0-f59203e0e927","Type":"ContainerStarted","Data":"a2153b59b29a56867da95a8bfa79fff53d74a9d7fba4ed7aa6073d0c04ef364a"} Mar 18 09:22:02 crc kubenswrapper[4778]: I0318 09:22:02.917755 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-dz4jc"] Mar 18 09:22:02 crc kubenswrapper[4778]: I0318 09:22:02.919324 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dz4jc" Mar 18 09:22:02 crc kubenswrapper[4778]: I0318 09:22:02.932665 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-dz4jc"] Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.064593 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-5387-account-create-update-wm5k5"] Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.066101 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5387-account-create-update-wm5k5" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.069569 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.073726 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5387-account-create-update-wm5k5"] Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.135493 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9222d9a-6507-4c32-9234-2c1c2b27a11e-operator-scripts\") pod \"glance-db-create-dz4jc\" (UID: \"f9222d9a-6507-4c32-9234-2c1c2b27a11e\") " pod="openstack/glance-db-create-dz4jc" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.135584 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc2sm\" (UniqueName: \"kubernetes.io/projected/f9222d9a-6507-4c32-9234-2c1c2b27a11e-kube-api-access-bc2sm\") pod \"glance-db-create-dz4jc\" (UID: \"f9222d9a-6507-4c32-9234-2c1c2b27a11e\") " pod="openstack/glance-db-create-dz4jc" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.236769 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9222d9a-6507-4c32-9234-2c1c2b27a11e-operator-scripts\") pod \"glance-db-create-dz4jc\" (UID: \"f9222d9a-6507-4c32-9234-2c1c2b27a11e\") " pod="openstack/glance-db-create-dz4jc" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.236849 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m6wt\" (UniqueName: \"kubernetes.io/projected/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9-kube-api-access-8m6wt\") pod \"glance-5387-account-create-update-wm5k5\" (UID: \"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9\") " 
pod="openstack/glance-5387-account-create-update-wm5k5" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.236887 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9-operator-scripts\") pod \"glance-5387-account-create-update-wm5k5\" (UID: \"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9\") " pod="openstack/glance-5387-account-create-update-wm5k5" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.236942 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc2sm\" (UniqueName: \"kubernetes.io/projected/f9222d9a-6507-4c32-9234-2c1c2b27a11e-kube-api-access-bc2sm\") pod \"glance-db-create-dz4jc\" (UID: \"f9222d9a-6507-4c32-9234-2c1c2b27a11e\") " pod="openstack/glance-db-create-dz4jc" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.238237 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9222d9a-6507-4c32-9234-2c1c2b27a11e-operator-scripts\") pod \"glance-db-create-dz4jc\" (UID: \"f9222d9a-6507-4c32-9234-2c1c2b27a11e\") " pod="openstack/glance-db-create-dz4jc" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.260102 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc2sm\" (UniqueName: \"kubernetes.io/projected/f9222d9a-6507-4c32-9234-2c1c2b27a11e-kube-api-access-bc2sm\") pod \"glance-db-create-dz4jc\" (UID: \"f9222d9a-6507-4c32-9234-2c1c2b27a11e\") " pod="openstack/glance-db-create-dz4jc" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.265352 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-dz4jc" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.339055 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m6wt\" (UniqueName: \"kubernetes.io/projected/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9-kube-api-access-8m6wt\") pod \"glance-5387-account-create-update-wm5k5\" (UID: \"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9\") " pod="openstack/glance-5387-account-create-update-wm5k5" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.339127 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9-operator-scripts\") pod \"glance-5387-account-create-update-wm5k5\" (UID: \"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9\") " pod="openstack/glance-5387-account-create-update-wm5k5" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.340104 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9-operator-scripts\") pod \"glance-5387-account-create-update-wm5k5\" (UID: \"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9\") " pod="openstack/glance-5387-account-create-update-wm5k5" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.379482 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m6wt\" (UniqueName: \"kubernetes.io/projected/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9-kube-api-access-8m6wt\") pod \"glance-5387-account-create-update-wm5k5\" (UID: \"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9\") " pod="openstack/glance-5387-account-create-update-wm5k5" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.394111 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5387-account-create-update-wm5k5" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.745401 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-dz4jc"] Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.759302 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-l6vl8"] Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.760819 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-l6vl8" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.785988 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-l6vl8"] Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.853286 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfgmc\" (UniqueName: \"kubernetes.io/projected/51a820a6-6a95-4ab7-a9d8-6649fe45464a-kube-api-access-jfgmc\") pod \"keystone-db-create-l6vl8\" (UID: \"51a820a6-6a95-4ab7-a9d8-6649fe45464a\") " pod="openstack/keystone-db-create-l6vl8" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.853417 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51a820a6-6a95-4ab7-a9d8-6649fe45464a-operator-scripts\") pod \"keystone-db-create-l6vl8\" (UID: \"51a820a6-6a95-4ab7-a9d8-6649fe45464a\") " pod="openstack/keystone-db-create-l6vl8" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.855189 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-a222-account-create-update-qr82t"] Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.856648 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-a222-account-create-update-qr82t" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.864498 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.869985 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-a222-account-create-update-qr82t"] Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.908606 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5387-account-create-update-wm5k5"] Mar 18 09:22:03 crc kubenswrapper[4778]: W0318 09:22:03.918360 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod445fcacb_d2c9_4892_89b5_4b2b6e54ebc9.slice/crio-3f56c6b693b7b9b071967115241709056ce764bedf268186ba682ea2bb87e798 WatchSource:0}: Error finding container 3f56c6b693b7b9b071967115241709056ce764bedf268186ba682ea2bb87e798: Status 404 returned error can't find the container with id 3f56c6b693b7b9b071967115241709056ce764bedf268186ba682ea2bb87e798 Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.954886 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51a820a6-6a95-4ab7-a9d8-6649fe45464a-operator-scripts\") pod \"keystone-db-create-l6vl8\" (UID: \"51a820a6-6a95-4ab7-a9d8-6649fe45464a\") " pod="openstack/keystone-db-create-l6vl8" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.954966 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfgmc\" (UniqueName: \"kubernetes.io/projected/51a820a6-6a95-4ab7-a9d8-6649fe45464a-kube-api-access-jfgmc\") pod \"keystone-db-create-l6vl8\" (UID: \"51a820a6-6a95-4ab7-a9d8-6649fe45464a\") " pod="openstack/keystone-db-create-l6vl8" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.956495 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51a820a6-6a95-4ab7-a9d8-6649fe45464a-operator-scripts\") pod \"keystone-db-create-l6vl8\" (UID: \"51a820a6-6a95-4ab7-a9d8-6649fe45464a\") " pod="openstack/keystone-db-create-l6vl8" Mar 18 09:22:03 crc kubenswrapper[4778]: I0318 09:22:03.976703 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfgmc\" (UniqueName: \"kubernetes.io/projected/51a820a6-6a95-4ab7-a9d8-6649fe45464a-kube-api-access-jfgmc\") pod \"keystone-db-create-l6vl8\" (UID: \"51a820a6-6a95-4ab7-a9d8-6649fe45464a\") " pod="openstack/keystone-db-create-l6vl8" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.050024 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-kj8ww"] Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.051516 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kj8ww" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.056001 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg6vw\" (UniqueName: \"kubernetes.io/projected/c7ae14ca-efde-42ba-8edf-7cc34dc31036-kube-api-access-fg6vw\") pod \"keystone-a222-account-create-update-qr82t\" (UID: \"c7ae14ca-efde-42ba-8edf-7cc34dc31036\") " pod="openstack/keystone-a222-account-create-update-qr82t" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.056060 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7ae14ca-efde-42ba-8edf-7cc34dc31036-operator-scripts\") pod \"keystone-a222-account-create-update-qr82t\" (UID: \"c7ae14ca-efde-42ba-8edf-7cc34dc31036\") " pod="openstack/keystone-a222-account-create-update-qr82t" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.063896 4778 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-0a46-account-create-update-phb5p"] Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.066333 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0a46-account-create-update-phb5p" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.075506 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kj8ww"] Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.085265 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.086111 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-l6vl8" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.086448 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0a46-account-create-update-phb5p"] Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.161443 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p78gv\" (UniqueName: \"kubernetes.io/projected/24412394-390b-461c-9d18-617eba706adc-kube-api-access-p78gv\") pod \"placement-0a46-account-create-update-phb5p\" (UID: \"24412394-390b-461c-9d18-617eba706adc\") " pod="openstack/placement-0a46-account-create-update-phb5p" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.161710 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be311af4-91f5-417e-971b-c9158576ca97-operator-scripts\") pod \"placement-db-create-kj8ww\" (UID: \"be311af4-91f5-417e-971b-c9158576ca97\") " pod="openstack/placement-db-create-kj8ww" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.161790 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-8bfb4\" (UniqueName: \"kubernetes.io/projected/be311af4-91f5-417e-971b-c9158576ca97-kube-api-access-8bfb4\") pod \"placement-db-create-kj8ww\" (UID: \"be311af4-91f5-417e-971b-c9158576ca97\") " pod="openstack/placement-db-create-kj8ww" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.162057 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg6vw\" (UniqueName: \"kubernetes.io/projected/c7ae14ca-efde-42ba-8edf-7cc34dc31036-kube-api-access-fg6vw\") pod \"keystone-a222-account-create-update-qr82t\" (UID: \"c7ae14ca-efde-42ba-8edf-7cc34dc31036\") " pod="openstack/keystone-a222-account-create-update-qr82t" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.162217 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7ae14ca-efde-42ba-8edf-7cc34dc31036-operator-scripts\") pod \"keystone-a222-account-create-update-qr82t\" (UID: \"c7ae14ca-efde-42ba-8edf-7cc34dc31036\") " pod="openstack/keystone-a222-account-create-update-qr82t" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.162302 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24412394-390b-461c-9d18-617eba706adc-operator-scripts\") pod \"placement-0a46-account-create-update-phb5p\" (UID: \"24412394-390b-461c-9d18-617eba706adc\") " pod="openstack/placement-0a46-account-create-update-phb5p" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.163407 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7ae14ca-efde-42ba-8edf-7cc34dc31036-operator-scripts\") pod \"keystone-a222-account-create-update-qr82t\" (UID: \"c7ae14ca-efde-42ba-8edf-7cc34dc31036\") " pod="openstack/keystone-a222-account-create-update-qr82t" Mar 18 09:22:04 crc kubenswrapper[4778]: 
I0318 09:22:04.188793 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg6vw\" (UniqueName: \"kubernetes.io/projected/c7ae14ca-efde-42ba-8edf-7cc34dc31036-kube-api-access-fg6vw\") pod \"keystone-a222-account-create-update-qr82t\" (UID: \"c7ae14ca-efde-42ba-8edf-7cc34dc31036\") " pod="openstack/keystone-a222-account-create-update-qr82t" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.198829 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a222-account-create-update-qr82t" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.266118 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be311af4-91f5-417e-971b-c9158576ca97-operator-scripts\") pod \"placement-db-create-kj8ww\" (UID: \"be311af4-91f5-417e-971b-c9158576ca97\") " pod="openstack/placement-db-create-kj8ww" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.268063 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bfb4\" (UniqueName: \"kubernetes.io/projected/be311af4-91f5-417e-971b-c9158576ca97-kube-api-access-8bfb4\") pod \"placement-db-create-kj8ww\" (UID: \"be311af4-91f5-417e-971b-c9158576ca97\") " pod="openstack/placement-db-create-kj8ww" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.268313 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24412394-390b-461c-9d18-617eba706adc-operator-scripts\") pod \"placement-0a46-account-create-update-phb5p\" (UID: \"24412394-390b-461c-9d18-617eba706adc\") " pod="openstack/placement-0a46-account-create-update-phb5p" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.268610 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p78gv\" (UniqueName: 
\"kubernetes.io/projected/24412394-390b-461c-9d18-617eba706adc-kube-api-access-p78gv\") pod \"placement-0a46-account-create-update-phb5p\" (UID: \"24412394-390b-461c-9d18-617eba706adc\") " pod="openstack/placement-0a46-account-create-update-phb5p" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.269609 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be311af4-91f5-417e-971b-c9158576ca97-operator-scripts\") pod \"placement-db-create-kj8ww\" (UID: \"be311af4-91f5-417e-971b-c9158576ca97\") " pod="openstack/placement-db-create-kj8ww" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.270220 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24412394-390b-461c-9d18-617eba706adc-operator-scripts\") pod \"placement-0a46-account-create-update-phb5p\" (UID: \"24412394-390b-461c-9d18-617eba706adc\") " pod="openstack/placement-0a46-account-create-update-phb5p" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.304248 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bfb4\" (UniqueName: \"kubernetes.io/projected/be311af4-91f5-417e-971b-c9158576ca97-kube-api-access-8bfb4\") pod \"placement-db-create-kj8ww\" (UID: \"be311af4-91f5-417e-971b-c9158576ca97\") " pod="openstack/placement-db-create-kj8ww" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.304996 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p78gv\" (UniqueName: \"kubernetes.io/projected/24412394-390b-461c-9d18-617eba706adc-kube-api-access-p78gv\") pod \"placement-0a46-account-create-update-phb5p\" (UID: \"24412394-390b-461c-9d18-617eba706adc\") " pod="openstack/placement-0a46-account-create-update-phb5p" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.391912 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-kj8ww" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.416099 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0a46-account-create-update-phb5p" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.437390 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-jt8gb" Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.496780 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xvr5c"] Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.497641 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" podUID="124c0069-debd-459c-9d66-f38d9d096996" containerName="dnsmasq-dns" containerID="cri-o://1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e" gracePeriod=10 Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.520235 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563762-nk868" event={"ID":"7bcf145e-ae6a-4674-9f79-b6486ec2fa9d","Type":"ContainerStarted","Data":"6d2d7a8a11e12ed4124ccde5834d93c0bc78bbc0a9c88b791847bb709dbbb116"} Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.531160 4778 generic.go:334] "Generic (PLEG): container finished" podID="f9222d9a-6507-4c32-9234-2c1c2b27a11e" containerID="c677c641e634b2b60d1bae546bbf8e8d9cc5553eb522dd4d67a26e72bf3f0752" exitCode=0 Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.531281 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dz4jc" event={"ID":"f9222d9a-6507-4c32-9234-2c1c2b27a11e","Type":"ContainerDied","Data":"c677c641e634b2b60d1bae546bbf8e8d9cc5553eb522dd4d67a26e72bf3f0752"} Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.531321 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-create-dz4jc" event={"ID":"f9222d9a-6507-4c32-9234-2c1c2b27a11e","Type":"ContainerStarted","Data":"24b8cc6fbc48de499f68ecf860e8b6053ccab335ce1cff5b8a45ee911c1d9a90"} Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.562298 4778 generic.go:334] "Generic (PLEG): container finished" podID="ae95589d-d3fb-4254-9fd0-f59203e0e927" containerID="1fd591e2b660a21c69fd30f836286a21f17c533f4da08c01dd7acbea44c0d5f9" exitCode=0 Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.562924 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nnzs2" event={"ID":"ae95589d-d3fb-4254-9fd0-f59203e0e927","Type":"ContainerDied","Data":"1fd591e2b660a21c69fd30f836286a21f17c533f4da08c01dd7acbea44c0d5f9"} Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.577980 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5387-account-create-update-wm5k5" event={"ID":"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9","Type":"ContainerStarted","Data":"397a643cfe9ac0a1d5786dfe10182c0fd656474f01a6f126523a52404ea544a7"} Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.578129 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5387-account-create-update-wm5k5" event={"ID":"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9","Type":"ContainerStarted","Data":"3f56c6b693b7b9b071967115241709056ce764bedf268186ba682ea2bb87e798"} Mar 18 09:22:04 crc kubenswrapper[4778]: I0318 09:22:04.663493 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-l6vl8"] Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.057879 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-a222-account-create-update-qr82t"] Mar 18 09:22:05 crc kubenswrapper[4778]: W0318 09:22:05.087190 4778 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7ae14ca_efde_42ba_8edf_7cc34dc31036.slice/crio-d1026be7b9a9e421d1dd3fdd40e0dad9e9c0eb2a83f1b8aab937d30a4670c112 WatchSource:0}: Error finding container d1026be7b9a9e421d1dd3fdd40e0dad9e9c0eb2a83f1b8aab937d30a4670c112: Status 404 returned error can't find the container with id d1026be7b9a9e421d1dd3fdd40e0dad9e9c0eb2a83f1b8aab937d30a4670c112 Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.214947 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.231019 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kj8ww"] Mar 18 09:22:05 crc kubenswrapper[4778]: W0318 09:22:05.292988 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24412394_390b_461c_9d18_617eba706adc.slice/crio-a8cdfda576bdb60315d008b13d4ba643b525cf074d3f540a2ce9321334d9fb29 WatchSource:0}: Error finding container a8cdfda576bdb60315d008b13d4ba643b525cf074d3f540a2ce9321334d9fb29: Status 404 returned error can't find the container with id a8cdfda576bdb60315d008b13d4ba643b525cf074d3f540a2ce9321334d9fb29 Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.295002 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0a46-account-create-update-phb5p"] Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.316880 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chwdv\" (UniqueName: \"kubernetes.io/projected/124c0069-debd-459c-9d66-f38d9d096996-kube-api-access-chwdv\") pod \"124c0069-debd-459c-9d66-f38d9d096996\" (UID: \"124c0069-debd-459c-9d66-f38d9d096996\") " Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.316976 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/124c0069-debd-459c-9d66-f38d9d096996-config\") pod \"124c0069-debd-459c-9d66-f38d9d096996\" (UID: \"124c0069-debd-459c-9d66-f38d9d096996\") " Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.317073 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/124c0069-debd-459c-9d66-f38d9d096996-dns-svc\") pod \"124c0069-debd-459c-9d66-f38d9d096996\" (UID: \"124c0069-debd-459c-9d66-f38d9d096996\") " Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.334745 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/124c0069-debd-459c-9d66-f38d9d096996-kube-api-access-chwdv" (OuterVolumeSpecName: "kube-api-access-chwdv") pod "124c0069-debd-459c-9d66-f38d9d096996" (UID: "124c0069-debd-459c-9d66-f38d9d096996"). InnerVolumeSpecName "kube-api-access-chwdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.367504 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/124c0069-debd-459c-9d66-f38d9d096996-config" (OuterVolumeSpecName: "config") pod "124c0069-debd-459c-9d66-f38d9d096996" (UID: "124c0069-debd-459c-9d66-f38d9d096996"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.368757 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/124c0069-debd-459c-9d66-f38d9d096996-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "124c0069-debd-459c-9d66-f38d9d096996" (UID: "124c0069-debd-459c-9d66-f38d9d096996"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.418985 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chwdv\" (UniqueName: \"kubernetes.io/projected/124c0069-debd-459c-9d66-f38d9d096996-kube-api-access-chwdv\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.419038 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/124c0069-debd-459c-9d66-f38d9d096996-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.419051 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/124c0069-debd-459c-9d66-f38d9d096996-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.589228 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-l6vl8" event={"ID":"51a820a6-6a95-4ab7-a9d8-6649fe45464a","Type":"ContainerDied","Data":"2d1134d737bb1ad4d1096a8735192a53e75e70709be8071f894f8def68f8db65"} Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.589052 4778 generic.go:334] "Generic (PLEG): container finished" podID="51a820a6-6a95-4ab7-a9d8-6649fe45464a" containerID="2d1134d737bb1ad4d1096a8735192a53e75e70709be8071f894f8def68f8db65" exitCode=0 Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.590350 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-l6vl8" event={"ID":"51a820a6-6a95-4ab7-a9d8-6649fe45464a","Type":"ContainerStarted","Data":"c97c18a12d3c090de835642354d9ea6188d81c28ca151b3eb5f41951eda7e87c"} Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.594272 4778 generic.go:334] "Generic (PLEG): container finished" podID="7bcf145e-ae6a-4674-9f79-b6486ec2fa9d" containerID="6d2d7a8a11e12ed4124ccde5834d93c0bc78bbc0a9c88b791847bb709dbbb116" exitCode=0 Mar 18 09:22:05 crc 
kubenswrapper[4778]: I0318 09:22:05.594711 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563762-nk868" event={"ID":"7bcf145e-ae6a-4674-9f79-b6486ec2fa9d","Type":"ContainerDied","Data":"6d2d7a8a11e12ed4124ccde5834d93c0bc78bbc0a9c88b791847bb709dbbb116"} Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.597072 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0a46-account-create-update-phb5p" event={"ID":"24412394-390b-461c-9d18-617eba706adc","Type":"ContainerStarted","Data":"a8cdfda576bdb60315d008b13d4ba643b525cf074d3f540a2ce9321334d9fb29"} Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.604940 4778 generic.go:334] "Generic (PLEG): container finished" podID="124c0069-debd-459c-9d66-f38d9d096996" containerID="1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e" exitCode=0 Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.605033 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" event={"ID":"124c0069-debd-459c-9d66-f38d9d096996","Type":"ContainerDied","Data":"1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e"} Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.605071 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" event={"ID":"124c0069-debd-459c-9d66-f38d9d096996","Type":"ContainerDied","Data":"81379cb4a6ab1c31c5025b2cadc823fb3d4cb38e6f779d5cd8007da3c790b69a"} Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.605106 4778 scope.go:117] "RemoveContainer" containerID="1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.605793 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xvr5c" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.616478 4778 generic.go:334] "Generic (PLEG): container finished" podID="c7ae14ca-efde-42ba-8edf-7cc34dc31036" containerID="70fd7ebf08e80da75227c830c21c112cbd85c345b44bad8cd8c81cd3f4b7fd7e" exitCode=0 Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.616555 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a222-account-create-update-qr82t" event={"ID":"c7ae14ca-efde-42ba-8edf-7cc34dc31036","Type":"ContainerDied","Data":"70fd7ebf08e80da75227c830c21c112cbd85c345b44bad8cd8c81cd3f4b7fd7e"} Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.616590 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a222-account-create-update-qr82t" event={"ID":"c7ae14ca-efde-42ba-8edf-7cc34dc31036","Type":"ContainerStarted","Data":"d1026be7b9a9e421d1dd3fdd40e0dad9e9c0eb2a83f1b8aab937d30a4670c112"} Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.618703 4778 generic.go:334] "Generic (PLEG): container finished" podID="445fcacb-d2c9-4892-89b5-4b2b6e54ebc9" containerID="397a643cfe9ac0a1d5786dfe10182c0fd656474f01a6f126523a52404ea544a7" exitCode=0 Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.618801 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5387-account-create-update-wm5k5" event={"ID":"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9","Type":"ContainerDied","Data":"397a643cfe9ac0a1d5786dfe10182c0fd656474f01a6f126523a52404ea544a7"} Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.623359 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kj8ww" event={"ID":"be311af4-91f5-417e-971b-c9158576ca97","Type":"ContainerStarted","Data":"9b6fa295a9bfec83f890b1dd7210afd8d93f1d1f4da240bfbc29bf8af750edd0"} Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.623408 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-db-create-kj8ww" event={"ID":"be311af4-91f5-417e-971b-c9158576ca97","Type":"ContainerStarted","Data":"ea80e6972d772d1aea12da8304c45c18840c5828e5180b99ae338ff0921eae08"} Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.651783 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-0a46-account-create-update-phb5p" podStartSLOduration=1.651752208 podStartE2EDuration="1.651752208s" podCreationTimestamp="2026-03-18 09:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:22:05.629135051 +0000 UTC m=+1192.203879911" watchObservedRunningTime="2026-03-18 09:22:05.651752208 +0000 UTC m=+1192.226497048" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.653819 4778 scope.go:117] "RemoveContainer" containerID="5ed662d305e0e956be34aca979fbb334e97403c6f49b4e0abe4d26246e519e1c" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.675419 4778 scope.go:117] "RemoveContainer" containerID="1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e" Mar 18 09:22:05 crc kubenswrapper[4778]: E0318 09:22:05.679318 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e\": container with ID starting with 1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e not found: ID does not exist" containerID="1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.679358 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e"} err="failed to get container status \"1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e\": rpc error: code = NotFound desc = could not find container 
\"1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e\": container with ID starting with 1f7b277ad1603335218dda3fb1d6ee415b4abfd18966fe2e93a86b878c57723e not found: ID does not exist" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.679382 4778 scope.go:117] "RemoveContainer" containerID="5ed662d305e0e956be34aca979fbb334e97403c6f49b4e0abe4d26246e519e1c" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.686463 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xvr5c"] Mar 18 09:22:05 crc kubenswrapper[4778]: E0318 09:22:05.695808 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ed662d305e0e956be34aca979fbb334e97403c6f49b4e0abe4d26246e519e1c\": container with ID starting with 5ed662d305e0e956be34aca979fbb334e97403c6f49b4e0abe4d26246e519e1c not found: ID does not exist" containerID="5ed662d305e0e956be34aca979fbb334e97403c6f49b4e0abe4d26246e519e1c" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.695860 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ed662d305e0e956be34aca979fbb334e97403c6f49b4e0abe4d26246e519e1c"} err="failed to get container status \"5ed662d305e0e956be34aca979fbb334e97403c6f49b4e0abe4d26246e519e1c\": rpc error: code = NotFound desc = could not find container \"5ed662d305e0e956be34aca979fbb334e97403c6f49b4e0abe4d26246e519e1c\": container with ID starting with 5ed662d305e0e956be34aca979fbb334e97403c6f49b4e0abe4d26246e519e1c not found: ID does not exist" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.708127 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xvr5c"] Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.712399 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-kj8ww" podStartSLOduration=1.712320639 
podStartE2EDuration="1.712320639s" podCreationTimestamp="2026-03-18 09:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:22:05.675952887 +0000 UTC m=+1192.250697737" watchObservedRunningTime="2026-03-18 09:22:05.712320639 +0000 UTC m=+1192.287065479" Mar 18 09:22:05 crc kubenswrapper[4778]: I0318 09:22:05.948029 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563762-nk868" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.027784 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhg95\" (UniqueName: \"kubernetes.io/projected/7bcf145e-ae6a-4674-9f79-b6486ec2fa9d-kube-api-access-dhg95\") pod \"7bcf145e-ae6a-4674-9f79-b6486ec2fa9d\" (UID: \"7bcf145e-ae6a-4674-9f79-b6486ec2fa9d\") " Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.033925 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bcf145e-ae6a-4674-9f79-b6486ec2fa9d-kube-api-access-dhg95" (OuterVolumeSpecName: "kube-api-access-dhg95") pod "7bcf145e-ae6a-4674-9f79-b6486ec2fa9d" (UID: "7bcf145e-ae6a-4674-9f79-b6486ec2fa9d"). InnerVolumeSpecName "kube-api-access-dhg95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.129325 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhg95\" (UniqueName: \"kubernetes.io/projected/7bcf145e-ae6a-4674-9f79-b6486ec2fa9d-kube-api-access-dhg95\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.154884 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5387-account-create-update-wm5k5" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.165805 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-dz4jc" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.181460 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nnzs2" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.204172 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="124c0069-debd-459c-9d66-f38d9d096996" path="/var/lib/kubelet/pods/124c0069-debd-459c-9d66-f38d9d096996/volumes" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.232615 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc2sm\" (UniqueName: \"kubernetes.io/projected/f9222d9a-6507-4c32-9234-2c1c2b27a11e-kube-api-access-bc2sm\") pod \"f9222d9a-6507-4c32-9234-2c1c2b27a11e\" (UID: \"f9222d9a-6507-4c32-9234-2c1c2b27a11e\") " Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.232756 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m6wt\" (UniqueName: \"kubernetes.io/projected/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9-kube-api-access-8m6wt\") pod \"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9\" (UID: \"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9\") " Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.232833 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae95589d-d3fb-4254-9fd0-f59203e0e927-operator-scripts\") pod \"ae95589d-d3fb-4254-9fd0-f59203e0e927\" (UID: \"ae95589d-d3fb-4254-9fd0-f59203e0e927\") " Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.232874 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tmwp\" (UniqueName: \"kubernetes.io/projected/ae95589d-d3fb-4254-9fd0-f59203e0e927-kube-api-access-8tmwp\") pod \"ae95589d-d3fb-4254-9fd0-f59203e0e927\" (UID: \"ae95589d-d3fb-4254-9fd0-f59203e0e927\") " Mar 18 09:22:06 crc 
kubenswrapper[4778]: I0318 09:22:06.232929 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9-operator-scripts\") pod \"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9\" (UID: \"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9\") " Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.232954 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9222d9a-6507-4c32-9234-2c1c2b27a11e-operator-scripts\") pod \"f9222d9a-6507-4c32-9234-2c1c2b27a11e\" (UID: \"f9222d9a-6507-4c32-9234-2c1c2b27a11e\") " Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.233870 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9222d9a-6507-4c32-9234-2c1c2b27a11e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9222d9a-6507-4c32-9234-2c1c2b27a11e" (UID: "f9222d9a-6507-4c32-9234-2c1c2b27a11e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.234258 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "445fcacb-d2c9-4892-89b5-4b2b6e54ebc9" (UID: "445fcacb-d2c9-4892-89b5-4b2b6e54ebc9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.234652 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae95589d-d3fb-4254-9fd0-f59203e0e927-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ae95589d-d3fb-4254-9fd0-f59203e0e927" (UID: "ae95589d-d3fb-4254-9fd0-f59203e0e927"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.236752 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9-kube-api-access-8m6wt" (OuterVolumeSpecName: "kube-api-access-8m6wt") pod "445fcacb-d2c9-4892-89b5-4b2b6e54ebc9" (UID: "445fcacb-d2c9-4892-89b5-4b2b6e54ebc9"). InnerVolumeSpecName "kube-api-access-8m6wt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.237406 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9222d9a-6507-4c32-9234-2c1c2b27a11e-kube-api-access-bc2sm" (OuterVolumeSpecName: "kube-api-access-bc2sm") pod "f9222d9a-6507-4c32-9234-2c1c2b27a11e" (UID: "f9222d9a-6507-4c32-9234-2c1c2b27a11e"). InnerVolumeSpecName "kube-api-access-bc2sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.237693 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae95589d-d3fb-4254-9fd0-f59203e0e927-kube-api-access-8tmwp" (OuterVolumeSpecName: "kube-api-access-8tmwp") pod "ae95589d-d3fb-4254-9fd0-f59203e0e927" (UID: "ae95589d-d3fb-4254-9fd0-f59203e0e927"). InnerVolumeSpecName "kube-api-access-8tmwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.335342 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.335386 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9222d9a-6507-4c32-9234-2c1c2b27a11e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.335399 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc2sm\" (UniqueName: \"kubernetes.io/projected/f9222d9a-6507-4c32-9234-2c1c2b27a11e-kube-api-access-bc2sm\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.335412 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8m6wt\" (UniqueName: \"kubernetes.io/projected/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9-kube-api-access-8m6wt\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.335427 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae95589d-d3fb-4254-9fd0-f59203e0e927-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.335438 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tmwp\" (UniqueName: \"kubernetes.io/projected/ae95589d-d3fb-4254-9fd0-f59203e0e927-kube-api-access-8tmwp\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.636577 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563762-nk868" 
event={"ID":"7bcf145e-ae6a-4674-9f79-b6486ec2fa9d","Type":"ContainerDied","Data":"0d4f08f11dfe1809584f3cab0e3d071c5c19f66ebf3a573a7b3edd6c9c3b4163"} Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.637105 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d4f08f11dfe1809584f3cab0e3d071c5c19f66ebf3a573a7b3edd6c9c3b4163" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.637249 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563762-nk868" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.641014 4778 generic.go:334] "Generic (PLEG): container finished" podID="24412394-390b-461c-9d18-617eba706adc" containerID="39a152a6bb8ed07675c14ece0ad21851da7a9e9a103afe2312ea0254cd99b29c" exitCode=0 Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.641276 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0a46-account-create-update-phb5p" event={"ID":"24412394-390b-461c-9d18-617eba706adc","Type":"ContainerDied","Data":"39a152a6bb8ed07675c14ece0ad21851da7a9e9a103afe2312ea0254cd99b29c"} Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.645569 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dz4jc" event={"ID":"f9222d9a-6507-4c32-9234-2c1c2b27a11e","Type":"ContainerDied","Data":"24b8cc6fbc48de499f68ecf860e8b6053ccab335ce1cff5b8a45ee911c1d9a90"} Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.645597 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24b8cc6fbc48de499f68ecf860e8b6053ccab335ce1cff5b8a45ee911c1d9a90" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.645642 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-dz4jc" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.647390 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nnzs2" event={"ID":"ae95589d-d3fb-4254-9fd0-f59203e0e927","Type":"ContainerDied","Data":"a2153b59b29a56867da95a8bfa79fff53d74a9d7fba4ed7aa6073d0c04ef364a"} Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.647415 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2153b59b29a56867da95a8bfa79fff53d74a9d7fba4ed7aa6073d0c04ef364a" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.647478 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nnzs2" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.663570 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5387-account-create-update-wm5k5" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.664032 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5387-account-create-update-wm5k5" event={"ID":"445fcacb-d2c9-4892-89b5-4b2b6e54ebc9","Type":"ContainerDied","Data":"3f56c6b693b7b9b071967115241709056ce764bedf268186ba682ea2bb87e798"} Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.664073 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f56c6b693b7b9b071967115241709056ce764bedf268186ba682ea2bb87e798" Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.666328 4778 generic.go:334] "Generic (PLEG): container finished" podID="be311af4-91f5-417e-971b-c9158576ca97" containerID="9b6fa295a9bfec83f890b1dd7210afd8d93f1d1f4da240bfbc29bf8af750edd0" exitCode=0 Mar 18 09:22:06 crc kubenswrapper[4778]: I0318 09:22:06.666383 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kj8ww" 
event={"ID":"be311af4-91f5-417e-971b-c9158576ca97","Type":"ContainerDied","Data":"9b6fa295a9bfec83f890b1dd7210afd8d93f1d1f4da240bfbc29bf8af750edd0"} Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.024723 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a222-account-create-update-qr82t" Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.032610 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563756-s8bkt"] Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.039491 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563756-s8bkt"] Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.049426 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg6vw\" (UniqueName: \"kubernetes.io/projected/c7ae14ca-efde-42ba-8edf-7cc34dc31036-kube-api-access-fg6vw\") pod \"c7ae14ca-efde-42ba-8edf-7cc34dc31036\" (UID: \"c7ae14ca-efde-42ba-8edf-7cc34dc31036\") " Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.049745 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7ae14ca-efde-42ba-8edf-7cc34dc31036-operator-scripts\") pod \"c7ae14ca-efde-42ba-8edf-7cc34dc31036\" (UID: \"c7ae14ca-efde-42ba-8edf-7cc34dc31036\") " Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.051249 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7ae14ca-efde-42ba-8edf-7cc34dc31036-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c7ae14ca-efde-42ba-8edf-7cc34dc31036" (UID: "c7ae14ca-efde-42ba-8edf-7cc34dc31036"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.075032 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7ae14ca-efde-42ba-8edf-7cc34dc31036-kube-api-access-fg6vw" (OuterVolumeSpecName: "kube-api-access-fg6vw") pod "c7ae14ca-efde-42ba-8edf-7cc34dc31036" (UID: "c7ae14ca-efde-42ba-8edf-7cc34dc31036"). InnerVolumeSpecName "kube-api-access-fg6vw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.128964 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-l6vl8" Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.151724 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51a820a6-6a95-4ab7-a9d8-6649fe45464a-operator-scripts\") pod \"51a820a6-6a95-4ab7-a9d8-6649fe45464a\" (UID: \"51a820a6-6a95-4ab7-a9d8-6649fe45464a\") " Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.151875 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfgmc\" (UniqueName: \"kubernetes.io/projected/51a820a6-6a95-4ab7-a9d8-6649fe45464a-kube-api-access-jfgmc\") pod \"51a820a6-6a95-4ab7-a9d8-6649fe45464a\" (UID: \"51a820a6-6a95-4ab7-a9d8-6649fe45464a\") " Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.152230 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7ae14ca-efde-42ba-8edf-7cc34dc31036-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.152242 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg6vw\" (UniqueName: \"kubernetes.io/projected/c7ae14ca-efde-42ba-8edf-7cc34dc31036-kube-api-access-fg6vw\") on node \"crc\" DevicePath \"\"" Mar 18 
09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.152660 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51a820a6-6a95-4ab7-a9d8-6649fe45464a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "51a820a6-6a95-4ab7-a9d8-6649fe45464a" (UID: "51a820a6-6a95-4ab7-a9d8-6649fe45464a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.156528 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51a820a6-6a95-4ab7-a9d8-6649fe45464a-kube-api-access-jfgmc" (OuterVolumeSpecName: "kube-api-access-jfgmc") pod "51a820a6-6a95-4ab7-a9d8-6649fe45464a" (UID: "51a820a6-6a95-4ab7-a9d8-6649fe45464a"). InnerVolumeSpecName "kube-api-access-jfgmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.254160 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfgmc\" (UniqueName: \"kubernetes.io/projected/51a820a6-6a95-4ab7-a9d8-6649fe45464a-kube-api-access-jfgmc\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.254227 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51a820a6-6a95-4ab7-a9d8-6649fe45464a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.679267 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-l6vl8" event={"ID":"51a820a6-6a95-4ab7-a9d8-6649fe45464a","Type":"ContainerDied","Data":"c97c18a12d3c090de835642354d9ea6188d81c28ca151b3eb5f41951eda7e87c"} Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.679602 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c97c18a12d3c090de835642354d9ea6188d81c28ca151b3eb5f41951eda7e87c" Mar 18 09:22:07 
crc kubenswrapper[4778]: I0318 09:22:07.679679 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-l6vl8" Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.682313 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a222-account-create-update-qr82t" event={"ID":"c7ae14ca-efde-42ba-8edf-7cc34dc31036","Type":"ContainerDied","Data":"d1026be7b9a9e421d1dd3fdd40e0dad9e9c0eb2a83f1b8aab937d30a4670c112"} Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.682382 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1026be7b9a9e421d1dd3fdd40e0dad9e9c0eb2a83f1b8aab937d30a4670c112" Mar 18 09:22:07 crc kubenswrapper[4778]: I0318 09:22:07.682471 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a222-account-create-update-qr82t" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.014285 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-0a46-account-create-update-phb5p" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.067059 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p78gv\" (UniqueName: \"kubernetes.io/projected/24412394-390b-461c-9d18-617eba706adc-kube-api-access-p78gv\") pod \"24412394-390b-461c-9d18-617eba706adc\" (UID: \"24412394-390b-461c-9d18-617eba706adc\") " Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.067438 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24412394-390b-461c-9d18-617eba706adc-operator-scripts\") pod \"24412394-390b-461c-9d18-617eba706adc\" (UID: \"24412394-390b-461c-9d18-617eba706adc\") " Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.067874 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24412394-390b-461c-9d18-617eba706adc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "24412394-390b-461c-9d18-617eba706adc" (UID: "24412394-390b-461c-9d18-617eba706adc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.085374 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24412394-390b-461c-9d18-617eba706adc-kube-api-access-p78gv" (OuterVolumeSpecName: "kube-api-access-p78gv") pod "24412394-390b-461c-9d18-617eba706adc" (UID: "24412394-390b-461c-9d18-617eba706adc"). InnerVolumeSpecName "kube-api-access-p78gv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.169500 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24412394-390b-461c-9d18-617eba706adc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.169528 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p78gv\" (UniqueName: \"kubernetes.io/projected/24412394-390b-461c-9d18-617eba706adc-kube-api-access-p78gv\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.196488 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6298370-ed2e-4705-827b-c1a77b03f32a" path="/var/lib/kubelet/pods/a6298370-ed2e-4705-827b-c1a77b03f32a/volumes" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.430026 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kj8ww" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.534623 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-b66ph"] Mar 18 09:22:08 crc kubenswrapper[4778]: E0318 09:22:08.535024 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a820a6-6a95-4ab7-a9d8-6649fe45464a" containerName="mariadb-database-create" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535044 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a820a6-6a95-4ab7-a9d8-6649fe45464a" containerName="mariadb-database-create" Mar 18 09:22:08 crc kubenswrapper[4778]: E0318 09:22:08.535078 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be311af4-91f5-417e-971b-c9158576ca97" containerName="mariadb-database-create" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535087 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="be311af4-91f5-417e-971b-c9158576ca97" 
containerName="mariadb-database-create" Mar 18 09:22:08 crc kubenswrapper[4778]: E0318 09:22:08.535097 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9222d9a-6507-4c32-9234-2c1c2b27a11e" containerName="mariadb-database-create" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535105 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9222d9a-6507-4c32-9234-2c1c2b27a11e" containerName="mariadb-database-create" Mar 18 09:22:08 crc kubenswrapper[4778]: E0318 09:22:08.535123 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae95589d-d3fb-4254-9fd0-f59203e0e927" containerName="mariadb-account-create-update" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535131 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae95589d-d3fb-4254-9fd0-f59203e0e927" containerName="mariadb-account-create-update" Mar 18 09:22:08 crc kubenswrapper[4778]: E0318 09:22:08.535147 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ae14ca-efde-42ba-8edf-7cc34dc31036" containerName="mariadb-account-create-update" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535155 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ae14ca-efde-42ba-8edf-7cc34dc31036" containerName="mariadb-account-create-update" Mar 18 09:22:08 crc kubenswrapper[4778]: E0318 09:22:08.535178 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124c0069-debd-459c-9d66-f38d9d096996" containerName="init" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535185 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="124c0069-debd-459c-9d66-f38d9d096996" containerName="init" Mar 18 09:22:08 crc kubenswrapper[4778]: E0318 09:22:08.535215 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bcf145e-ae6a-4674-9f79-b6486ec2fa9d" containerName="oc" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535223 4778 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7bcf145e-ae6a-4674-9f79-b6486ec2fa9d" containerName="oc" Mar 18 09:22:08 crc kubenswrapper[4778]: E0318 09:22:08.535239 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="445fcacb-d2c9-4892-89b5-4b2b6e54ebc9" containerName="mariadb-account-create-update" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535247 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="445fcacb-d2c9-4892-89b5-4b2b6e54ebc9" containerName="mariadb-account-create-update" Mar 18 09:22:08 crc kubenswrapper[4778]: E0318 09:22:08.535261 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24412394-390b-461c-9d18-617eba706adc" containerName="mariadb-account-create-update" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535269 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="24412394-390b-461c-9d18-617eba706adc" containerName="mariadb-account-create-update" Mar 18 09:22:08 crc kubenswrapper[4778]: E0318 09:22:08.535279 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124c0069-debd-459c-9d66-f38d9d096996" containerName="dnsmasq-dns" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535287 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="124c0069-debd-459c-9d66-f38d9d096996" containerName="dnsmasq-dns" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535462 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="445fcacb-d2c9-4892-89b5-4b2b6e54ebc9" containerName="mariadb-account-create-update" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535479 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9222d9a-6507-4c32-9234-2c1c2b27a11e" containerName="mariadb-database-create" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535492 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="51a820a6-6a95-4ab7-a9d8-6649fe45464a" containerName="mariadb-database-create" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 
09:22:08.535502 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="24412394-390b-461c-9d18-617eba706adc" containerName="mariadb-account-create-update" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535515 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7ae14ca-efde-42ba-8edf-7cc34dc31036" containerName="mariadb-account-create-update" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535524 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae95589d-d3fb-4254-9fd0-f59203e0e927" containerName="mariadb-account-create-update" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535532 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="be311af4-91f5-417e-971b-c9158576ca97" containerName="mariadb-database-create" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535542 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="124c0069-debd-459c-9d66-f38d9d096996" containerName="dnsmasq-dns" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.535553 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bcf145e-ae6a-4674-9f79-b6486ec2fa9d" containerName="oc" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.536158 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-b66ph" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.541551 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.541666 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-ntc8r" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.564640 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-b66ph"] Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.591981 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be311af4-91f5-417e-971b-c9158576ca97-operator-scripts\") pod \"be311af4-91f5-417e-971b-c9158576ca97\" (UID: \"be311af4-91f5-417e-971b-c9158576ca97\") " Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.592169 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bfb4\" (UniqueName: \"kubernetes.io/projected/be311af4-91f5-417e-971b-c9158576ca97-kube-api-access-8bfb4\") pod \"be311af4-91f5-417e-971b-c9158576ca97\" (UID: \"be311af4-91f5-417e-971b-c9158576ca97\") " Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.601760 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be311af4-91f5-417e-971b-c9158576ca97-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "be311af4-91f5-417e-971b-c9158576ca97" (UID: "be311af4-91f5-417e-971b-c9158576ca97"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.610491 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be311af4-91f5-417e-971b-c9158576ca97-kube-api-access-8bfb4" (OuterVolumeSpecName: "kube-api-access-8bfb4") pod "be311af4-91f5-417e-971b-c9158576ca97" (UID: "be311af4-91f5-417e-971b-c9158576ca97"). InnerVolumeSpecName "kube-api-access-8bfb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.692483 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0a46-account-create-update-phb5p" event={"ID":"24412394-390b-461c-9d18-617eba706adc","Type":"ContainerDied","Data":"a8cdfda576bdb60315d008b13d4ba643b525cf074d3f540a2ce9321334d9fb29"} Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.692509 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0a46-account-create-update-phb5p" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.692525 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8cdfda576bdb60315d008b13d4ba643b525cf074d3f540a2ce9321334d9fb29" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.694094 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-combined-ca-bundle\") pod \"glance-db-sync-b66ph\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " pod="openstack/glance-db-sync-b66ph" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.694153 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kj8ww" event={"ID":"be311af4-91f5-417e-971b-c9158576ca97","Type":"ContainerDied","Data":"ea80e6972d772d1aea12da8304c45c18840c5828e5180b99ae338ff0921eae08"} Mar 18 09:22:08 crc 
kubenswrapper[4778]: I0318 09:22:08.694174 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c69rn\" (UniqueName: \"kubernetes.io/projected/5dadb643-21f7-497a-992f-41ab80c704c5-kube-api-access-c69rn\") pod \"glance-db-sync-b66ph\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " pod="openstack/glance-db-sync-b66ph" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.694235 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-db-sync-config-data\") pod \"glance-db-sync-b66ph\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " pod="openstack/glance-db-sync-b66ph" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.694255 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-config-data\") pod \"glance-db-sync-b66ph\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " pod="openstack/glance-db-sync-b66ph" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.694265 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-kj8ww" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.694316 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be311af4-91f5-417e-971b-c9158576ca97-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.694331 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bfb4\" (UniqueName: \"kubernetes.io/projected/be311af4-91f5-417e-971b-c9158576ca97-kube-api-access-8bfb4\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.694174 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea80e6972d772d1aea12da8304c45c18840c5828e5180b99ae338ff0921eae08" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.795821 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-db-sync-config-data\") pod \"glance-db-sync-b66ph\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " pod="openstack/glance-db-sync-b66ph" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.796164 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-config-data\") pod \"glance-db-sync-b66ph\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " pod="openstack/glance-db-sync-b66ph" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.796273 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-combined-ca-bundle\") pod \"glance-db-sync-b66ph\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " pod="openstack/glance-db-sync-b66ph" Mar 18 09:22:08 crc 
kubenswrapper[4778]: I0318 09:22:08.796318 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c69rn\" (UniqueName: \"kubernetes.io/projected/5dadb643-21f7-497a-992f-41ab80c704c5-kube-api-access-c69rn\") pod \"glance-db-sync-b66ph\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " pod="openstack/glance-db-sync-b66ph" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.800655 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-db-sync-config-data\") pod \"glance-db-sync-b66ph\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " pod="openstack/glance-db-sync-b66ph" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.800708 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-config-data\") pod \"glance-db-sync-b66ph\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " pod="openstack/glance-db-sync-b66ph" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.803267 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-combined-ca-bundle\") pod \"glance-db-sync-b66ph\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " pod="openstack/glance-db-sync-b66ph" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.817447 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c69rn\" (UniqueName: \"kubernetes.io/projected/5dadb643-21f7-497a-992f-41ab80c704c5-kube-api-access-c69rn\") pod \"glance-db-sync-b66ph\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " pod="openstack/glance-db-sync-b66ph" Mar 18 09:22:08 crc kubenswrapper[4778]: I0318 09:22:08.875838 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-b66ph" Mar 18 09:22:09 crc kubenswrapper[4778]: I0318 09:22:09.517780 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-b66ph"] Mar 18 09:22:09 crc kubenswrapper[4778]: I0318 09:22:09.536904 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 09:22:09 crc kubenswrapper[4778]: I0318 09:22:09.700225 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-nnzs2"] Mar 18 09:22:09 crc kubenswrapper[4778]: I0318 09:22:09.702263 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b66ph" event={"ID":"5dadb643-21f7-497a-992f-41ab80c704c5","Type":"ContainerStarted","Data":"2d28880ba4925c32ee75f685408b2e6019fc2833229df2500837da232ac04367"} Mar 18 09:22:09 crc kubenswrapper[4778]: I0318 09:22:09.709387 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-nnzs2"] Mar 18 09:22:09 crc kubenswrapper[4778]: I0318 09:22:09.797134 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-bmx7j"] Mar 18 09:22:09 crc kubenswrapper[4778]: I0318 09:22:09.798348 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bmx7j" Mar 18 09:22:09 crc kubenswrapper[4778]: I0318 09:22:09.807065 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 09:22:09 crc kubenswrapper[4778]: I0318 09:22:09.807515 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bmx7j"] Mar 18 09:22:09 crc kubenswrapper[4778]: I0318 09:22:09.942981 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c364f41-96b2-472b-bf96-fbbe1c1c5515-operator-scripts\") pod \"root-account-create-update-bmx7j\" (UID: \"3c364f41-96b2-472b-bf96-fbbe1c1c5515\") " pod="openstack/root-account-create-update-bmx7j" Mar 18 09:22:09 crc kubenswrapper[4778]: I0318 09:22:09.943063 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-569bc\" (UniqueName: \"kubernetes.io/projected/3c364f41-96b2-472b-bf96-fbbe1c1c5515-kube-api-access-569bc\") pod \"root-account-create-update-bmx7j\" (UID: \"3c364f41-96b2-472b-bf96-fbbe1c1c5515\") " pod="openstack/root-account-create-update-bmx7j" Mar 18 09:22:10 crc kubenswrapper[4778]: I0318 09:22:10.044136 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c364f41-96b2-472b-bf96-fbbe1c1c5515-operator-scripts\") pod \"root-account-create-update-bmx7j\" (UID: \"3c364f41-96b2-472b-bf96-fbbe1c1c5515\") " pod="openstack/root-account-create-update-bmx7j" Mar 18 09:22:10 crc kubenswrapper[4778]: I0318 09:22:10.044245 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-569bc\" (UniqueName: \"kubernetes.io/projected/3c364f41-96b2-472b-bf96-fbbe1c1c5515-kube-api-access-569bc\") pod \"root-account-create-update-bmx7j\" (UID: 
\"3c364f41-96b2-472b-bf96-fbbe1c1c5515\") " pod="openstack/root-account-create-update-bmx7j"
Mar 18 09:22:10 crc kubenswrapper[4778]: I0318 09:22:10.045345 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c364f41-96b2-472b-bf96-fbbe1c1c5515-operator-scripts\") pod \"root-account-create-update-bmx7j\" (UID: \"3c364f41-96b2-472b-bf96-fbbe1c1c5515\") " pod="openstack/root-account-create-update-bmx7j"
Mar 18 09:22:10 crc kubenswrapper[4778]: I0318 09:22:10.077139 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-569bc\" (UniqueName: \"kubernetes.io/projected/3c364f41-96b2-472b-bf96-fbbe1c1c5515-kube-api-access-569bc\") pod \"root-account-create-update-bmx7j\" (UID: \"3c364f41-96b2-472b-bf96-fbbe1c1c5515\") " pod="openstack/root-account-create-update-bmx7j"
Mar 18 09:22:10 crc kubenswrapper[4778]: I0318 09:22:10.152386 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bmx7j"
Mar 18 09:22:10 crc kubenswrapper[4778]: I0318 09:22:10.204397 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae95589d-d3fb-4254-9fd0-f59203e0e927" path="/var/lib/kubelet/pods/ae95589d-d3fb-4254-9fd0-f59203e0e927/volumes"
Mar 18 09:22:10 crc kubenswrapper[4778]: I0318 09:22:10.601920 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bmx7j"]
Mar 18 09:22:10 crc kubenswrapper[4778]: W0318 09:22:10.607081 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c364f41_96b2_472b_bf96_fbbe1c1c5515.slice/crio-4c38f2389e7b143602b7ffeca1f041770927d65b1eec4410ead1aa8f68aa1bf5 WatchSource:0}: Error finding container 4c38f2389e7b143602b7ffeca1f041770927d65b1eec4410ead1aa8f68aa1bf5: Status 404 returned error can't find the container with id 4c38f2389e7b143602b7ffeca1f041770927d65b1eec4410ead1aa8f68aa1bf5
Mar 18 09:22:10 crc kubenswrapper[4778]: I0318 09:22:10.711634 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bmx7j" event={"ID":"3c364f41-96b2-472b-bf96-fbbe1c1c5515","Type":"ContainerStarted","Data":"4c38f2389e7b143602b7ffeca1f041770927d65b1eec4410ead1aa8f68aa1bf5"}
Mar 18 09:22:11 crc kubenswrapper[4778]: I0318 09:22:11.750609 4778 generic.go:334] "Generic (PLEG): container finished" podID="3c364f41-96b2-472b-bf96-fbbe1c1c5515" containerID="d4dc8e8b710d4699b5bf32fb7126e48abd2e3887409f25b1985152911c6485a4" exitCode=0
Mar 18 09:22:11 crc kubenswrapper[4778]: I0318 09:22:11.750707 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bmx7j" event={"ID":"3c364f41-96b2-472b-bf96-fbbe1c1c5515","Type":"ContainerDied","Data":"d4dc8e8b710d4699b5bf32fb7126e48abd2e3887409f25b1985152911c6485a4"}
Mar 18 09:22:13 crc kubenswrapper[4778]: I0318 09:22:13.151429 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bmx7j"
Mar 18 09:22:13 crc kubenswrapper[4778]: I0318 09:22:13.307737 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c364f41-96b2-472b-bf96-fbbe1c1c5515-operator-scripts\") pod \"3c364f41-96b2-472b-bf96-fbbe1c1c5515\" (UID: \"3c364f41-96b2-472b-bf96-fbbe1c1c5515\") "
Mar 18 09:22:13 crc kubenswrapper[4778]: I0318 09:22:13.307877 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-569bc\" (UniqueName: \"kubernetes.io/projected/3c364f41-96b2-472b-bf96-fbbe1c1c5515-kube-api-access-569bc\") pod \"3c364f41-96b2-472b-bf96-fbbe1c1c5515\" (UID: \"3c364f41-96b2-472b-bf96-fbbe1c1c5515\") "
Mar 18 09:22:13 crc kubenswrapper[4778]: I0318 09:22:13.309191 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c364f41-96b2-472b-bf96-fbbe1c1c5515-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c364f41-96b2-472b-bf96-fbbe1c1c5515" (UID: "3c364f41-96b2-472b-bf96-fbbe1c1c5515"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:22:13 crc kubenswrapper[4778]: I0318 09:22:13.330032 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c364f41-96b2-472b-bf96-fbbe1c1c5515-kube-api-access-569bc" (OuterVolumeSpecName: "kube-api-access-569bc") pod "3c364f41-96b2-472b-bf96-fbbe1c1c5515" (UID: "3c364f41-96b2-472b-bf96-fbbe1c1c5515"). InnerVolumeSpecName "kube-api-access-569bc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:22:13 crc kubenswrapper[4778]: I0318 09:22:13.409936 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c364f41-96b2-472b-bf96-fbbe1c1c5515-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:13 crc kubenswrapper[4778]: I0318 09:22:13.409970 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-569bc\" (UniqueName: \"kubernetes.io/projected/3c364f41-96b2-472b-bf96-fbbe1c1c5515-kube-api-access-569bc\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:13 crc kubenswrapper[4778]: I0318 09:22:13.780835 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bmx7j" event={"ID":"3c364f41-96b2-472b-bf96-fbbe1c1c5515","Type":"ContainerDied","Data":"4c38f2389e7b143602b7ffeca1f041770927d65b1eec4410ead1aa8f68aa1bf5"}
Mar 18 09:22:13 crc kubenswrapper[4778]: I0318 09:22:13.780895 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bmx7j"
Mar 18 09:22:13 crc kubenswrapper[4778]: I0318 09:22:13.780913 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c38f2389e7b143602b7ffeca1f041770927d65b1eec4410ead1aa8f68aa1bf5"
Mar 18 09:22:14 crc kubenswrapper[4778]: I0318 09:22:14.577511 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Mar 18 09:22:17 crc kubenswrapper[4778]: I0318 09:22:17.423824 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fgfk9" podUID="208b26f2-3c91-4966-9d01-8fe73e4a7d87" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.88:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 09:22:17 crc kubenswrapper[4778]: I0318 09:22:17.534285 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bmx7j"]
Mar 18 09:22:17 crc kubenswrapper[4778]: I0318 09:22:17.542003 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-bmx7j"]
Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.198700 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c364f41-96b2-472b-bf96-fbbe1c1c5515" path="/var/lib/kubelet/pods/3c364f41-96b2-472b-bf96-fbbe1c1c5515/volumes"
Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.376422 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-djmq6" podUID="f58533cf-4c57-4c3a-b772-e2a488298d7e" containerName="ovn-controller" probeResult="failure" output=<
Mar 18 09:22:18 crc kubenswrapper[4778]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 18 09:22:18 crc kubenswrapper[4778]: >
Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.415055 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-zrlnv"
Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.455051 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-zrlnv"
Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.779413 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-djmq6-config-6ccgx"]
Mar 18 09:22:18 crc kubenswrapper[4778]: E0318 09:22:18.779784 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c364f41-96b2-472b-bf96-fbbe1c1c5515" containerName="mariadb-account-create-update"
Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.779800 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c364f41-96b2-472b-bf96-fbbe1c1c5515" containerName="mariadb-account-create-update"
Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.779962 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c364f41-96b2-472b-bf96-fbbe1c1c5515" containerName="mariadb-account-create-update"
Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.780505 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-djmq6-config-6ccgx"
Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.782977 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.805220 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-djmq6-config-6ccgx"]
Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.906526 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20e799eb-9c49-4025-98e1-b25be1bac66a-scripts\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx"
Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.906629 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/20e799eb-9c49-4025-98e1-b25be1bac66a-additional-scripts\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx"
Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.906681 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw4kw\" (UniqueName: \"kubernetes.io/projected/20e799eb-9c49-4025-98e1-b25be1bac66a-kube-api-access-rw4kw\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx"
Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.906732 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-run\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx"
Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.906804 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-run-ovn\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx"
Mar 18 09:22:18 crc kubenswrapper[4778]: I0318 09:22:18.906866 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-log-ovn\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx"
Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.008473 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/20e799eb-9c49-4025-98e1-b25be1bac66a-additional-scripts\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx"
Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.008549 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw4kw\" (UniqueName: \"kubernetes.io/projected/20e799eb-9c49-4025-98e1-b25be1bac66a-kube-api-access-rw4kw\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx"
Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.008602 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-run\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx"
Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.008635 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-run-ovn\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx"
Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.008690 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-log-ovn\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx"
Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.008783 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20e799eb-9c49-4025-98e1-b25be1bac66a-scripts\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx"
Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.008937 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-log-ovn\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx"
Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.008960 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-run\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx"
Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.009088 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-run-ovn\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx"
Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.009463 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/20e799eb-9c49-4025-98e1-b25be1bac66a-additional-scripts\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx"
Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.012058 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20e799eb-9c49-4025-98e1-b25be1bac66a-scripts\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx"
Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.037334 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw4kw\" (UniqueName: \"kubernetes.io/projected/20e799eb-9c49-4025-98e1-b25be1bac66a-kube-api-access-rw4kw\") pod \"ovn-controller-djmq6-config-6ccgx\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") " pod="openstack/ovn-controller-djmq6-config-6ccgx"
Mar 18 09:22:19 crc kubenswrapper[4778]: I0318 09:22:19.125524 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-djmq6-config-6ccgx"
Mar 18 09:22:21 crc kubenswrapper[4778]: I0318 09:22:21.541333 4778 generic.go:334] "Generic (PLEG): container finished" podID="57955df9-f0c5-4cfc-91fd-135771be7ed2" containerID="3698e124eba53a15e3f16dfe6346805545443e4b8ce94d12254d508326508979" exitCode=0
Mar 18 09:22:21 crc kubenswrapper[4778]: I0318 09:22:21.541429 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"57955df9-f0c5-4cfc-91fd-135771be7ed2","Type":"ContainerDied","Data":"3698e124eba53a15e3f16dfe6346805545443e4b8ce94d12254d508326508979"}
Mar 18 09:22:21 crc kubenswrapper[4778]: I0318 09:22:21.544302 4778 generic.go:334] "Generic (PLEG): container finished" podID="671fe1be-f3dd-475e-8c48-a1d1db510aef" containerID="fe05be7f27af52f386341c802d98dc00dc2ec8f77ecc656619edc3c4420a9be7" exitCode=0
Mar 18 09:22:21 crc kubenswrapper[4778]: I0318 09:22:21.544333 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"671fe1be-f3dd-475e-8c48-a1d1db510aef","Type":"ContainerDied","Data":"fe05be7f27af52f386341c802d98dc00dc2ec8f77ecc656619edc3c4420a9be7"}
Mar 18 09:22:22 crc kubenswrapper[4778]: I0318 09:22:22.536498 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-zrjls"]
Mar 18 09:22:22 crc kubenswrapper[4778]: I0318 09:22:22.537882 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zrjls"
Mar 18 09:22:22 crc kubenswrapper[4778]: I0318 09:22:22.539598 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Mar 18 09:22:22 crc kubenswrapper[4778]: I0318 09:22:22.547135 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zrjls"]
Mar 18 09:22:22 crc kubenswrapper[4778]: I0318 09:22:22.575862 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7dnt\" (UniqueName: \"kubernetes.io/projected/8560ebac-334f-4332-b324-cdb297a94b1a-kube-api-access-v7dnt\") pod \"root-account-create-update-zrjls\" (UID: \"8560ebac-334f-4332-b324-cdb297a94b1a\") " pod="openstack/root-account-create-update-zrjls"
Mar 18 09:22:22 crc kubenswrapper[4778]: I0318 09:22:22.575993 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8560ebac-334f-4332-b324-cdb297a94b1a-operator-scripts\") pod \"root-account-create-update-zrjls\" (UID: \"8560ebac-334f-4332-b324-cdb297a94b1a\") " pod="openstack/root-account-create-update-zrjls"
Mar 18 09:22:22 crc kubenswrapper[4778]: I0318 09:22:22.677501 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7dnt\" (UniqueName: \"kubernetes.io/projected/8560ebac-334f-4332-b324-cdb297a94b1a-kube-api-access-v7dnt\") pod \"root-account-create-update-zrjls\" (UID: \"8560ebac-334f-4332-b324-cdb297a94b1a\") " pod="openstack/root-account-create-update-zrjls"
Mar 18 09:22:22 crc kubenswrapper[4778]: I0318 09:22:22.678017 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8560ebac-334f-4332-b324-cdb297a94b1a-operator-scripts\") pod \"root-account-create-update-zrjls\" (UID: \"8560ebac-334f-4332-b324-cdb297a94b1a\") " pod="openstack/root-account-create-update-zrjls"
Mar 18 09:22:22 crc kubenswrapper[4778]: I0318 09:22:22.678839 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8560ebac-334f-4332-b324-cdb297a94b1a-operator-scripts\") pod \"root-account-create-update-zrjls\" (UID: \"8560ebac-334f-4332-b324-cdb297a94b1a\") " pod="openstack/root-account-create-update-zrjls"
Mar 18 09:22:22 crc kubenswrapper[4778]: I0318 09:22:22.699453 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7dnt\" (UniqueName: \"kubernetes.io/projected/8560ebac-334f-4332-b324-cdb297a94b1a-kube-api-access-v7dnt\") pod \"root-account-create-update-zrjls\" (UID: \"8560ebac-334f-4332-b324-cdb297a94b1a\") " pod="openstack/root-account-create-update-zrjls"
Mar 18 09:22:22 crc kubenswrapper[4778]: I0318 09:22:22.856222 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zrjls"
Mar 18 09:22:23 crc kubenswrapper[4778]: I0318 09:22:23.369805 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-djmq6" podUID="f58533cf-4c57-4c3a-b772-e2a488298d7e" containerName="ovn-controller" probeResult="failure" output=<
Mar 18 09:22:23 crc kubenswrapper[4778]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 18 09:22:23 crc kubenswrapper[4778]: >
Mar 18 09:22:25 crc kubenswrapper[4778]: I0318 09:22:25.738080 4778 scope.go:117] "RemoveContainer" containerID="4909b98cff116d6eb4c151d4ba3b46f1a567c070f760a907d7a4e8ea4dca9196"
Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.180369 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-djmq6-config-6ccgx"]
Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.380895 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zrjls"]
Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.600310 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zrjls" event={"ID":"8560ebac-334f-4332-b324-cdb297a94b1a","Type":"ContainerStarted","Data":"8485de1959de5e473a1a0282e19bec9c8061e5419357cbeb799b7cc895e3b146"}
Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.600835 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zrjls" event={"ID":"8560ebac-334f-4332-b324-cdb297a94b1a","Type":"ContainerStarted","Data":"b8c940ffd2808689a9da3bc2d7f8c95ee43229d10a3c9d4eb5fdacb3256bde90"}
Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.601979 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b66ph" event={"ID":"5dadb643-21f7-497a-992f-41ab80c704c5","Type":"ContainerStarted","Data":"0cbb671a57344b775f4ff6d3749d585e5e115edb6d8d987453203f44ff882ff0"}
Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.604893 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"57955df9-f0c5-4cfc-91fd-135771be7ed2","Type":"ContainerStarted","Data":"29a1b3eb3844657a811126fd222acecd2b3045c5c60889965b169deb2aec2c9a"}
Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.605330 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.607690 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"671fe1be-f3dd-475e-8c48-a1d1db510aef","Type":"ContainerStarted","Data":"4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48"}
Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.608227 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.612626 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-djmq6-config-6ccgx" event={"ID":"20e799eb-9c49-4025-98e1-b25be1bac66a","Type":"ContainerStarted","Data":"a91ae12327b49c9ef11425b6654b97316a81c22c90fc2a91ca50368382d18bdb"}
Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.612661 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-djmq6-config-6ccgx" event={"ID":"20e799eb-9c49-4025-98e1-b25be1bac66a","Type":"ContainerStarted","Data":"4282be7a65cf991e1b8ae29fb89dbb383c0cb63e6bfe1799f15c822d2653961d"}
Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.628344 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-zrjls" podStartSLOduration=5.628323363 podStartE2EDuration="5.628323363s" podCreationTimestamp="2026-03-18 09:22:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:22:27.618573097 +0000 UTC m=+1214.193317947" watchObservedRunningTime="2026-03-18 09:22:27.628323363 +0000 UTC m=+1214.203068203"
Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.650619 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-djmq6-config-6ccgx" podStartSLOduration=9.650598391 podStartE2EDuration="9.650598391s" podCreationTimestamp="2026-03-18 09:22:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:22:27.644307438 +0000 UTC m=+1214.219052288" watchObservedRunningTime="2026-03-18 09:22:27.650598391 +0000 UTC m=+1214.225343231"
Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.674156 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=58.469613519 podStartE2EDuration="1m9.674129852s" podCreationTimestamp="2026-03-18 09:21:18 +0000 UTC" firstStartedPulling="2026-03-18 09:21:35.491991181 +0000 UTC m=+1162.066736021" lastFinishedPulling="2026-03-18 09:21:46.696507514 +0000 UTC m=+1173.271252354" observedRunningTime="2026-03-18 09:22:27.668674453 +0000 UTC m=+1214.243419293" watchObservedRunningTime="2026-03-18 09:22:27.674129852 +0000 UTC m=+1214.248874692"
Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.705144 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=58.863623155 podStartE2EDuration="1m9.705118937s" podCreationTimestamp="2026-03-18 09:21:18 +0000 UTC" firstStartedPulling="2026-03-18 09:21:35.72219076 +0000 UTC m=+1162.296935600" lastFinishedPulling="2026-03-18 09:21:46.563686552 +0000 UTC m=+1173.138431382" observedRunningTime="2026-03-18 09:22:27.696737979 +0000 UTC m=+1214.271482829" watchObservedRunningTime="2026-03-18 09:22:27.705118937 +0000 UTC m=+1214.279863767"
Mar 18 09:22:27 crc kubenswrapper[4778]: I0318 09:22:27.735458 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-b66ph" podStartSLOduration=2.339762672 podStartE2EDuration="19.735433585s" podCreationTimestamp="2026-03-18 09:22:08 +0000 UTC" firstStartedPulling="2026-03-18 09:22:09.536543332 +0000 UTC m=+1196.111288162" lastFinishedPulling="2026-03-18 09:22:26.932214225 +0000 UTC m=+1213.506959075" observedRunningTime="2026-03-18 09:22:27.728281009 +0000 UTC m=+1214.303025859" watchObservedRunningTime="2026-03-18 09:22:27.735433585 +0000 UTC m=+1214.310178455"
Mar 18 09:22:28 crc kubenswrapper[4778]: I0318 09:22:28.455656 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-djmq6"
Mar 18 09:22:28 crc kubenswrapper[4778]: I0318 09:22:28.623750 4778 generic.go:334] "Generic (PLEG): container finished" podID="8560ebac-334f-4332-b324-cdb297a94b1a" containerID="8485de1959de5e473a1a0282e19bec9c8061e5419357cbeb799b7cc895e3b146" exitCode=0
Mar 18 09:22:28 crc kubenswrapper[4778]: I0318 09:22:28.623843 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zrjls" event={"ID":"8560ebac-334f-4332-b324-cdb297a94b1a","Type":"ContainerDied","Data":"8485de1959de5e473a1a0282e19bec9c8061e5419357cbeb799b7cc895e3b146"}
Mar 18 09:22:28 crc kubenswrapper[4778]: I0318 09:22:28.626132 4778 generic.go:334] "Generic (PLEG): container finished" podID="20e799eb-9c49-4025-98e1-b25be1bac66a" containerID="a91ae12327b49c9ef11425b6654b97316a81c22c90fc2a91ca50368382d18bdb" exitCode=0
Mar 18 09:22:28 crc kubenswrapper[4778]: I0318 09:22:28.626229 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-djmq6-config-6ccgx" event={"ID":"20e799eb-9c49-4025-98e1-b25be1bac66a","Type":"ContainerDied","Data":"a91ae12327b49c9ef11425b6654b97316a81c22c90fc2a91ca50368382d18bdb"}
Mar 18 09:22:30 crc kubenswrapper[4778]: I0318 09:22:30.147161 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 09:22:30 crc kubenswrapper[4778]: I0318 09:22:30.147533 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.022579 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zrjls"
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.030761 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-djmq6-config-6ccgx"
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.135022 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw4kw\" (UniqueName: \"kubernetes.io/projected/20e799eb-9c49-4025-98e1-b25be1bac66a-kube-api-access-rw4kw\") pod \"20e799eb-9c49-4025-98e1-b25be1bac66a\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") "
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.135094 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-run-ovn\") pod \"20e799eb-9c49-4025-98e1-b25be1bac66a\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") "
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.135121 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-log-ovn\") pod \"20e799eb-9c49-4025-98e1-b25be1bac66a\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") "
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.135258 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7dnt\" (UniqueName: \"kubernetes.io/projected/8560ebac-334f-4332-b324-cdb297a94b1a-kube-api-access-v7dnt\") pod \"8560ebac-334f-4332-b324-cdb297a94b1a\" (UID: \"8560ebac-334f-4332-b324-cdb297a94b1a\") "
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.135332 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20e799eb-9c49-4025-98e1-b25be1bac66a-scripts\") pod \"20e799eb-9c49-4025-98e1-b25be1bac66a\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") "
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.135370 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-run\") pod \"20e799eb-9c49-4025-98e1-b25be1bac66a\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") "
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.135414 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8560ebac-334f-4332-b324-cdb297a94b1a-operator-scripts\") pod \"8560ebac-334f-4332-b324-cdb297a94b1a\" (UID: \"8560ebac-334f-4332-b324-cdb297a94b1a\") "
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.135447 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/20e799eb-9c49-4025-98e1-b25be1bac66a-additional-scripts\") pod \"20e799eb-9c49-4025-98e1-b25be1bac66a\" (UID: \"20e799eb-9c49-4025-98e1-b25be1bac66a\") "
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.136709 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20e799eb-9c49-4025-98e1-b25be1bac66a-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "20e799eb-9c49-4025-98e1-b25be1bac66a" (UID: "20e799eb-9c49-4025-98e1-b25be1bac66a"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.138325 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "20e799eb-9c49-4025-98e1-b25be1bac66a" (UID: "20e799eb-9c49-4025-98e1-b25be1bac66a"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.138363 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "20e799eb-9c49-4025-98e1-b25be1bac66a" (UID: "20e799eb-9c49-4025-98e1-b25be1bac66a"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.138397 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-run" (OuterVolumeSpecName: "var-run") pod "20e799eb-9c49-4025-98e1-b25be1bac66a" (UID: "20e799eb-9c49-4025-98e1-b25be1bac66a"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.138918 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8560ebac-334f-4332-b324-cdb297a94b1a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8560ebac-334f-4332-b324-cdb297a94b1a" (UID: "8560ebac-334f-4332-b324-cdb297a94b1a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.139328 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20e799eb-9c49-4025-98e1-b25be1bac66a-scripts" (OuterVolumeSpecName: "scripts") pod "20e799eb-9c49-4025-98e1-b25be1bac66a" (UID: "20e799eb-9c49-4025-98e1-b25be1bac66a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.157334 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8560ebac-334f-4332-b324-cdb297a94b1a-kube-api-access-v7dnt" (OuterVolumeSpecName: "kube-api-access-v7dnt") pod "8560ebac-334f-4332-b324-cdb297a94b1a" (UID: "8560ebac-334f-4332-b324-cdb297a94b1a"). InnerVolumeSpecName "kube-api-access-v7dnt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.170449 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20e799eb-9c49-4025-98e1-b25be1bac66a-kube-api-access-rw4kw" (OuterVolumeSpecName: "kube-api-access-rw4kw") pod "20e799eb-9c49-4025-98e1-b25be1bac66a" (UID: "20e799eb-9c49-4025-98e1-b25be1bac66a"). InnerVolumeSpecName "kube-api-access-rw4kw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.237715 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw4kw\" (UniqueName: \"kubernetes.io/projected/20e799eb-9c49-4025-98e1-b25be1bac66a-kube-api-access-rw4kw\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.238716 4778 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.238770 4778 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-log-ovn\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.238800 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7dnt\" (UniqueName: \"kubernetes.io/projected/8560ebac-334f-4332-b324-cdb297a94b1a-kube-api-access-v7dnt\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.238823 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20e799eb-9c49-4025-98e1-b25be1bac66a-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.238841 4778 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/20e799eb-9c49-4025-98e1-b25be1bac66a-var-run\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.238859 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8560ebac-334f-4332-b324-cdb297a94b1a-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.238880 4778 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/20e799eb-9c49-4025-98e1-b25be1bac66a-additional-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.658675 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-djmq6-config-6ccgx" event={"ID":"20e799eb-9c49-4025-98e1-b25be1bac66a","Type":"ContainerDied","Data":"4282be7a65cf991e1b8ae29fb89dbb383c0cb63e6bfe1799f15c822d2653961d"}
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.659157 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4282be7a65cf991e1b8ae29fb89dbb383c0cb63e6bfe1799f15c822d2653961d"
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.659039 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-djmq6-config-6ccgx"
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.662514 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zrjls" event={"ID":"8560ebac-334f-4332-b324-cdb297a94b1a","Type":"ContainerDied","Data":"b8c940ffd2808689a9da3bc2d7f8c95ee43229d10a3c9d4eb5fdacb3256bde90"}
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.662611 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8c940ffd2808689a9da3bc2d7f8c95ee43229d10a3c9d4eb5fdacb3256bde90"
Mar 18 09:22:31 crc kubenswrapper[4778]: I0318 09:22:31.662802 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zrjls" Mar 18 09:22:32 crc kubenswrapper[4778]: I0318 09:22:32.156227 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-djmq6-config-6ccgx"] Mar 18 09:22:32 crc kubenswrapper[4778]: I0318 09:22:32.162053 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-djmq6-config-6ccgx"] Mar 18 09:22:32 crc kubenswrapper[4778]: I0318 09:22:32.196059 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20e799eb-9c49-4025-98e1-b25be1bac66a" path="/var/lib/kubelet/pods/20e799eb-9c49-4025-98e1-b25be1bac66a/volumes" Mar 18 09:22:35 crc kubenswrapper[4778]: I0318 09:22:35.715095 4778 generic.go:334] "Generic (PLEG): container finished" podID="5dadb643-21f7-497a-992f-41ab80c704c5" containerID="0cbb671a57344b775f4ff6d3749d585e5e115edb6d8d987453203f44ff882ff0" exitCode=0 Mar 18 09:22:35 crc kubenswrapper[4778]: I0318 09:22:35.715239 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b66ph" event={"ID":"5dadb643-21f7-497a-992f-41ab80c704c5","Type":"ContainerDied","Data":"0cbb671a57344b775f4ff6d3749d585e5e115edb6d8d987453203f44ff882ff0"} Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.121795 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-b66ph" Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.277963 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c69rn\" (UniqueName: \"kubernetes.io/projected/5dadb643-21f7-497a-992f-41ab80c704c5-kube-api-access-c69rn\") pod \"5dadb643-21f7-497a-992f-41ab80c704c5\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.278161 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-config-data\") pod \"5dadb643-21f7-497a-992f-41ab80c704c5\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.278259 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-db-sync-config-data\") pod \"5dadb643-21f7-497a-992f-41ab80c704c5\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.278283 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-combined-ca-bundle\") pod \"5dadb643-21f7-497a-992f-41ab80c704c5\" (UID: \"5dadb643-21f7-497a-992f-41ab80c704c5\") " Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.285289 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dadb643-21f7-497a-992f-41ab80c704c5-kube-api-access-c69rn" (OuterVolumeSpecName: "kube-api-access-c69rn") pod "5dadb643-21f7-497a-992f-41ab80c704c5" (UID: "5dadb643-21f7-497a-992f-41ab80c704c5"). InnerVolumeSpecName "kube-api-access-c69rn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.286009 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5dadb643-21f7-497a-992f-41ab80c704c5" (UID: "5dadb643-21f7-497a-992f-41ab80c704c5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.302524 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dadb643-21f7-497a-992f-41ab80c704c5" (UID: "5dadb643-21f7-497a-992f-41ab80c704c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.331460 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-config-data" (OuterVolumeSpecName: "config-data") pod "5dadb643-21f7-497a-992f-41ab80c704c5" (UID: "5dadb643-21f7-497a-992f-41ab80c704c5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.379863 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.379909 4778 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.379925 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dadb643-21f7-497a-992f-41ab80c704c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.379940 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c69rn\" (UniqueName: \"kubernetes.io/projected/5dadb643-21f7-497a-992f-41ab80c704c5-kube-api-access-c69rn\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.735546 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b66ph" event={"ID":"5dadb643-21f7-497a-992f-41ab80c704c5","Type":"ContainerDied","Data":"2d28880ba4925c32ee75f685408b2e6019fc2833229df2500837da232ac04367"} Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.735597 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d28880ba4925c32ee75f685408b2e6019fc2833229df2500837da232ac04367" Mar 18 09:22:37 crc kubenswrapper[4778]: I0318 09:22:37.735665 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-b66ph" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.179607 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-npjfc"] Mar 18 09:22:38 crc kubenswrapper[4778]: E0318 09:22:38.180578 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dadb643-21f7-497a-992f-41ab80c704c5" containerName="glance-db-sync" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.180596 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dadb643-21f7-497a-992f-41ab80c704c5" containerName="glance-db-sync" Mar 18 09:22:38 crc kubenswrapper[4778]: E0318 09:22:38.180614 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e799eb-9c49-4025-98e1-b25be1bac66a" containerName="ovn-config" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.180622 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e799eb-9c49-4025-98e1-b25be1bac66a" containerName="ovn-config" Mar 18 09:22:38 crc kubenswrapper[4778]: E0318 09:22:38.180636 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8560ebac-334f-4332-b324-cdb297a94b1a" containerName="mariadb-account-create-update" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.180644 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8560ebac-334f-4332-b324-cdb297a94b1a" containerName="mariadb-account-create-update" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.180825 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dadb643-21f7-497a-992f-41ab80c704c5" containerName="glance-db-sync" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.180839 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="20e799eb-9c49-4025-98e1-b25be1bac66a" containerName="ovn-config" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.180847 4778 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8560ebac-334f-4332-b324-cdb297a94b1a" containerName="mariadb-account-create-update" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.181873 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.256020 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-npjfc"] Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.298930 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-config\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.299000 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-dns-svc\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.299039 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-ovsdbserver-nb\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.299998 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-ovsdbserver-sb\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " 
pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.300074 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68vb9\" (UniqueName: \"kubernetes.io/projected/4a490b75-6853-41f7-b5b3-46243c4c2166-kube-api-access-68vb9\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.402612 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-ovsdbserver-nb\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.402753 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-ovsdbserver-sb\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.402804 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68vb9\" (UniqueName: \"kubernetes.io/projected/4a490b75-6853-41f7-b5b3-46243c4c2166-kube-api-access-68vb9\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.402895 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-config\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " 
pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.402935 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-dns-svc\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.403902 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-ovsdbserver-nb\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.404104 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-dns-svc\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.404326 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-config\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.404799 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-ovsdbserver-sb\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.425629 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68vb9\" (UniqueName: \"kubernetes.io/projected/4a490b75-6853-41f7-b5b3-46243c4c2166-kube-api-access-68vb9\") pod \"dnsmasq-dns-554567b4f7-npjfc\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:38 crc kubenswrapper[4778]: I0318 09:22:38.561112 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:39 crc kubenswrapper[4778]: I0318 09:22:39.011496 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-npjfc"] Mar 18 09:22:39 crc kubenswrapper[4778]: I0318 09:22:39.754444 4778 generic.go:334] "Generic (PLEG): container finished" podID="4a490b75-6853-41f7-b5b3-46243c4c2166" containerID="434cdf2be80022d070ec54085d25f3459b4f9eebfed357f2ab22cbeac663278b" exitCode=0 Mar 18 09:22:39 crc kubenswrapper[4778]: I0318 09:22:39.754495 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" event={"ID":"4a490b75-6853-41f7-b5b3-46243c4c2166","Type":"ContainerDied","Data":"434cdf2be80022d070ec54085d25f3459b4f9eebfed357f2ab22cbeac663278b"} Mar 18 09:22:39 crc kubenswrapper[4778]: I0318 09:22:39.755084 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" event={"ID":"4a490b75-6853-41f7-b5b3-46243c4c2166","Type":"ContainerStarted","Data":"5ee3b13ece4d9acc78176c498a90576572d6699436ce526238a6b4027ba90016"} Mar 18 09:22:39 crc kubenswrapper[4778]: I0318 09:22:39.764573 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:22:40 crc kubenswrapper[4778]: I0318 09:22:40.081472 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 09:22:40 crc kubenswrapper[4778]: I0318 09:22:40.765748 4778 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" event={"ID":"4a490b75-6853-41f7-b5b3-46243c4c2166","Type":"ContainerStarted","Data":"28ca12d3a58a03f2fd89d9677d86730c400a702b9f62ca98b238e2b7a92046fd"} Mar 18 09:22:40 crc kubenswrapper[4778]: I0318 09:22:40.765979 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:22:40 crc kubenswrapper[4778]: I0318 09:22:40.785321 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" podStartSLOduration=2.785299626 podStartE2EDuration="2.785299626s" podCreationTimestamp="2026-03-18 09:22:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:22:40.783997041 +0000 UTC m=+1227.358741911" watchObservedRunningTime="2026-03-18 09:22:40.785299626 +0000 UTC m=+1227.360044476" Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.716699 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-sz5dt"] Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.717959 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sz5dt" Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.729254 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-sz5dt"] Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.823227 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b89b-account-create-update-ff8z8"] Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.824572 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b89b-account-create-update-ff8z8" Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.827694 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.844101 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b89b-account-create-update-ff8z8"] Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.864081 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nlgg\" (UniqueName: \"kubernetes.io/projected/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e-kube-api-access-7nlgg\") pod \"cinder-db-create-sz5dt\" (UID: \"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e\") " pod="openstack/cinder-db-create-sz5dt" Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.864271 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e-operator-scripts\") pod \"cinder-db-create-sz5dt\" (UID: \"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e\") " pod="openstack/cinder-db-create-sz5dt" Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.965941 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nlgg\" (UniqueName: \"kubernetes.io/projected/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e-kube-api-access-7nlgg\") pod \"cinder-db-create-sz5dt\" (UID: \"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e\") " pod="openstack/cinder-db-create-sz5dt" Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.966046 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/320c5adc-a7d8-47a3-893b-7614c755446d-operator-scripts\") pod \"cinder-b89b-account-create-update-ff8z8\" (UID: \"320c5adc-a7d8-47a3-893b-7614c755446d\") " 
pod="openstack/cinder-b89b-account-create-update-ff8z8" Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.966084 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e-operator-scripts\") pod \"cinder-db-create-sz5dt\" (UID: \"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e\") " pod="openstack/cinder-db-create-sz5dt" Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.966239 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjmzb\" (UniqueName: \"kubernetes.io/projected/320c5adc-a7d8-47a3-893b-7614c755446d-kube-api-access-hjmzb\") pod \"cinder-b89b-account-create-update-ff8z8\" (UID: \"320c5adc-a7d8-47a3-893b-7614c755446d\") " pod="openstack/cinder-b89b-account-create-update-ff8z8" Mar 18 09:22:41 crc kubenswrapper[4778]: I0318 09:22:41.966797 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e-operator-scripts\") pod \"cinder-db-create-sz5dt\" (UID: \"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e\") " pod="openstack/cinder-db-create-sz5dt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.006091 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nlgg\" (UniqueName: \"kubernetes.io/projected/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e-kube-api-access-7nlgg\") pod \"cinder-db-create-sz5dt\" (UID: \"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e\") " pod="openstack/cinder-db-create-sz5dt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.011929 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-2cxtn"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.012842 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-2cxtn" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.022509 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-6980-account-create-update-8lctt"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.024696 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6980-account-create-update-8lctt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.031950 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-2cxtn"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.032156 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.043213 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6980-account-create-update-8lctt"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.043713 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-sz5dt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.069522 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjmzb\" (UniqueName: \"kubernetes.io/projected/320c5adc-a7d8-47a3-893b-7614c755446d-kube-api-access-hjmzb\") pod \"cinder-b89b-account-create-update-ff8z8\" (UID: \"320c5adc-a7d8-47a3-893b-7614c755446d\") " pod="openstack/cinder-b89b-account-create-update-ff8z8" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.069625 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/320c5adc-a7d8-47a3-893b-7614c755446d-operator-scripts\") pod \"cinder-b89b-account-create-update-ff8z8\" (UID: \"320c5adc-a7d8-47a3-893b-7614c755446d\") " pod="openstack/cinder-b89b-account-create-update-ff8z8" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.071166 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/320c5adc-a7d8-47a3-893b-7614c755446d-operator-scripts\") pod \"cinder-b89b-account-create-update-ff8z8\" (UID: \"320c5adc-a7d8-47a3-893b-7614c755446d\") " pod="openstack/cinder-b89b-account-create-update-ff8z8" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.107940 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjmzb\" (UniqueName: \"kubernetes.io/projected/320c5adc-a7d8-47a3-893b-7614c755446d-kube-api-access-hjmzb\") pod \"cinder-b89b-account-create-update-ff8z8\" (UID: \"320c5adc-a7d8-47a3-893b-7614c755446d\") " pod="openstack/cinder-b89b-account-create-update-ff8z8" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.145593 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b89b-account-create-update-ff8z8" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.158817 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-q979b"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.160530 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-q979b" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.171008 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfjht\" (UniqueName: \"kubernetes.io/projected/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7-kube-api-access-vfjht\") pod \"barbican-db-create-2cxtn\" (UID: \"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7\") " pod="openstack/barbican-db-create-2cxtn" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.172892 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7-operator-scripts\") pod \"barbican-db-create-2cxtn\" (UID: \"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7\") " pod="openstack/barbican-db-create-2cxtn" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.172985 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9719662a-4248-4c3c-860b-1a9e6547876b-operator-scripts\") pod \"barbican-6980-account-create-update-8lctt\" (UID: \"9719662a-4248-4c3c-860b-1a9e6547876b\") " pod="openstack/barbican-6980-account-create-update-8lctt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.173006 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-q979b"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.173163 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-r9kfz\" (UniqueName: \"kubernetes.io/projected/9719662a-4248-4c3c-860b-1a9e6547876b-kube-api-access-r9kfz\") pod \"barbican-6980-account-create-update-8lctt\" (UID: \"9719662a-4248-4c3c-860b-1a9e6547876b\") " pod="openstack/barbican-6980-account-create-update-8lctt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.274895 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9kfz\" (UniqueName: \"kubernetes.io/projected/9719662a-4248-4c3c-860b-1a9e6547876b-kube-api-access-r9kfz\") pod \"barbican-6980-account-create-update-8lctt\" (UID: \"9719662a-4248-4c3c-860b-1a9e6547876b\") " pod="openstack/barbican-6980-account-create-update-8lctt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.275306 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-4e8f-account-create-update-ztvnt"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.277218 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4e8f-account-create-update-ztvnt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.280189 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfjht\" (UniqueName: \"kubernetes.io/projected/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7-kube-api-access-vfjht\") pod \"barbican-db-create-2cxtn\" (UID: \"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7\") " pod="openstack/barbican-db-create-2cxtn" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.280281 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7-operator-scripts\") pod \"barbican-db-create-2cxtn\" (UID: \"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7\") " pod="openstack/barbican-db-create-2cxtn" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.280344 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88j2m\" (UniqueName: \"kubernetes.io/projected/dca6e4b2-4722-4a45-b577-33f3c5090fc3-kube-api-access-88j2m\") pod \"neutron-db-create-q979b\" (UID: \"dca6e4b2-4722-4a45-b577-33f3c5090fc3\") " pod="openstack/neutron-db-create-q979b" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.280553 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.281817 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7-operator-scripts\") pod \"barbican-db-create-2cxtn\" (UID: \"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7\") " pod="openstack/barbican-db-create-2cxtn" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.292726 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4e8f-account-create-update-ztvnt"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.301687 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9719662a-4248-4c3c-860b-1a9e6547876b-operator-scripts\") pod \"barbican-6980-account-create-update-8lctt\" (UID: \"9719662a-4248-4c3c-860b-1a9e6547876b\") " pod="openstack/barbican-6980-account-create-update-8lctt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.301747 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dca6e4b2-4722-4a45-b577-33f3c5090fc3-operator-scripts\") pod \"neutron-db-create-q979b\" (UID: \"dca6e4b2-4722-4a45-b577-33f3c5090fc3\") " pod="openstack/neutron-db-create-q979b" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.302555 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9719662a-4248-4c3c-860b-1a9e6547876b-operator-scripts\") pod \"barbican-6980-account-create-update-8lctt\" (UID: \"9719662a-4248-4c3c-860b-1a9e6547876b\") " pod="openstack/barbican-6980-account-create-update-8lctt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.304902 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfjht\" (UniqueName: \"kubernetes.io/projected/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7-kube-api-access-vfjht\") pod \"barbican-db-create-2cxtn\" (UID: \"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7\") " pod="openstack/barbican-db-create-2cxtn" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.310775 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9kfz\" (UniqueName: \"kubernetes.io/projected/9719662a-4248-4c3c-860b-1a9e6547876b-kube-api-access-r9kfz\") pod \"barbican-6980-account-create-update-8lctt\" (UID: \"9719662a-4248-4c3c-860b-1a9e6547876b\") " pod="openstack/barbican-6980-account-create-update-8lctt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.316010 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-29tr5"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.317096 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-29tr5" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.319991 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.320286 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.320503 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vcgh4" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.320708 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.326814 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-29tr5"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.339098 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2cxtn" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.352822 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6980-account-create-update-8lctt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.403152 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cf66d17-48b6-4629-ae0c-e270afa0c88a-operator-scripts\") pod \"neutron-4e8f-account-create-update-ztvnt\" (UID: \"7cf66d17-48b6-4629-ae0c-e270afa0c88a\") " pod="openstack/neutron-4e8f-account-create-update-ztvnt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.403255 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88j2m\" (UniqueName: \"kubernetes.io/projected/dca6e4b2-4722-4a45-b577-33f3c5090fc3-kube-api-access-88j2m\") pod \"neutron-db-create-q979b\" (UID: \"dca6e4b2-4722-4a45-b577-33f3c5090fc3\") " pod="openstack/neutron-db-create-q979b" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.403280 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcbxm\" (UniqueName: \"kubernetes.io/projected/7cf66d17-48b6-4629-ae0c-e270afa0c88a-kube-api-access-vcbxm\") pod \"neutron-4e8f-account-create-update-ztvnt\" (UID: \"7cf66d17-48b6-4629-ae0c-e270afa0c88a\") " pod="openstack/neutron-4e8f-account-create-update-ztvnt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.403313 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dca6e4b2-4722-4a45-b577-33f3c5090fc3-operator-scripts\") pod \"neutron-db-create-q979b\" (UID: \"dca6e4b2-4722-4a45-b577-33f3c5090fc3\") " pod="openstack/neutron-db-create-q979b" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.404100 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dca6e4b2-4722-4a45-b577-33f3c5090fc3-operator-scripts\") pod 
\"neutron-db-create-q979b\" (UID: \"dca6e4b2-4722-4a45-b577-33f3c5090fc3\") " pod="openstack/neutron-db-create-q979b" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.433265 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88j2m\" (UniqueName: \"kubernetes.io/projected/dca6e4b2-4722-4a45-b577-33f3c5090fc3-kube-api-access-88j2m\") pod \"neutron-db-create-q979b\" (UID: \"dca6e4b2-4722-4a45-b577-33f3c5090fc3\") " pod="openstack/neutron-db-create-q979b" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.505299 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6efa68-d15c-4d69-bd52-853a7cef8299-combined-ca-bundle\") pod \"keystone-db-sync-29tr5\" (UID: \"bb6efa68-d15c-4d69-bd52-853a7cef8299\") " pod="openstack/keystone-db-sync-29tr5" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.505388 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cf66d17-48b6-4629-ae0c-e270afa0c88a-operator-scripts\") pod \"neutron-4e8f-account-create-update-ztvnt\" (UID: \"7cf66d17-48b6-4629-ae0c-e270afa0c88a\") " pod="openstack/neutron-4e8f-account-create-update-ztvnt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.505448 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcbxm\" (UniqueName: \"kubernetes.io/projected/7cf66d17-48b6-4629-ae0c-e270afa0c88a-kube-api-access-vcbxm\") pod \"neutron-4e8f-account-create-update-ztvnt\" (UID: \"7cf66d17-48b6-4629-ae0c-e270afa0c88a\") " pod="openstack/neutron-4e8f-account-create-update-ztvnt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.505531 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bb6efa68-d15c-4d69-bd52-853a7cef8299-config-data\") pod \"keystone-db-sync-29tr5\" (UID: \"bb6efa68-d15c-4d69-bd52-853a7cef8299\") " pod="openstack/keystone-db-sync-29tr5" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.505562 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwgrf\" (UniqueName: \"kubernetes.io/projected/bb6efa68-d15c-4d69-bd52-853a7cef8299-kube-api-access-nwgrf\") pod \"keystone-db-sync-29tr5\" (UID: \"bb6efa68-d15c-4d69-bd52-853a7cef8299\") " pod="openstack/keystone-db-sync-29tr5" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.506409 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cf66d17-48b6-4629-ae0c-e270afa0c88a-operator-scripts\") pod \"neutron-4e8f-account-create-update-ztvnt\" (UID: \"7cf66d17-48b6-4629-ae0c-e270afa0c88a\") " pod="openstack/neutron-4e8f-account-create-update-ztvnt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.523338 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcbxm\" (UniqueName: \"kubernetes.io/projected/7cf66d17-48b6-4629-ae0c-e270afa0c88a-kube-api-access-vcbxm\") pod \"neutron-4e8f-account-create-update-ztvnt\" (UID: \"7cf66d17-48b6-4629-ae0c-e270afa0c88a\") " pod="openstack/neutron-4e8f-account-create-update-ztvnt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.543782 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-q979b" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.607711 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb6efa68-d15c-4d69-bd52-853a7cef8299-config-data\") pod \"keystone-db-sync-29tr5\" (UID: \"bb6efa68-d15c-4d69-bd52-853a7cef8299\") " pod="openstack/keystone-db-sync-29tr5" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.607750 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwgrf\" (UniqueName: \"kubernetes.io/projected/bb6efa68-d15c-4d69-bd52-853a7cef8299-kube-api-access-nwgrf\") pod \"keystone-db-sync-29tr5\" (UID: \"bb6efa68-d15c-4d69-bd52-853a7cef8299\") " pod="openstack/keystone-db-sync-29tr5" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.607808 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6efa68-d15c-4d69-bd52-853a7cef8299-combined-ca-bundle\") pod \"keystone-db-sync-29tr5\" (UID: \"bb6efa68-d15c-4d69-bd52-853a7cef8299\") " pod="openstack/keystone-db-sync-29tr5" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.611640 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6efa68-d15c-4d69-bd52-853a7cef8299-combined-ca-bundle\") pod \"keystone-db-sync-29tr5\" (UID: \"bb6efa68-d15c-4d69-bd52-853a7cef8299\") " pod="openstack/keystone-db-sync-29tr5" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.614765 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb6efa68-d15c-4d69-bd52-853a7cef8299-config-data\") pod \"keystone-db-sync-29tr5\" (UID: \"bb6efa68-d15c-4d69-bd52-853a7cef8299\") " pod="openstack/keystone-db-sync-29tr5" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 
09:22:42.622691 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4e8f-account-create-update-ztvnt" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.630734 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwgrf\" (UniqueName: \"kubernetes.io/projected/bb6efa68-d15c-4d69-bd52-853a7cef8299-kube-api-access-nwgrf\") pod \"keystone-db-sync-29tr5\" (UID: \"bb6efa68-d15c-4d69-bd52-853a7cef8299\") " pod="openstack/keystone-db-sync-29tr5" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.633723 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-29tr5" Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.669664 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-sz5dt"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.817559 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sz5dt" event={"ID":"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e","Type":"ContainerStarted","Data":"0165c76373e8bba304aac3a9dbc098778cf0c36e469c92578c0e2cf22008740f"} Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.823480 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-2cxtn"] Mar 18 09:22:42 crc kubenswrapper[4778]: W0318 09:22:42.851435 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60fea5d6_a85d_40e3_81ef_1d499ba2ebf7.slice/crio-2e7478ba77bcc10d706da8ee103c2baadaecc953ed1cb3b02addae36d8d8e954 WatchSource:0}: Error finding container 2e7478ba77bcc10d706da8ee103c2baadaecc953ed1cb3b02addae36d8d8e954: Status 404 returned error can't find the container with id 2e7478ba77bcc10d706da8ee103c2baadaecc953ed1cb3b02addae36d8d8e954 Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.885948 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-b89b-account-create-update-ff8z8"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.921951 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6980-account-create-update-8lctt"] Mar 18 09:22:42 crc kubenswrapper[4778]: I0318 09:22:42.955793 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-q979b"] Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.227169 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4e8f-account-create-update-ztvnt"] Mar 18 09:22:43 crc kubenswrapper[4778]: W0318 09:22:43.242223 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cf66d17_48b6_4629_ae0c_e270afa0c88a.slice/crio-bb44bfd1417a265b81c690612f8175c586724077fee8efd10a4a946e20794520 WatchSource:0}: Error finding container bb44bfd1417a265b81c690612f8175c586724077fee8efd10a4a946e20794520: Status 404 returned error can't find the container with id bb44bfd1417a265b81c690612f8175c586724077fee8efd10a4a946e20794520 Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.319059 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-29tr5"] Mar 18 09:22:43 crc kubenswrapper[4778]: W0318 09:22:43.328620 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb6efa68_d15c_4d69_bd52_853a7cef8299.slice/crio-2a27443ef48ba4cd692dcd39b8ef54b2fc528f5d9adc50b0359b82445083a3ba WatchSource:0}: Error finding container 2a27443ef48ba4cd692dcd39b8ef54b2fc528f5d9adc50b0359b82445083a3ba: Status 404 returned error can't find the container with id 2a27443ef48ba4cd692dcd39b8ef54b2fc528f5d9adc50b0359b82445083a3ba Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.827248 4778 generic.go:334] "Generic (PLEG): container finished" podID="320c5adc-a7d8-47a3-893b-7614c755446d" 
containerID="61d521ae036849914a4701bb867f10a55ba71a80cea0eab40620c4a6aa10638d" exitCode=0 Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.827557 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b89b-account-create-update-ff8z8" event={"ID":"320c5adc-a7d8-47a3-893b-7614c755446d","Type":"ContainerDied","Data":"61d521ae036849914a4701bb867f10a55ba71a80cea0eab40620c4a6aa10638d"} Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.827628 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b89b-account-create-update-ff8z8" event={"ID":"320c5adc-a7d8-47a3-893b-7614c755446d","Type":"ContainerStarted","Data":"d5bd6041f3d6a690cef4f200e37ab1d7e48da6ea531c761369a864292600b47c"} Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.829721 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-29tr5" event={"ID":"bb6efa68-d15c-4d69-bd52-853a7cef8299","Type":"ContainerStarted","Data":"2a27443ef48ba4cd692dcd39b8ef54b2fc528f5d9adc50b0359b82445083a3ba"} Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.847553 4778 generic.go:334] "Generic (PLEG): container finished" podID="35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e" containerID="70d53574867291895895df87d2a68bed084a005ff2e35622dba06f6dac00a1ee" exitCode=0 Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.847918 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sz5dt" event={"ID":"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e","Type":"ContainerDied","Data":"70d53574867291895895df87d2a68bed084a005ff2e35622dba06f6dac00a1ee"} Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.850268 4778 generic.go:334] "Generic (PLEG): container finished" podID="9719662a-4248-4c3c-860b-1a9e6547876b" containerID="76d9700b7eab0fc318bf79deafd901e277918a900858f790ff6f7d08ee5e2133" exitCode=0 Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.850421 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-6980-account-create-update-8lctt" event={"ID":"9719662a-4248-4c3c-860b-1a9e6547876b","Type":"ContainerDied","Data":"76d9700b7eab0fc318bf79deafd901e277918a900858f790ff6f7d08ee5e2133"} Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.850455 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6980-account-create-update-8lctt" event={"ID":"9719662a-4248-4c3c-860b-1a9e6547876b","Type":"ContainerStarted","Data":"6fb547a0f07403671e731bc4adef6aaddb845cc7a67233c9c5013a336629fe47"} Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.852678 4778 generic.go:334] "Generic (PLEG): container finished" podID="60fea5d6-a85d-40e3-81ef-1d499ba2ebf7" containerID="aa018cf8a109c6b5750c2118b0eb74eb108759f278376c856e84638cf2d31164" exitCode=0 Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.852758 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2cxtn" event={"ID":"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7","Type":"ContainerDied","Data":"aa018cf8a109c6b5750c2118b0eb74eb108759f278376c856e84638cf2d31164"} Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.852807 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2cxtn" event={"ID":"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7","Type":"ContainerStarted","Data":"2e7478ba77bcc10d706da8ee103c2baadaecc953ed1cb3b02addae36d8d8e954"} Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.855509 4778 generic.go:334] "Generic (PLEG): container finished" podID="dca6e4b2-4722-4a45-b577-33f3c5090fc3" containerID="c1c9a9ac26d842e9f380c2bc10d90467713b713ce0c3ac97f08ea834682773ee" exitCode=0 Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.855588 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-q979b" event={"ID":"dca6e4b2-4722-4a45-b577-33f3c5090fc3","Type":"ContainerDied","Data":"c1c9a9ac26d842e9f380c2bc10d90467713b713ce0c3ac97f08ea834682773ee"} Mar 18 09:22:43 crc 
kubenswrapper[4778]: I0318 09:22:43.855610 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-q979b" event={"ID":"dca6e4b2-4722-4a45-b577-33f3c5090fc3","Type":"ContainerStarted","Data":"90b9fbf4c7f0c1bd9271bb45aa6b12e259b1bdfb733dd018092b94b0a0987c32"} Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.861419 4778 generic.go:334] "Generic (PLEG): container finished" podID="7cf66d17-48b6-4629-ae0c-e270afa0c88a" containerID="1b70618e3e5fc20f170a550bf06d195893c5a61a58727ad242d6881bbcef4e7a" exitCode=0 Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.861463 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4e8f-account-create-update-ztvnt" event={"ID":"7cf66d17-48b6-4629-ae0c-e270afa0c88a","Type":"ContainerDied","Data":"1b70618e3e5fc20f170a550bf06d195893c5a61a58727ad242d6881bbcef4e7a"} Mar 18 09:22:43 crc kubenswrapper[4778]: I0318 09:22:43.861489 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4e8f-account-create-update-ztvnt" event={"ID":"7cf66d17-48b6-4629-ae0c-e270afa0c88a","Type":"ContainerStarted","Data":"bb44bfd1417a265b81c690612f8175c586724077fee8efd10a4a946e20794520"} Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.787041 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sz5dt" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.800960 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4e8f-account-create-update-ztvnt" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.842824 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6980-account-create-update-8lctt" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.851083 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-q979b" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.868739 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b89b-account-create-update-ff8z8" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.905451 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2cxtn" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.907382 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcbxm\" (UniqueName: \"kubernetes.io/projected/7cf66d17-48b6-4629-ae0c-e270afa0c88a-kube-api-access-vcbxm\") pod \"7cf66d17-48b6-4629-ae0c-e270afa0c88a\" (UID: \"7cf66d17-48b6-4629-ae0c-e270afa0c88a\") " Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.907449 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dca6e4b2-4722-4a45-b577-33f3c5090fc3-operator-scripts\") pod \"dca6e4b2-4722-4a45-b577-33f3c5090fc3\" (UID: \"dca6e4b2-4722-4a45-b577-33f3c5090fc3\") " Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.907503 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nlgg\" (UniqueName: \"kubernetes.io/projected/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e-kube-api-access-7nlgg\") pod \"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e\" (UID: \"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e\") " Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.908752 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dca6e4b2-4722-4a45-b577-33f3c5090fc3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dca6e4b2-4722-4a45-b577-33f3c5090fc3" (UID: "dca6e4b2-4722-4a45-b577-33f3c5090fc3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.910297 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e-operator-scripts\") pod \"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e\" (UID: \"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e\") " Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.910362 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9kfz\" (UniqueName: \"kubernetes.io/projected/9719662a-4248-4c3c-860b-1a9e6547876b-kube-api-access-r9kfz\") pod \"9719662a-4248-4c3c-860b-1a9e6547876b\" (UID: \"9719662a-4248-4c3c-860b-1a9e6547876b\") " Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.910413 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9719662a-4248-4c3c-860b-1a9e6547876b-operator-scripts\") pod \"9719662a-4248-4c3c-860b-1a9e6547876b\" (UID: \"9719662a-4248-4c3c-860b-1a9e6547876b\") " Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.910483 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88j2m\" (UniqueName: \"kubernetes.io/projected/dca6e4b2-4722-4a45-b577-33f3c5090fc3-kube-api-access-88j2m\") pod \"dca6e4b2-4722-4a45-b577-33f3c5090fc3\" (UID: \"dca6e4b2-4722-4a45-b577-33f3c5090fc3\") " Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.910542 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cf66d17-48b6-4629-ae0c-e270afa0c88a-operator-scripts\") pod \"7cf66d17-48b6-4629-ae0c-e270afa0c88a\" (UID: \"7cf66d17-48b6-4629-ae0c-e270afa0c88a\") " Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.911915 4778 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dca6e4b2-4722-4a45-b577-33f3c5090fc3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.914975 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2cxtn" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.915147 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e" (UID: "35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.915796 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2cxtn" event={"ID":"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7","Type":"ContainerDied","Data":"2e7478ba77bcc10d706da8ee103c2baadaecc953ed1cb3b02addae36d8d8e954"} Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.915912 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e7478ba77bcc10d706da8ee103c2baadaecc953ed1cb3b02addae36d8d8e954" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.916114 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9719662a-4248-4c3c-860b-1a9e6547876b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9719662a-4248-4c3c-860b-1a9e6547876b" (UID: "9719662a-4248-4c3c-860b-1a9e6547876b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.916283 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cf66d17-48b6-4629-ae0c-e270afa0c88a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7cf66d17-48b6-4629-ae0c-e270afa0c88a" (UID: "7cf66d17-48b6-4629-ae0c-e270afa0c88a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.919708 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-q979b" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.920469 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-q979b" event={"ID":"dca6e4b2-4722-4a45-b577-33f3c5090fc3","Type":"ContainerDied","Data":"90b9fbf4c7f0c1bd9271bb45aa6b12e259b1bdfb733dd018092b94b0a0987c32"} Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.920517 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90b9fbf4c7f0c1bd9271bb45aa6b12e259b1bdfb733dd018092b94b0a0987c32" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.921515 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dca6e4b2-4722-4a45-b577-33f3c5090fc3-kube-api-access-88j2m" (OuterVolumeSpecName: "kube-api-access-88j2m") pod "dca6e4b2-4722-4a45-b577-33f3c5090fc3" (UID: "dca6e4b2-4722-4a45-b577-33f3c5090fc3"). InnerVolumeSpecName "kube-api-access-88j2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.924247 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4e8f-account-create-update-ztvnt"
Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.924330 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4e8f-account-create-update-ztvnt" event={"ID":"7cf66d17-48b6-4629-ae0c-e270afa0c88a","Type":"ContainerDied","Data":"bb44bfd1417a265b81c690612f8175c586724077fee8efd10a4a946e20794520"}
Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.924352 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb44bfd1417a265b81c690612f8175c586724077fee8efd10a4a946e20794520"
Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.924672 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cf66d17-48b6-4629-ae0c-e270afa0c88a-kube-api-access-vcbxm" (OuterVolumeSpecName: "kube-api-access-vcbxm") pod "7cf66d17-48b6-4629-ae0c-e270afa0c88a" (UID: "7cf66d17-48b6-4629-ae0c-e270afa0c88a"). InnerVolumeSpecName "kube-api-access-vcbxm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.925439 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e-kube-api-access-7nlgg" (OuterVolumeSpecName: "kube-api-access-7nlgg") pod "35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e" (UID: "35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e"). InnerVolumeSpecName "kube-api-access-7nlgg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.927304 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9719662a-4248-4c3c-860b-1a9e6547876b-kube-api-access-r9kfz" (OuterVolumeSpecName: "kube-api-access-r9kfz") pod "9719662a-4248-4c3c-860b-1a9e6547876b" (UID: "9719662a-4248-4c3c-860b-1a9e6547876b"). InnerVolumeSpecName "kube-api-access-r9kfz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.940955 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b89b-account-create-update-ff8z8" event={"ID":"320c5adc-a7d8-47a3-893b-7614c755446d","Type":"ContainerDied","Data":"d5bd6041f3d6a690cef4f200e37ab1d7e48da6ea531c761369a864292600b47c"}
Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.941003 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b89b-account-create-update-ff8z8"
Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.941015 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5bd6041f3d6a690cef4f200e37ab1d7e48da6ea531c761369a864292600b47c"
Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.943854 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sz5dt"
Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.947336 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sz5dt" event={"ID":"35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e","Type":"ContainerDied","Data":"0165c76373e8bba304aac3a9dbc098778cf0c36e469c92578c0e2cf22008740f"}
Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.947689 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0165c76373e8bba304aac3a9dbc098778cf0c36e469c92578c0e2cf22008740f"
Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.958402 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6980-account-create-update-8lctt"
Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.958731 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6980-account-create-update-8lctt" event={"ID":"9719662a-4248-4c3c-860b-1a9e6547876b","Type":"ContainerDied","Data":"6fb547a0f07403671e731bc4adef6aaddb845cc7a67233c9c5013a336629fe47"}
Mar 18 09:22:47 crc kubenswrapper[4778]: I0318 09:22:47.958803 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fb547a0f07403671e731bc4adef6aaddb845cc7a67233c9c5013a336629fe47"
Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.012529 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfjht\" (UniqueName: \"kubernetes.io/projected/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7-kube-api-access-vfjht\") pod \"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7\" (UID: \"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7\") "
Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.012722 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjmzb\" (UniqueName: \"kubernetes.io/projected/320c5adc-a7d8-47a3-893b-7614c755446d-kube-api-access-hjmzb\") pod \"320c5adc-a7d8-47a3-893b-7614c755446d\" (UID: \"320c5adc-a7d8-47a3-893b-7614c755446d\") "
Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.012813 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7-operator-scripts\") pod \"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7\" (UID: \"60fea5d6-a85d-40e3-81ef-1d499ba2ebf7\") "
Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.012836 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/320c5adc-a7d8-47a3-893b-7614c755446d-operator-scripts\") pod \"320c5adc-a7d8-47a3-893b-7614c755446d\" (UID: \"320c5adc-a7d8-47a3-893b-7614c755446d\") "
Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.013156 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.013175 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9kfz\" (UniqueName: \"kubernetes.io/projected/9719662a-4248-4c3c-860b-1a9e6547876b-kube-api-access-r9kfz\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.013188 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9719662a-4248-4c3c-860b-1a9e6547876b-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.013216 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88j2m\" (UniqueName: \"kubernetes.io/projected/dca6e4b2-4722-4a45-b577-33f3c5090fc3-kube-api-access-88j2m\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.013229 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cf66d17-48b6-4629-ae0c-e270afa0c88a-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.013243 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcbxm\" (UniqueName: \"kubernetes.io/projected/7cf66d17-48b6-4629-ae0c-e270afa0c88a-kube-api-access-vcbxm\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.013257 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nlgg\" (UniqueName: \"kubernetes.io/projected/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e-kube-api-access-7nlgg\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.013373 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "60fea5d6-a85d-40e3-81ef-1d499ba2ebf7" (UID: "60fea5d6-a85d-40e3-81ef-1d499ba2ebf7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.014272 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/320c5adc-a7d8-47a3-893b-7614c755446d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "320c5adc-a7d8-47a3-893b-7614c755446d" (UID: "320c5adc-a7d8-47a3-893b-7614c755446d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.016172 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7-kube-api-access-vfjht" (OuterVolumeSpecName: "kube-api-access-vfjht") pod "60fea5d6-a85d-40e3-81ef-1d499ba2ebf7" (UID: "60fea5d6-a85d-40e3-81ef-1d499ba2ebf7"). InnerVolumeSpecName "kube-api-access-vfjht". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.017631 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/320c5adc-a7d8-47a3-893b-7614c755446d-kube-api-access-hjmzb" (OuterVolumeSpecName: "kube-api-access-hjmzb") pod "320c5adc-a7d8-47a3-893b-7614c755446d" (UID: "320c5adc-a7d8-47a3-893b-7614c755446d"). InnerVolumeSpecName "kube-api-access-hjmzb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.115579 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.116098 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/320c5adc-a7d8-47a3-893b-7614c755446d-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.116165 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfjht\" (UniqueName: \"kubernetes.io/projected/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7-kube-api-access-vfjht\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.116274 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjmzb\" (UniqueName: \"kubernetes.io/projected/320c5adc-a7d8-47a3-893b-7614c755446d-kube-api-access-hjmzb\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.563937 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-554567b4f7-npjfc"
Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.643265 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-jt8gb"]
Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.643519 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-jt8gb" podUID="d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" containerName="dnsmasq-dns" containerID="cri-o://0cb5262a9d3d057ca53ec602e21c523692c8679333829bacc24282be5307d258" gracePeriod=10
Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.979184 4778 generic.go:334] "Generic (PLEG): container finished" podID="d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" containerID="0cb5262a9d3d057ca53ec602e21c523692c8679333829bacc24282be5307d258" exitCode=0
Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.979641 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-jt8gb" event={"ID":"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5","Type":"ContainerDied","Data":"0cb5262a9d3d057ca53ec602e21c523692c8679333829bacc24282be5307d258"}
Mar 18 09:22:48 crc kubenswrapper[4778]: I0318 09:22:48.981296 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-29tr5" event={"ID":"bb6efa68-d15c-4d69-bd52-853a7cef8299","Type":"ContainerStarted","Data":"8c18edac4f9dbd5b2d5ca7397e82de877b46a7b70e08421fff14ad85201dac7d"}
Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.001316 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-29tr5" podStartSLOduration=2.7645939459999997 podStartE2EDuration="7.001293304s" podCreationTimestamp="2026-03-18 09:22:42 +0000 UTC" firstStartedPulling="2026-03-18 09:22:43.3333647 +0000 UTC m=+1229.908109540" lastFinishedPulling="2026-03-18 09:22:47.570064048 +0000 UTC m=+1234.144808898" observedRunningTime="2026-03-18 09:22:49.000539093 +0000 UTC m=+1235.575283953" watchObservedRunningTime="2026-03-18 09:22:49.001293304 +0000 UTC m=+1235.576038144"
Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.181777 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-jt8gb"
Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.245326 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-dns-svc\") pod \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") "
Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.245383 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-config\") pod \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") "
Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.245595 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-ovsdbserver-nb\") pod \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") "
Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.245624 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-ovsdbserver-sb\") pod \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") "
Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.245725 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqgnd\" (UniqueName: \"kubernetes.io/projected/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-kube-api-access-qqgnd\") pod \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\" (UID: \"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5\") "
Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.260774 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-kube-api-access-qqgnd" (OuterVolumeSpecName: "kube-api-access-qqgnd") pod "d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" (UID: "d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5"). InnerVolumeSpecName "kube-api-access-qqgnd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.316164 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" (UID: "d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.345661 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-config" (OuterVolumeSpecName: "config") pod "d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" (UID: "d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.347997 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqgnd\" (UniqueName: \"kubernetes.io/projected/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-kube-api-access-qqgnd\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.348036 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-config\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.348049 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.348695 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" (UID: "d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.361263 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" (UID: "d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.450312 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.450698 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.995499 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-jt8gb" event={"ID":"d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5","Type":"ContainerDied","Data":"95b556045a6555de0e6526548932d8f77419a56ad86f2e3afbe7000b4b9694c6"}
Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.995633 4778 scope.go:117] "RemoveContainer" containerID="0cb5262a9d3d057ca53ec602e21c523692c8679333829bacc24282be5307d258"
Mar 18 09:22:49 crc kubenswrapper[4778]: I0318 09:22:49.995528 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-jt8gb"
Mar 18 09:22:50 crc kubenswrapper[4778]: I0318 09:22:50.028620 4778 scope.go:117] "RemoveContainer" containerID="67e25fe46d44f377f33f04f4df4c1bf2f96243fad2d8a6135f426851e5ac8c58"
Mar 18 09:22:50 crc kubenswrapper[4778]: I0318 09:22:50.044786 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-jt8gb"]
Mar 18 09:22:50 crc kubenswrapper[4778]: I0318 09:22:50.068558 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-jt8gb"]
Mar 18 09:22:50 crc kubenswrapper[4778]: I0318 09:22:50.196519 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" path="/var/lib/kubelet/pods/d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5/volumes"
Mar 18 09:22:51 crc kubenswrapper[4778]: E0318 09:22:51.006658 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb6efa68_d15c_4d69_bd52_853a7cef8299.slice/crio-8c18edac4f9dbd5b2d5ca7397e82de877b46a7b70e08421fff14ad85201dac7d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb6efa68_d15c_4d69_bd52_853a7cef8299.slice/crio-conmon-8c18edac4f9dbd5b2d5ca7397e82de877b46a7b70e08421fff14ad85201dac7d.scope\": RecentStats: unable to find data in memory cache]"
Mar 18 09:22:51 crc kubenswrapper[4778]: I0318 09:22:51.010110 4778 generic.go:334] "Generic (PLEG): container finished" podID="bb6efa68-d15c-4d69-bd52-853a7cef8299" containerID="8c18edac4f9dbd5b2d5ca7397e82de877b46a7b70e08421fff14ad85201dac7d" exitCode=0
Mar 18 09:22:51 crc kubenswrapper[4778]: I0318 09:22:51.010176 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-29tr5" event={"ID":"bb6efa68-d15c-4d69-bd52-853a7cef8299","Type":"ContainerDied","Data":"8c18edac4f9dbd5b2d5ca7397e82de877b46a7b70e08421fff14ad85201dac7d"}
Mar 18 09:22:52 crc kubenswrapper[4778]: I0318 09:22:52.366495 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-29tr5"
Mar 18 09:22:52 crc kubenswrapper[4778]: I0318 09:22:52.504298 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwgrf\" (UniqueName: \"kubernetes.io/projected/bb6efa68-d15c-4d69-bd52-853a7cef8299-kube-api-access-nwgrf\") pod \"bb6efa68-d15c-4d69-bd52-853a7cef8299\" (UID: \"bb6efa68-d15c-4d69-bd52-853a7cef8299\") "
Mar 18 09:22:52 crc kubenswrapper[4778]: I0318 09:22:52.504436 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6efa68-d15c-4d69-bd52-853a7cef8299-combined-ca-bundle\") pod \"bb6efa68-d15c-4d69-bd52-853a7cef8299\" (UID: \"bb6efa68-d15c-4d69-bd52-853a7cef8299\") "
Mar 18 09:22:52 crc kubenswrapper[4778]: I0318 09:22:52.504515 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb6efa68-d15c-4d69-bd52-853a7cef8299-config-data\") pod \"bb6efa68-d15c-4d69-bd52-853a7cef8299\" (UID: \"bb6efa68-d15c-4d69-bd52-853a7cef8299\") "
Mar 18 09:22:52 crc kubenswrapper[4778]: I0318 09:22:52.511029 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb6efa68-d15c-4d69-bd52-853a7cef8299-kube-api-access-nwgrf" (OuterVolumeSpecName: "kube-api-access-nwgrf") pod "bb6efa68-d15c-4d69-bd52-853a7cef8299" (UID: "bb6efa68-d15c-4d69-bd52-853a7cef8299"). InnerVolumeSpecName "kube-api-access-nwgrf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:22:52 crc kubenswrapper[4778]: I0318 09:22:52.534718 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb6efa68-d15c-4d69-bd52-853a7cef8299-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb6efa68-d15c-4d69-bd52-853a7cef8299" (UID: "bb6efa68-d15c-4d69-bd52-853a7cef8299"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:22:52 crc kubenswrapper[4778]: I0318 09:22:52.564459 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb6efa68-d15c-4d69-bd52-853a7cef8299-config-data" (OuterVolumeSpecName: "config-data") pod "bb6efa68-d15c-4d69-bd52-853a7cef8299" (UID: "bb6efa68-d15c-4d69-bd52-853a7cef8299"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:22:52 crc kubenswrapper[4778]: I0318 09:22:52.607132 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb6efa68-d15c-4d69-bd52-853a7cef8299-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:52 crc kubenswrapper[4778]: I0318 09:22:52.607174 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwgrf\" (UniqueName: \"kubernetes.io/projected/bb6efa68-d15c-4d69-bd52-853a7cef8299-kube-api-access-nwgrf\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:52 crc kubenswrapper[4778]: I0318 09:22:52.607187 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6efa68-d15c-4d69-bd52-853a7cef8299-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.032500 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-29tr5" event={"ID":"bb6efa68-d15c-4d69-bd52-853a7cef8299","Type":"ContainerDied","Data":"2a27443ef48ba4cd692dcd39b8ef54b2fc528f5d9adc50b0359b82445083a3ba"}
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.032568 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a27443ef48ba4cd692dcd39b8ef54b2fc528f5d9adc50b0359b82445083a3ba"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.032573 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-29tr5"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.219374 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67795cd9-m2fhf"]
Mar 18 09:22:53 crc kubenswrapper[4778]: E0318 09:22:53.219787 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60fea5d6-a85d-40e3-81ef-1d499ba2ebf7" containerName="mariadb-database-create"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.219810 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="60fea5d6-a85d-40e3-81ef-1d499ba2ebf7" containerName="mariadb-database-create"
Mar 18 09:22:53 crc kubenswrapper[4778]: E0318 09:22:53.219827 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320c5adc-a7d8-47a3-893b-7614c755446d" containerName="mariadb-account-create-update"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.219839 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="320c5adc-a7d8-47a3-893b-7614c755446d" containerName="mariadb-account-create-update"
Mar 18 09:22:53 crc kubenswrapper[4778]: E0318 09:22:53.219853 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e" containerName="mariadb-database-create"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.219861 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e" containerName="mariadb-database-create"
Mar 18 09:22:53 crc kubenswrapper[4778]: E0318 09:22:53.219876 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" containerName="init"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.219883 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" containerName="init"
Mar 18 09:22:53 crc kubenswrapper[4778]: E0318 09:22:53.219902 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb6efa68-d15c-4d69-bd52-853a7cef8299" containerName="keystone-db-sync"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.219910 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb6efa68-d15c-4d69-bd52-853a7cef8299" containerName="keystone-db-sync"
Mar 18 09:22:53 crc kubenswrapper[4778]: E0318 09:22:53.219921 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9719662a-4248-4c3c-860b-1a9e6547876b" containerName="mariadb-account-create-update"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.219929 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="9719662a-4248-4c3c-860b-1a9e6547876b" containerName="mariadb-account-create-update"
Mar 18 09:22:53 crc kubenswrapper[4778]: E0318 09:22:53.219944 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" containerName="dnsmasq-dns"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.219951 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" containerName="dnsmasq-dns"
Mar 18 09:22:53 crc kubenswrapper[4778]: E0318 09:22:53.219965 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf66d17-48b6-4629-ae0c-e270afa0c88a" containerName="mariadb-account-create-update"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.219973 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf66d17-48b6-4629-ae0c-e270afa0c88a" containerName="mariadb-account-create-update"
Mar 18 09:22:53 crc kubenswrapper[4778]: E0318 09:22:53.219987 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dca6e4b2-4722-4a45-b577-33f3c5090fc3" containerName="mariadb-database-create"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.219995 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="dca6e4b2-4722-4a45-b577-33f3c5090fc3" containerName="mariadb-database-create"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.220198 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb6efa68-d15c-4d69-bd52-853a7cef8299" containerName="keystone-db-sync"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.220310 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="9719662a-4248-4c3c-860b-1a9e6547876b" containerName="mariadb-account-create-update"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.220324 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="320c5adc-a7d8-47a3-893b-7614c755446d" containerName="mariadb-account-create-update"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.220332 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="dca6e4b2-4722-4a45-b577-33f3c5090fc3" containerName="mariadb-database-create"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.220341 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d539dcf0-c5ce-4c6c-b367-e5c3d7dac5d5" containerName="dnsmasq-dns"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.220356 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="60fea5d6-a85d-40e3-81ef-1d499ba2ebf7" containerName="mariadb-database-create"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.220373 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf66d17-48b6-4629-ae0c-e270afa0c88a" containerName="mariadb-account-create-update"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.220385 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e" containerName="mariadb-database-create"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.228537 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67795cd9-m2fhf"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.239308 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-pttzb"]
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.240552 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pttzb"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.243976 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vcgh4"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.244241 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.244395 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.245145 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.246121 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.258691 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pttzb"]
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.312321 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-m2fhf"]
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.325890 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-ovsdbserver-sb\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.325962 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-config-data\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.326010 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npsqv\" (UniqueName: \"kubernetes.io/projected/45f70c00-938c-4f67-9c5b-4a88c90b62ae-kube-api-access-npsqv\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.326042 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-ovsdbserver-nb\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.326088 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-dns-svc\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.326120 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-fernet-keys\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.326163 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9lph\" (UniqueName: \"kubernetes.io/projected/835f3aad-57da-48e7-ac5a-f0635ee9bc98-kube-api-access-x9lph\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.326246 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-credential-keys\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.326283 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-scripts\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.326677 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-config\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.326723 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-combined-ca-bundle\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.445072 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-fernet-keys\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.445137 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9lph\" (UniqueName: \"kubernetes.io/projected/835f3aad-57da-48e7-ac5a-f0635ee9bc98-kube-api-access-x9lph\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.445182 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-credential-keys\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.445612 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-scripts\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb"
Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.445665 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-config\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: 
\"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.445694 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-combined-ca-bundle\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.445740 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-ovsdbserver-sb\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.445772 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-config-data\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.445823 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npsqv\" (UniqueName: \"kubernetes.io/projected/45f70c00-938c-4f67-9c5b-4a88c90b62ae-kube-api-access-npsqv\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.445857 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-ovsdbserver-nb\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " 
pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.445891 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-dns-svc\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.446955 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-dns-svc\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.449386 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-zbghp"] Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.451326 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-fernet-keys\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.459874 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-ovsdbserver-sb\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.460474 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-ovsdbserver-nb\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: 
\"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.461402 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-config\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.461722 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-config-data\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.463699 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-zbghp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.463694 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-credential-keys\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.466681 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-combined-ca-bundle\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.471266 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-67f887b5-9vsz7"] Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.472701 4778 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.475288 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-scripts\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.475545 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-tvhb7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.476767 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.490498 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.499508 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.499705 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.499819 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.499986 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-f4kcp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.500426 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-zbghp"] Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.520262 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67f887b5-9vsz7"] Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.523051 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-npsqv\" (UniqueName: \"kubernetes.io/projected/45f70c00-938c-4f67-9c5b-4a88c90b62ae-kube-api-access-npsqv\") pod \"dnsmasq-dns-67795cd9-m2fhf\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.523463 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9lph\" (UniqueName: \"kubernetes.io/projected/835f3aad-57da-48e7-ac5a-f0635ee9bc98-kube-api-access-x9lph\") pod \"keystone-bootstrap-pttzb\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.525898 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-jb4ss"] Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.526905 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.534394 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.534593 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-r86tv" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.534743 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.542422 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jb4ss"] Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.550008 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f507e196-94ca-4c4a-91f4-de3587084d30-scripts\") pod \"horizon-67f887b5-9vsz7\" (UID: 
\"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.550118 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f507e196-94ca-4c4a-91f4-de3587084d30-logs\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.550148 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f507e196-94ca-4c4a-91f4-de3587084d30-horizon-secret-key\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.550167 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcdfb\" (UniqueName: \"kubernetes.io/projected/4d42905f-c189-4021-834d-f2a81dae5a4a-kube-api-access-lcdfb\") pod \"neutron-db-sync-zbghp\" (UID: \"4d42905f-c189-4021-834d-f2a81dae5a4a\") " pod="openstack/neutron-db-sync-zbghp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.550192 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d42905f-c189-4021-834d-f2a81dae5a4a-config\") pod \"neutron-db-sync-zbghp\" (UID: \"4d42905f-c189-4021-834d-f2a81dae5a4a\") " pod="openstack/neutron-db-sync-zbghp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.550226 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d42905f-c189-4021-834d-f2a81dae5a4a-combined-ca-bundle\") pod \"neutron-db-sync-zbghp\" (UID: 
\"4d42905f-c189-4021-834d-f2a81dae5a4a\") " pod="openstack/neutron-db-sync-zbghp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.550252 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzl9p\" (UniqueName: \"kubernetes.io/projected/f507e196-94ca-4c4a-91f4-de3587084d30-kube-api-access-gzl9p\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.550269 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f507e196-94ca-4c4a-91f4-de3587084d30-config-data\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.553955 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.575865 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.629783 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-m2fhf"] Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651248 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fba399d9-71ac-41c3-912f-32ccc7fc6190-etc-machine-id\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651296 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-combined-ca-bundle\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651435 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-config-data\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651469 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9z5l\" (UniqueName: \"kubernetes.io/projected/fba399d9-71ac-41c3-912f-32ccc7fc6190-kube-api-access-c9z5l\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651638 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-db-sync-config-data\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651670 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f507e196-94ca-4c4a-91f4-de3587084d30-logs\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651702 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-scripts\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651747 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f507e196-94ca-4c4a-91f4-de3587084d30-horizon-secret-key\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651772 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcdfb\" (UniqueName: \"kubernetes.io/projected/4d42905f-c189-4021-834d-f2a81dae5a4a-kube-api-access-lcdfb\") pod \"neutron-db-sync-zbghp\" (UID: \"4d42905f-c189-4021-834d-f2a81dae5a4a\") " pod="openstack/neutron-db-sync-zbghp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651825 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d42905f-c189-4021-834d-f2a81dae5a4a-config\") pod \"neutron-db-sync-zbghp\" (UID: 
\"4d42905f-c189-4021-834d-f2a81dae5a4a\") " pod="openstack/neutron-db-sync-zbghp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651842 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d42905f-c189-4021-834d-f2a81dae5a4a-combined-ca-bundle\") pod \"neutron-db-sync-zbghp\" (UID: \"4d42905f-c189-4021-834d-f2a81dae5a4a\") " pod="openstack/neutron-db-sync-zbghp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651890 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzl9p\" (UniqueName: \"kubernetes.io/projected/f507e196-94ca-4c4a-91f4-de3587084d30-kube-api-access-gzl9p\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651917 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f507e196-94ca-4c4a-91f4-de3587084d30-config-data\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.651950 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f507e196-94ca-4c4a-91f4-de3587084d30-scripts\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.652813 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f507e196-94ca-4c4a-91f4-de3587084d30-scripts\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 
09:22:53.653054 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f507e196-94ca-4c4a-91f4-de3587084d30-logs\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.656507 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f507e196-94ca-4c4a-91f4-de3587084d30-config-data\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.660981 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f507e196-94ca-4c4a-91f4-de3587084d30-horizon-secret-key\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.667847 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d42905f-c189-4021-834d-f2a81dae5a4a-combined-ca-bundle\") pod \"neutron-db-sync-zbghp\" (UID: \"4d42905f-c189-4021-834d-f2a81dae5a4a\") " pod="openstack/neutron-db-sync-zbghp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.668493 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d42905f-c189-4021-834d-f2a81dae5a4a-config\") pod \"neutron-db-sync-zbghp\" (UID: \"4d42905f-c189-4021-834d-f2a81dae5a4a\") " pod="openstack/neutron-db-sync-zbghp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.688470 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-98prp"] Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.690548 4778 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-98prp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.695261 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-66r47" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.695458 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.703871 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcdfb\" (UniqueName: \"kubernetes.io/projected/4d42905f-c189-4021-834d-f2a81dae5a4a-kube-api-access-lcdfb\") pod \"neutron-db-sync-zbghp\" (UID: \"4d42905f-c189-4021-834d-f2a81dae5a4a\") " pod="openstack/neutron-db-sync-zbghp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.721785 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-98prp"] Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.727637 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzl9p\" (UniqueName: \"kubernetes.io/projected/f507e196-94ca-4c4a-91f4-de3587084d30-kube-api-access-gzl9p\") pod \"horizon-67f887b5-9vsz7\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.747277 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-2p9jg"] Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.748289 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.762692 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-4lsnj" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.762939 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.763422 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.763187 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-db-sync-config-data\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.763565 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-scripts\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.763588 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4135fc20-df28-4f8d-b244-aedd5ed57cc2-combined-ca-bundle\") pod \"barbican-db-sync-98prp\" (UID: \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\") " pod="openstack/barbican-db-sync-98prp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.763768 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4135fc20-df28-4f8d-b244-aedd5ed57cc2-db-sync-config-data\") pod 
\"barbican-db-sync-98prp\" (UID: \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\") " pod="openstack/barbican-db-sync-98prp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.763853 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fba399d9-71ac-41c3-912f-32ccc7fc6190-etc-machine-id\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.763891 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-combined-ca-bundle\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.763908 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gnxr\" (UniqueName: \"kubernetes.io/projected/4135fc20-df28-4f8d-b244-aedd5ed57cc2-kube-api-access-7gnxr\") pod \"barbican-db-sync-98prp\" (UID: \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\") " pod="openstack/barbican-db-sync-98prp" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.763957 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-config-data\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.763976 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9z5l\" (UniqueName: \"kubernetes.io/projected/fba399d9-71ac-41c3-912f-32ccc7fc6190-kube-api-access-c9z5l\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " 
pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.765064 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fba399d9-71ac-41c3-912f-32ccc7fc6190-etc-machine-id\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.772839 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-db-sync-config-data\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.780511 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-scripts\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.780974 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-nz24n"] Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.782570 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:53 crc kubenswrapper[4778]: I0318 09:22:53.781028 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-combined-ca-bundle\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.800022 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9z5l\" (UniqueName: \"kubernetes.io/projected/fba399d9-71ac-41c3-912f-32ccc7fc6190-kube-api-access-c9z5l\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.814087 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2p9jg"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.814217 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-config-data\") pod \"cinder-db-sync-jb4ss\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") " pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.849260 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-nz24n"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.865242 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-568467c8dc-v4vlb"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.866636 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.867992 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7f7g\" (UniqueName: \"kubernetes.io/projected/20eafe8e-c0b9-4463-bc12-8c0cd0359968-kube-api-access-s7f7g\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.868014 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-combined-ca-bundle\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.868059 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gnxr\" (UniqueName: \"kubernetes.io/projected/4135fc20-df28-4f8d-b244-aedd5ed57cc2-kube-api-access-7gnxr\") pod \"barbican-db-sync-98prp\" (UID: \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\") " pod="openstack/barbican-db-sync-98prp" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.868088 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-dns-svc\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.868104 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: 
\"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.868153 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20eafe8e-c0b9-4463-bc12-8c0cd0359968-logs\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.868193 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4135fc20-df28-4f8d-b244-aedd5ed57cc2-combined-ca-bundle\") pod \"barbican-db-sync-98prp\" (UID: \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\") " pod="openstack/barbican-db-sync-98prp" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.868226 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.868249 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-config\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.868285 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lwww\" (UniqueName: \"kubernetes.io/projected/0ccbef26-8b5f-4b83-885f-3c074207eb73-kube-api-access-4lwww\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: 
\"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.868303 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-scripts\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.868320 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-config-data\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.868338 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4135fc20-df28-4f8d-b244-aedd5ed57cc2-db-sync-config-data\") pod \"barbican-db-sync-98prp\" (UID: \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\") " pod="openstack/barbican-db-sync-98prp" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.884059 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4135fc20-df28-4f8d-b244-aedd5ed57cc2-combined-ca-bundle\") pod \"barbican-db-sync-98prp\" (UID: \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\") " pod="openstack/barbican-db-sync-98prp" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.884440 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4135fc20-df28-4f8d-b244-aedd5ed57cc2-db-sync-config-data\") pod \"barbican-db-sync-98prp\" (UID: \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\") " pod="openstack/barbican-db-sync-98prp" Mar 18 
09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.896712 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gnxr\" (UniqueName: \"kubernetes.io/projected/4135fc20-df28-4f8d-b244-aedd5ed57cc2-kube-api-access-7gnxr\") pod \"barbican-db-sync-98prp\" (UID: \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\") " pod="openstack/barbican-db-sync-98prp" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.896767 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-568467c8dc-v4vlb"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.905091 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.907737 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.909644 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.909850 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.922155 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.969818 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-config\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.969871 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lwww\" (UniqueName: \"kubernetes.io/projected/0ccbef26-8b5f-4b83-885f-3c074207eb73-kube-api-access-4lwww\") pod 
\"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.969890 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-scripts\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.969909 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-config-data\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.969934 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14a62749-8336-4894-a162-85350096aef4-config-data\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.969967 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7f7g\" (UniqueName: \"kubernetes.io/projected/20eafe8e-c0b9-4463-bc12-8c0cd0359968-kube-api-access-s7f7g\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.969982 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-combined-ca-bundle\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " 
pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.970003 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psmpb\" (UniqueName: \"kubernetes.io/projected/14a62749-8336-4894-a162-85350096aef4-kube-api-access-psmpb\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.970035 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.970052 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-dns-svc\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.970067 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a62749-8336-4894-a162-85350096aef4-logs\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.970091 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/14a62749-8336-4894-a162-85350096aef4-horizon-secret-key\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " 
pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.970130 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20eafe8e-c0b9-4463-bc12-8c0cd0359968-logs\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.970149 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14a62749-8336-4894-a162-85350096aef4-scripts\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.970166 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.970918 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-config\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.971583 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.971628 4778 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-zbghp" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.972539 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.972655 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-dns-svc\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.977498 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20eafe8e-c0b9-4463-bc12-8c0cd0359968-logs\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.978004 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-config-data\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.980826 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-scripts\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.989756 4778 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.989878 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-combined-ca-bundle\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.994591 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lwww\" (UniqueName: \"kubernetes.io/projected/0ccbef26-8b5f-4b83-885f-3c074207eb73-kube-api-access-4lwww\") pod \"dnsmasq-dns-5b6dbdb6f5-nz24n\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:53.994846 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7f7g\" (UniqueName: \"kubernetes.io/projected/20eafe8e-c0b9-4463-bc12-8c0cd0359968-kube-api-access-s7f7g\") pod \"placement-db-sync-2p9jg\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.008687 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jb4ss" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.045863 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-98prp" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.071310 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d609db91-e011-4ac4-91a6-a9ba51f3918e-log-httpd\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.071374 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t77rr\" (UniqueName: \"kubernetes.io/projected/d609db91-e011-4ac4-91a6-a9ba51f3918e-kube-api-access-t77rr\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.071400 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.071431 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psmpb\" (UniqueName: \"kubernetes.io/projected/14a62749-8336-4894-a162-85350096aef4-kube-api-access-psmpb\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.071468 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a62749-8336-4894-a162-85350096aef4-logs\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: 
I0318 09:22:54.071493 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/14a62749-8336-4894-a162-85350096aef4-horizon-secret-key\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.071510 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.071554 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d609db91-e011-4ac4-91a6-a9ba51f3918e-run-httpd\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.071571 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14a62749-8336-4894-a162-85350096aef4-scripts\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.071600 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-scripts\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.071627 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-config-data\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.071655 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14a62749-8336-4894-a162-85350096aef4-config-data\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.072917 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14a62749-8336-4894-a162-85350096aef4-config-data\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.076875 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a62749-8336-4894-a162-85350096aef4-logs\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.078886 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14a62749-8336-4894-a162-85350096aef4-scripts\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.083555 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/14a62749-8336-4894-a162-85350096aef4-horizon-secret-key\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " 
pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.098734 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psmpb\" (UniqueName: \"kubernetes.io/projected/14a62749-8336-4894-a162-85350096aef4-kube-api-access-psmpb\") pod \"horizon-568467c8dc-v4vlb\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.101347 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2p9jg" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.158345 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.173004 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.173081 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d609db91-e011-4ac4-91a6-a9ba51f3918e-run-httpd\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.173164 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-scripts\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.173207 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-config-data\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.173247 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d609db91-e011-4ac4-91a6-a9ba51f3918e-log-httpd\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.173266 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t77rr\" (UniqueName: \"kubernetes.io/projected/d609db91-e011-4ac4-91a6-a9ba51f3918e-kube-api-access-t77rr\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.173296 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.180238 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.181904 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-scripts\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 
09:22:54.183328 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-config-data\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.183602 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d609db91-e011-4ac4-91a6-a9ba51f3918e-log-httpd\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.184406 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d609db91-e011-4ac4-91a6-a9ba51f3918e-run-httpd\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.190663 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.204400 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.208962 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t77rr\" (UniqueName: \"kubernetes.io/projected/d609db91-e011-4ac4-91a6-a9ba51f3918e-kube-api-access-t77rr\") pod \"ceilometer-0\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") " pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:54.293789 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.422514 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-568467c8dc-v4vlb"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.457767 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7967bcbb45-bl6c8"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.460726 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.495050 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7967bcbb45-bl6c8"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.598989 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-scripts\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.599084 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-config-data\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.599115 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-horizon-secret-key\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.599145 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l9tz\" (UniqueName: \"kubernetes.io/projected/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-kube-api-access-8l9tz\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.599172 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-logs\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.700547 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-scripts\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.700643 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-config-data\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.700689 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-horizon-secret-key\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.700729 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l9tz\" 
(UniqueName: \"kubernetes.io/projected/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-kube-api-access-8l9tz\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.700768 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-logs\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.701430 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-logs\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.702170 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-scripts\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.703458 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-config-data\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.709075 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-horizon-secret-key\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") 
" pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.727889 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l9tz\" (UniqueName: \"kubernetes.io/projected/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-kube-api-access-8l9tz\") pod \"horizon-7967bcbb45-bl6c8\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") " pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.783484 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.833332 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jb4ss"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.852453 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:22:55 crc kubenswrapper[4778]: W0318 09:22:55.878646 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfba399d9_71ac_41c3_912f_32ccc7fc6190.slice/crio-fef33d3c7f1855b88b205e58a953db1eba3e6979fcc1430ebec66414bb961b50 WatchSource:0}: Error finding container fef33d3c7f1855b88b205e58a953db1eba3e6979fcc1430ebec66414bb961b50: Status 404 returned error can't find the container with id fef33d3c7f1855b88b205e58a953db1eba3e6979fcc1430ebec66414bb961b50 Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.892455 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-m2fhf"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.907027 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-568467c8dc-v4vlb"] Mar 18 09:22:55 crc kubenswrapper[4778]: W0318 09:22:55.910515 4778 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20eafe8e_c0b9_4463_bc12_8c0cd0359968.slice/crio-f3040164c08e605fe47fcb403b7a54ffe6495d3760996218e88d3582673d242b WatchSource:0}: Error finding container f3040164c08e605fe47fcb403b7a54ffe6495d3760996218e88d3582673d242b: Status 404 returned error can't find the container with id f3040164c08e605fe47fcb403b7a54ffe6495d3760996218e88d3582673d242b Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.919286 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-zbghp"] Mar 18 09:22:55 crc kubenswrapper[4778]: W0318 09:22:55.925896 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d42905f_c189_4021_834d_f2a81dae5a4a.slice/crio-0ae8af7be7deced401f81be079f01e4b10cb21029c529a4299ba7b28d3003f40 WatchSource:0}: Error finding container 0ae8af7be7deced401f81be079f01e4b10cb21029c529a4299ba7b28d3003f40: Status 404 returned error can't find the container with id 0ae8af7be7deced401f81be079f01e4b10cb21029c529a4299ba7b28d3003f40 Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.928073 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-nz24n"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.935604 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2p9jg"] Mar 18 09:22:55 crc kubenswrapper[4778]: W0318 09:22:55.940292 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf507e196_94ca_4c4a_91f4_de3587084d30.slice/crio-47668775c33692efd81b7bc7a0bfacb000692e6da118d403a58db7fb6562c979 WatchSource:0}: Error finding container 47668775c33692efd81b7bc7a0bfacb000692e6da118d403a58db7fb6562c979: Status 404 returned error can't find the container with id 47668775c33692efd81b7bc7a0bfacb000692e6da118d403a58db7fb6562c979 Mar 18 09:22:55 crc 
kubenswrapper[4778]: I0318 09:22:55.943270 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67f887b5-9vsz7"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.950771 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-98prp"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.969052 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pttzb"] Mar 18 09:22:55 crc kubenswrapper[4778]: I0318 09:22:55.983180 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:22:56 crc kubenswrapper[4778]: I0318 09:22:56.162507 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zbghp" event={"ID":"4d42905f-c189-4021-834d-f2a81dae5a4a","Type":"ContainerStarted","Data":"0ae8af7be7deced401f81be079f01e4b10cb21029c529a4299ba7b28d3003f40"} Mar 18 09:22:56 crc kubenswrapper[4778]: I0318 09:22:56.185100 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67795cd9-m2fhf" event={"ID":"45f70c00-938c-4f67-9c5b-4a88c90b62ae","Type":"ContainerStarted","Data":"2ad9afed41c247bfddfd9b123a9c47b45acee2887be9ed80d8c3249a1334c51d"} Mar 18 09:22:56 crc kubenswrapper[4778]: I0318 09:22:56.206547 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jb4ss" event={"ID":"fba399d9-71ac-41c3-912f-32ccc7fc6190","Type":"ContainerStarted","Data":"fef33d3c7f1855b88b205e58a953db1eba3e6979fcc1430ebec66414bb961b50"} Mar 18 09:22:56 crc kubenswrapper[4778]: I0318 09:22:56.207375 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-568467c8dc-v4vlb" event={"ID":"14a62749-8336-4894-a162-85350096aef4","Type":"ContainerStarted","Data":"72adaf6b6c1593294d8954a516b806ad25d5b5421c025d726c18c87806d6bbc1"} Mar 18 09:22:56 crc kubenswrapper[4778]: I0318 09:22:56.207528 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-db-sync-2p9jg" event={"ID":"20eafe8e-c0b9-4463-bc12-8c0cd0359968","Type":"ContainerStarted","Data":"f3040164c08e605fe47fcb403b7a54ffe6495d3760996218e88d3582673d242b"} Mar 18 09:22:56 crc kubenswrapper[4778]: I0318 09:22:56.207641 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-98prp" event={"ID":"4135fc20-df28-4f8d-b244-aedd5ed57cc2","Type":"ContainerStarted","Data":"49a7b38747f8562cd7a4e6e050cec3d81202329adfd053459a59103fb40b4766"} Mar 18 09:22:56 crc kubenswrapper[4778]: I0318 09:22:56.207746 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" event={"ID":"0ccbef26-8b5f-4b83-885f-3c074207eb73","Type":"ContainerStarted","Data":"198149700e09863ba5824c3da59cdfa277dbef6c16734a23ecc5c937b970d9dc"} Mar 18 09:22:56 crc kubenswrapper[4778]: I0318 09:22:56.207866 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d609db91-e011-4ac4-91a6-a9ba51f3918e","Type":"ContainerStarted","Data":"08fbe88aeb204ddd782e3073f280061d837a707d2c10f9b95b4eb6828823ed41"} Mar 18 09:22:56 crc kubenswrapper[4778]: I0318 09:22:56.207974 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pttzb" event={"ID":"835f3aad-57da-48e7-ac5a-f0635ee9bc98","Type":"ContainerStarted","Data":"5f8d8cdbe6b7f6fcfb01aff3626cd1ee347d0e895f6332948b61152cbec6e222"} Mar 18 09:22:56 crc kubenswrapper[4778]: I0318 09:22:56.208101 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67f887b5-9vsz7" event={"ID":"f507e196-94ca-4c4a-91f4-de3587084d30","Type":"ContainerStarted","Data":"47668775c33692efd81b7bc7a0bfacb000692e6da118d403a58db7fb6562c979"} Mar 18 09:22:56 crc kubenswrapper[4778]: I0318 09:22:56.434657 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7967bcbb45-bl6c8"] Mar 18 09:22:56 crc kubenswrapper[4778]: W0318 09:22:56.460340 4778 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56a6e416_3b49_4f07_a2ac_7fd1a4f58fe2.slice/crio-0cbfa7c707107fef7fba1585919c6375feaacdc7b31da820c5b759e556860c71 WatchSource:0}: Error finding container 0cbfa7c707107fef7fba1585919c6375feaacdc7b31da820c5b759e556860c71: Status 404 returned error can't find the container with id 0cbfa7c707107fef7fba1585919c6375feaacdc7b31da820c5b759e556860c71 Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.217727 4778 generic.go:334] "Generic (PLEG): container finished" podID="0ccbef26-8b5f-4b83-885f-3c074207eb73" containerID="2b8a3d906ce9aeea7f7d488e85e8f58f0a3291f6ef7ed6a10151c8afd47df1d7" exitCode=0 Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.218013 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" event={"ID":"0ccbef26-8b5f-4b83-885f-3c074207eb73","Type":"ContainerDied","Data":"2b8a3d906ce9aeea7f7d488e85e8f58f0a3291f6ef7ed6a10151c8afd47df1d7"} Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.226030 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pttzb" event={"ID":"835f3aad-57da-48e7-ac5a-f0635ee9bc98","Type":"ContainerStarted","Data":"b11383263b8711730419b8679c1b55038dd677778bfced742aebd89591fb64cf"} Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.227936 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7967bcbb45-bl6c8" event={"ID":"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2","Type":"ContainerStarted","Data":"0cbfa7c707107fef7fba1585919c6375feaacdc7b31da820c5b759e556860c71"} Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.231151 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zbghp" event={"ID":"4d42905f-c189-4021-834d-f2a81dae5a4a","Type":"ContainerStarted","Data":"abc212acc9fea22cd31581d6e2bb923603370c1ccdef0851c69537c07eedf089"} Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.252058 
4778 generic.go:334] "Generic (PLEG): container finished" podID="45f70c00-938c-4f67-9c5b-4a88c90b62ae" containerID="31f5aba8b630d38b7c7e2d0b285363c945e3c560fad8314e046d23e6b99ea03e" exitCode=0 Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.252131 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67795cd9-m2fhf" event={"ID":"45f70c00-938c-4f67-9c5b-4a88c90b62ae","Type":"ContainerDied","Data":"31f5aba8b630d38b7c7e2d0b285363c945e3c560fad8314e046d23e6b99ea03e"} Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.277399 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-pttzb" podStartSLOduration=4.277374507 podStartE2EDuration="4.277374507s" podCreationTimestamp="2026-03-18 09:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:22:57.265393901 +0000 UTC m=+1243.840138751" watchObservedRunningTime="2026-03-18 09:22:57.277374507 +0000 UTC m=+1243.852119347" Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.283157 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-zbghp" podStartSLOduration=4.283144124 podStartE2EDuration="4.283144124s" podCreationTimestamp="2026-03-18 09:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:22:57.281551082 +0000 UTC m=+1243.856295932" watchObservedRunningTime="2026-03-18 09:22:57.283144124 +0000 UTC m=+1243.857888964" Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.801029 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.826439 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-config\") pod \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.826532 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-dns-svc\") pod \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.827070 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npsqv\" (UniqueName: \"kubernetes.io/projected/45f70c00-938c-4f67-9c5b-4a88c90b62ae-kube-api-access-npsqv\") pod \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.827163 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-ovsdbserver-nb\") pod \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.851846 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-config" (OuterVolumeSpecName: "config") pod "45f70c00-938c-4f67-9c5b-4a88c90b62ae" (UID: "45f70c00-938c-4f67-9c5b-4a88c90b62ae"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.860031 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "45f70c00-938c-4f67-9c5b-4a88c90b62ae" (UID: "45f70c00-938c-4f67-9c5b-4a88c90b62ae"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.861521 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "45f70c00-938c-4f67-9c5b-4a88c90b62ae" (UID: "45f70c00-938c-4f67-9c5b-4a88c90b62ae"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.863221 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45f70c00-938c-4f67-9c5b-4a88c90b62ae-kube-api-access-npsqv" (OuterVolumeSpecName: "kube-api-access-npsqv") pod "45f70c00-938c-4f67-9c5b-4a88c90b62ae" (UID: "45f70c00-938c-4f67-9c5b-4a88c90b62ae"). InnerVolumeSpecName "kube-api-access-npsqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.929341 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-ovsdbserver-sb\") pod \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\" (UID: \"45f70c00-938c-4f67-9c5b-4a88c90b62ae\") " Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.929818 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.929830 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.929839 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npsqv\" (UniqueName: \"kubernetes.io/projected/45f70c00-938c-4f67-9c5b-4a88c90b62ae-kube-api-access-npsqv\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.929849 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:57 crc kubenswrapper[4778]: I0318 09:22:57.958482 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "45f70c00-938c-4f67-9c5b-4a88c90b62ae" (UID: "45f70c00-938c-4f67-9c5b-4a88c90b62ae"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:22:58 crc kubenswrapper[4778]: I0318 09:22:58.033058 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45f70c00-938c-4f67-9c5b-4a88c90b62ae-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:58 crc kubenswrapper[4778]: I0318 09:22:58.262155 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" event={"ID":"0ccbef26-8b5f-4b83-885f-3c074207eb73","Type":"ContainerStarted","Data":"b36c483c2b509d6056dce716a1f9836c549822a2372786d1c313cd9da431b608"} Mar 18 09:22:58 crc kubenswrapper[4778]: I0318 09:22:58.263138 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:22:58 crc kubenswrapper[4778]: I0318 09:22:58.264737 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67795cd9-m2fhf" event={"ID":"45f70c00-938c-4f67-9c5b-4a88c90b62ae","Type":"ContainerDied","Data":"2ad9afed41c247bfddfd9b123a9c47b45acee2887be9ed80d8c3249a1334c51d"} Mar 18 09:22:58 crc kubenswrapper[4778]: I0318 09:22:58.264770 4778 scope.go:117] "RemoveContainer" containerID="31f5aba8b630d38b7c7e2d0b285363c945e3c560fad8314e046d23e6b99ea03e" Mar 18 09:22:58 crc kubenswrapper[4778]: I0318 09:22:58.264799 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67795cd9-m2fhf" Mar 18 09:22:58 crc kubenswrapper[4778]: I0318 09:22:58.285995 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" podStartSLOduration=5.28596118 podStartE2EDuration="5.28596118s" podCreationTimestamp="2026-03-18 09:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:22:58.282356022 +0000 UTC m=+1244.857100872" watchObservedRunningTime="2026-03-18 09:22:58.28596118 +0000 UTC m=+1244.860706030" Mar 18 09:22:58 crc kubenswrapper[4778]: I0318 09:22:58.359588 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-m2fhf"] Mar 18 09:22:58 crc kubenswrapper[4778]: I0318 09:22:58.396593 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-m2fhf"] Mar 18 09:23:00 crc kubenswrapper[4778]: I0318 09:23:00.147809 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:23:00 crc kubenswrapper[4778]: I0318 09:23:00.148150 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:23:00 crc kubenswrapper[4778]: I0318 09:23:00.148211 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:23:00 crc kubenswrapper[4778]: I0318 09:23:00.148872 4778 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7154b13c7f4cd2402d0304ed7b86d22cac3e7b544f6222d5ad0d8d8ac0463fe2"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:23:00 crc kubenswrapper[4778]: I0318 09:23:00.148924 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://7154b13c7f4cd2402d0304ed7b86d22cac3e7b544f6222d5ad0d8d8ac0463fe2" gracePeriod=600 Mar 18 09:23:00 crc kubenswrapper[4778]: I0318 09:23:00.196557 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45f70c00-938c-4f67-9c5b-4a88c90b62ae" path="/var/lib/kubelet/pods/45f70c00-938c-4f67-9c5b-4a88c90b62ae/volumes" Mar 18 09:23:00 crc kubenswrapper[4778]: I0318 09:23:00.290909 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="7154b13c7f4cd2402d0304ed7b86d22cac3e7b544f6222d5ad0d8d8ac0463fe2" exitCode=0 Mar 18 09:23:00 crc kubenswrapper[4778]: I0318 09:23:00.291005 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"7154b13c7f4cd2402d0304ed7b86d22cac3e7b544f6222d5ad0d8d8ac0463fe2"} Mar 18 09:23:00 crc kubenswrapper[4778]: I0318 09:23:00.291072 4778 scope.go:117] "RemoveContainer" containerID="ea61f21841c4e8176f6c189293fbb254c950982318bec3d46909c1b0e41c2f38" Mar 18 09:23:01 crc kubenswrapper[4778]: I0318 09:23:01.305501 4778 generic.go:334] "Generic (PLEG): container finished" podID="835f3aad-57da-48e7-ac5a-f0635ee9bc98" 
containerID="b11383263b8711730419b8679c1b55038dd677778bfced742aebd89591fb64cf" exitCode=0 Mar 18 09:23:01 crc kubenswrapper[4778]: I0318 09:23:01.305943 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pttzb" event={"ID":"835f3aad-57da-48e7-ac5a-f0635ee9bc98","Type":"ContainerDied","Data":"b11383263b8711730419b8679c1b55038dd677778bfced742aebd89591fb64cf"} Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.342032 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67f887b5-9vsz7"] Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.409008 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-99c8bfc86-rldfg"] Mar 18 09:23:02 crc kubenswrapper[4778]: E0318 09:23:02.409430 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f70c00-938c-4f67-9c5b-4a88c90b62ae" containerName="init" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.409448 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f70c00-938c-4f67-9c5b-4a88c90b62ae" containerName="init" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.409622 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="45f70c00-938c-4f67-9c5b-4a88c90b62ae" containerName="init" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.410559 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.416242 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.418838 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-99c8bfc86-rldfg"] Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.498034 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7967bcbb45-bl6c8"] Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.524308 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea1f8a48-d595-4f4e-a740-0af5a26397a5-config-data\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.524350 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea1f8a48-d595-4f4e-a740-0af5a26397a5-logs\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.524400 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-horizon-secret-key\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.524434 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-horizon-tls-certs\") 
pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.524550 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-combined-ca-bundle\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.524594 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpxvx\" (UniqueName: \"kubernetes.io/projected/ea1f8a48-d595-4f4e-a740-0af5a26397a5-kube-api-access-wpxvx\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.524632 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea1f8a48-d595-4f4e-a740-0af5a26397a5-scripts\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.538260 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-644f48df4-b7jhq"] Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.539695 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.556781 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-644f48df4-b7jhq"] Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.626638 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-horizon-tls-certs\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.627475 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0a0a638-c445-4931-861e-d35704487c97-logs\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.627522 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-combined-ca-bundle\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.627540 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpxvx\" (UniqueName: \"kubernetes.io/projected/ea1f8a48-d595-4f4e-a740-0af5a26397a5-kube-api-access-wpxvx\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.627557 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/e0a0a638-c445-4931-861e-d35704487c97-config-data\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.627577 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea1f8a48-d595-4f4e-a740-0af5a26397a5-scripts\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.628344 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea1f8a48-d595-4f4e-a740-0af5a26397a5-scripts\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.628411 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a0a638-c445-4931-861e-d35704487c97-combined-ca-bundle\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.628517 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a0a638-c445-4931-861e-d35704487c97-horizon-tls-certs\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.628620 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/e0a0a638-c445-4931-861e-d35704487c97-horizon-secret-key\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.628758 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea1f8a48-d595-4f4e-a740-0af5a26397a5-config-data\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.628792 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dx2q\" (UniqueName: \"kubernetes.io/projected/e0a0a638-c445-4931-861e-d35704487c97-kube-api-access-5dx2q\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.628830 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea1f8a48-d595-4f4e-a740-0af5a26397a5-logs\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.628915 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0a0a638-c445-4931-861e-d35704487c97-scripts\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.629017 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-horizon-secret-key\") pod 
\"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.630030 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea1f8a48-d595-4f4e-a740-0af5a26397a5-logs\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.630111 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea1f8a48-d595-4f4e-a740-0af5a26397a5-config-data\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.649275 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-combined-ca-bundle\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.649377 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-horizon-secret-key\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.649476 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-horizon-tls-certs\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc 
kubenswrapper[4778]: I0318 09:23:02.652452 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpxvx\" (UniqueName: \"kubernetes.io/projected/ea1f8a48-d595-4f4e-a740-0af5a26397a5-kube-api-access-wpxvx\") pod \"horizon-99c8bfc86-rldfg\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.731033 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a0a638-c445-4931-861e-d35704487c97-horizon-tls-certs\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.731120 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e0a0a638-c445-4931-861e-d35704487c97-horizon-secret-key\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.731223 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dx2q\" (UniqueName: \"kubernetes.io/projected/e0a0a638-c445-4931-861e-d35704487c97-kube-api-access-5dx2q\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.731268 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0a0a638-c445-4931-861e-d35704487c97-scripts\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.731344 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0a0a638-c445-4931-861e-d35704487c97-logs\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.731398 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0a0a638-c445-4931-861e-d35704487c97-config-data\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.731433 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a0a638-c445-4931-861e-d35704487c97-combined-ca-bundle\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.731768 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0a0a638-c445-4931-861e-d35704487c97-logs\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.733181 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0a0a638-c445-4931-861e-d35704487c97-scripts\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.734074 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0a0a638-c445-4931-861e-d35704487c97-config-data\") pod 
\"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.736664 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a0a638-c445-4931-861e-d35704487c97-combined-ca-bundle\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.737989 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e0a0a638-c445-4931-861e-d35704487c97-horizon-secret-key\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.739511 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a0a638-c445-4931-861e-d35704487c97-horizon-tls-certs\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.755026 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dx2q\" (UniqueName: \"kubernetes.io/projected/e0a0a638-c445-4931-861e-d35704487c97-kube-api-access-5dx2q\") pod \"horizon-644f48df4-b7jhq\" (UID: \"e0a0a638-c445-4931-861e-d35704487c97\") " pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.782160 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:02 crc kubenswrapper[4778]: I0318 09:23:02.858425 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:04 crc kubenswrapper[4778]: I0318 09:23:04.161397 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:23:04 crc kubenswrapper[4778]: I0318 09:23:04.248026 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-npjfc"] Mar 18 09:23:04 crc kubenswrapper[4778]: I0318 09:23:04.252168 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" podUID="4a490b75-6853-41f7-b5b3-46243c4c2166" containerName="dnsmasq-dns" containerID="cri-o://28ca12d3a58a03f2fd89d9677d86730c400a702b9f62ca98b238e2b7a92046fd" gracePeriod=10 Mar 18 09:23:05 crc kubenswrapper[4778]: I0318 09:23:05.352386 4778 generic.go:334] "Generic (PLEG): container finished" podID="4a490b75-6853-41f7-b5b3-46243c4c2166" containerID="28ca12d3a58a03f2fd89d9677d86730c400a702b9f62ca98b238e2b7a92046fd" exitCode=0 Mar 18 09:23:05 crc kubenswrapper[4778]: I0318 09:23:05.352458 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" event={"ID":"4a490b75-6853-41f7-b5b3-46243c4c2166","Type":"ContainerDied","Data":"28ca12d3a58a03f2fd89d9677d86730c400a702b9f62ca98b238e2b7a92046fd"} Mar 18 09:23:08 crc kubenswrapper[4778]: I0318 09:23:08.561582 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" podUID="4a490b75-6853-41f7-b5b3-46243c4c2166" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.080864 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.170004 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-combined-ca-bundle\") pod \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.171264 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9lph\" (UniqueName: \"kubernetes.io/projected/835f3aad-57da-48e7-ac5a-f0635ee9bc98-kube-api-access-x9lph\") pod \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.171423 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-fernet-keys\") pod \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.171487 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-scripts\") pod \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.171575 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-config-data\") pod \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.171722 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-credential-keys\") pod \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\" (UID: \"835f3aad-57da-48e7-ac5a-f0635ee9bc98\") " Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.181753 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-scripts" (OuterVolumeSpecName: "scripts") pod "835f3aad-57da-48e7-ac5a-f0635ee9bc98" (UID: "835f3aad-57da-48e7-ac5a-f0635ee9bc98"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.182050 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/835f3aad-57da-48e7-ac5a-f0635ee9bc98-kube-api-access-x9lph" (OuterVolumeSpecName: "kube-api-access-x9lph") pod "835f3aad-57da-48e7-ac5a-f0635ee9bc98" (UID: "835f3aad-57da-48e7-ac5a-f0635ee9bc98"). InnerVolumeSpecName "kube-api-access-x9lph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.182131 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "835f3aad-57da-48e7-ac5a-f0635ee9bc98" (UID: "835f3aad-57da-48e7-ac5a-f0635ee9bc98"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.186686 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "835f3aad-57da-48e7-ac5a-f0635ee9bc98" (UID: "835f3aad-57da-48e7-ac5a-f0635ee9bc98"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.199750 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "835f3aad-57da-48e7-ac5a-f0635ee9bc98" (UID: "835f3aad-57da-48e7-ac5a-f0635ee9bc98"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.205775 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-config-data" (OuterVolumeSpecName: "config-data") pod "835f3aad-57da-48e7-ac5a-f0635ee9bc98" (UID: "835f3aad-57da-48e7-ac5a-f0635ee9bc98"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.274042 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.274085 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.274096 4778 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.274109 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:09 crc 
kubenswrapper[4778]: I0318 09:23:09.274119 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9lph\" (UniqueName: \"kubernetes.io/projected/835f3aad-57da-48e7-ac5a-f0635ee9bc98-kube-api-access-x9lph\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.274131 4778 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/835f3aad-57da-48e7-ac5a-f0635ee9bc98-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.923242 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pttzb" event={"ID":"835f3aad-57da-48e7-ac5a-f0635ee9bc98","Type":"ContainerDied","Data":"5f8d8cdbe6b7f6fcfb01aff3626cd1ee347d0e895f6332948b61152cbec6e222"} Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.923292 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f8d8cdbe6b7f6fcfb01aff3626cd1ee347d0e895f6332948b61152cbec6e222" Mar 18 09:23:09 crc kubenswrapper[4778]: I0318 09:23:09.923376 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pttzb" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.162919 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-pttzb"] Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.174486 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-pttzb"] Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.203959 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="835f3aad-57da-48e7-ac5a-f0635ee9bc98" path="/var/lib/kubelet/pods/835f3aad-57da-48e7-ac5a-f0635ee9bc98/volumes" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.266934 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-5drhw"] Mar 18 09:23:10 crc kubenswrapper[4778]: E0318 09:23:10.267442 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="835f3aad-57da-48e7-ac5a-f0635ee9bc98" containerName="keystone-bootstrap" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.267469 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="835f3aad-57da-48e7-ac5a-f0635ee9bc98" containerName="keystone-bootstrap" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.267700 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="835f3aad-57da-48e7-ac5a-f0635ee9bc98" containerName="keystone-bootstrap" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.268286 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.272531 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.272852 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.273151 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.273396 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.273516 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vcgh4" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.281536 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5drhw"] Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.300303 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-scripts\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.300368 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-config-data\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.300388 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-combined-ca-bundle\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.300475 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfjxd\" (UniqueName: \"kubernetes.io/projected/0fb26926-fc81-4024-a0fa-2363d8703d72-kube-api-access-cfjxd\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.300513 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-fernet-keys\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.300547 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-credential-keys\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.404080 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-config-data\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.404125 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-combined-ca-bundle\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.404172 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfjxd\" (UniqueName: \"kubernetes.io/projected/0fb26926-fc81-4024-a0fa-2363d8703d72-kube-api-access-cfjxd\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.404214 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-fernet-keys\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.404242 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-credential-keys\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.404311 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-scripts\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.410743 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-config-data\") pod \"keystone-bootstrap-5drhw\" (UID: 
\"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.411634 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-scripts\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.414356 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-combined-ca-bundle\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.418636 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-credential-keys\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.420936 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-fernet-keys\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 09:23:10.425369 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfjxd\" (UniqueName: \"kubernetes.io/projected/0fb26926-fc81-4024-a0fa-2363d8703d72-kube-api-access-cfjxd\") pod \"keystone-bootstrap-5drhw\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: I0318 
09:23:10.594612 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:10 crc kubenswrapper[4778]: E0318 09:23:10.960185 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Mar 18 09:23:10 crc kubenswrapper[4778]: E0318 09:23:10.960597 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s7f7g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},
},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-2p9jg_openstack(20eafe8e-c0b9-4463-bc12-8c0cd0359968): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:23:10 crc kubenswrapper[4778]: E0318 09:23:10.963445 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-2p9jg" podUID="20eafe8e-c0b9-4463-bc12-8c0cd0359968" Mar 18 09:23:10 crc kubenswrapper[4778]: E0318 09:23:10.970550 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 18 09:23:10 crc kubenswrapper[4778]: E0318 09:23:10.970749 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n685h65h586h67fh5d8h674h65bh54h5c7h7dh9fh659h7ch658h5b7h8bh684h66fh5dh74h68ch645h95h65dh5d6h67ch67h95h5f4h5c5h56fh56cq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-psmpb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-568467c8dc-v4vlb_openstack(14a62749-8336-4894-a162-85350096aef4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:23:10 crc kubenswrapper[4778]: E0318 
09:23:10.973051 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-568467c8dc-v4vlb" podUID="14a62749-8336-4894-a162-85350096aef4" Mar 18 09:23:10 crc kubenswrapper[4778]: E0318 09:23:10.992919 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 18 09:23:10 crc kubenswrapper[4778]: E0318 09:23:10.993129 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n9dh669h688h8h5ddh697hc4h594h644h598h5b7h574h545h89h7chfdh87hf4h579hd5h5b9h688h597h5f4h554h64dh5d9h66bhf6h7fh579h587q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gzl9p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-67f887b5-9vsz7_openstack(f507e196-94ca-4c4a-91f4-de3587084d30): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:23:10 crc kubenswrapper[4778]: E0318 
09:23:10.996264 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-67f887b5-9vsz7" podUID="f507e196-94ca-4c4a-91f4-de3587084d30" Mar 18 09:23:11 crc kubenswrapper[4778]: E0318 09:23:11.944397 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-2p9jg" podUID="20eafe8e-c0b9-4463-bc12-8c0cd0359968" Mar 18 09:23:13 crc kubenswrapper[4778]: I0318 09:23:13.562154 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" podUID="4a490b75-6853-41f7-b5b3-46243c4c2166" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Mar 18 09:23:13 crc kubenswrapper[4778]: E0318 09:23:13.737362 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 18 09:23:13 crc kubenswrapper[4778]: E0318 09:23:13.737553 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7gnxr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-98prp_openstack(4135fc20-df28-4f8d-b244-aedd5ed57cc2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:23:13 crc kubenswrapper[4778]: E0318 09:23:13.738735 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-98prp" 
podUID="4135fc20-df28-4f8d-b244-aedd5ed57cc2" Mar 18 09:23:13 crc kubenswrapper[4778]: E0318 09:23:13.964354 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-98prp" podUID="4135fc20-df28-4f8d-b244-aedd5ed57cc2" Mar 18 09:23:15 crc kubenswrapper[4778]: I0318 09:23:15.981803 4778 generic.go:334] "Generic (PLEG): container finished" podID="4d42905f-c189-4021-834d-f2a81dae5a4a" containerID="abc212acc9fea22cd31581d6e2bb923603370c1ccdef0851c69537c07eedf089" exitCode=0 Mar 18 09:23:15 crc kubenswrapper[4778]: I0318 09:23:15.981976 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zbghp" event={"ID":"4d42905f-c189-4021-834d-f2a81dae5a4a","Type":"ContainerDied","Data":"abc212acc9fea22cd31581d6e2bb923603370c1ccdef0851c69537c07eedf089"} Mar 18 09:23:18 crc kubenswrapper[4778]: I0318 09:23:18.562298 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" podUID="4a490b75-6853-41f7-b5b3-46243c4c2166" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Mar 18 09:23:18 crc kubenswrapper[4778]: I0318 09:23:18.563015 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.427470 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.438802 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.448616 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-zbghp" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.488686 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14a62749-8336-4894-a162-85350096aef4-scripts\") pod \"14a62749-8336-4894-a162-85350096aef4\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.488798 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a62749-8336-4894-a162-85350096aef4-logs\") pod \"14a62749-8336-4894-a162-85350096aef4\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.488825 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/14a62749-8336-4894-a162-85350096aef4-horizon-secret-key\") pod \"14a62749-8336-4894-a162-85350096aef4\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.488872 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psmpb\" (UniqueName: \"kubernetes.io/projected/14a62749-8336-4894-a162-85350096aef4-kube-api-access-psmpb\") pod \"14a62749-8336-4894-a162-85350096aef4\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.488896 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14a62749-8336-4894-a162-85350096aef4-config-data\") pod \"14a62749-8336-4894-a162-85350096aef4\" (UID: \"14a62749-8336-4894-a162-85350096aef4\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.490093 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/14a62749-8336-4894-a162-85350096aef4-config-data" (OuterVolumeSpecName: "config-data") pod "14a62749-8336-4894-a162-85350096aef4" (UID: "14a62749-8336-4894-a162-85350096aef4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.490243 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14a62749-8336-4894-a162-85350096aef4-scripts" (OuterVolumeSpecName: "scripts") pod "14a62749-8336-4894-a162-85350096aef4" (UID: "14a62749-8336-4894-a162-85350096aef4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.490890 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14a62749-8336-4894-a162-85350096aef4-logs" (OuterVolumeSpecName: "logs") pod "14a62749-8336-4894-a162-85350096aef4" (UID: "14a62749-8336-4894-a162-85350096aef4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.496962 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a62749-8336-4894-a162-85350096aef4-kube-api-access-psmpb" (OuterVolumeSpecName: "kube-api-access-psmpb") pod "14a62749-8336-4894-a162-85350096aef4" (UID: "14a62749-8336-4894-a162-85350096aef4"). InnerVolumeSpecName "kube-api-access-psmpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.497930 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a62749-8336-4894-a162-85350096aef4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "14a62749-8336-4894-a162-85350096aef4" (UID: "14a62749-8336-4894-a162-85350096aef4"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.590762 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f507e196-94ca-4c4a-91f4-de3587084d30-horizon-secret-key\") pod \"f507e196-94ca-4c4a-91f4-de3587084d30\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.591003 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d42905f-c189-4021-834d-f2a81dae5a4a-config\") pod \"4d42905f-c189-4021-834d-f2a81dae5a4a\" (UID: \"4d42905f-c189-4021-834d-f2a81dae5a4a\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.591038 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f507e196-94ca-4c4a-91f4-de3587084d30-scripts\") pod \"f507e196-94ca-4c4a-91f4-de3587084d30\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.591094 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzl9p\" (UniqueName: \"kubernetes.io/projected/f507e196-94ca-4c4a-91f4-de3587084d30-kube-api-access-gzl9p\") pod \"f507e196-94ca-4c4a-91f4-de3587084d30\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.591272 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d42905f-c189-4021-834d-f2a81dae5a4a-combined-ca-bundle\") pod \"4d42905f-c189-4021-834d-f2a81dae5a4a\" (UID: \"4d42905f-c189-4021-834d-f2a81dae5a4a\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.591304 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcdfb\" (UniqueName: 
\"kubernetes.io/projected/4d42905f-c189-4021-834d-f2a81dae5a4a-kube-api-access-lcdfb\") pod \"4d42905f-c189-4021-834d-f2a81dae5a4a\" (UID: \"4d42905f-c189-4021-834d-f2a81dae5a4a\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.591328 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f507e196-94ca-4c4a-91f4-de3587084d30-config-data\") pod \"f507e196-94ca-4c4a-91f4-de3587084d30\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.591398 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f507e196-94ca-4c4a-91f4-de3587084d30-logs\") pod \"f507e196-94ca-4c4a-91f4-de3587084d30\" (UID: \"f507e196-94ca-4c4a-91f4-de3587084d30\") " Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.591920 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14a62749-8336-4894-a162-85350096aef4-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.591946 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a62749-8336-4894-a162-85350096aef4-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.591957 4778 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/14a62749-8336-4894-a162-85350096aef4-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.591972 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psmpb\" (UniqueName: \"kubernetes.io/projected/14a62749-8336-4894-a162-85350096aef4-kube-api-access-psmpb\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.591982 4778 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14a62749-8336-4894-a162-85350096aef4-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.593015 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f507e196-94ca-4c4a-91f4-de3587084d30-logs" (OuterVolumeSpecName: "logs") pod "f507e196-94ca-4c4a-91f4-de3587084d30" (UID: "f507e196-94ca-4c4a-91f4-de3587084d30"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.593415 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f507e196-94ca-4c4a-91f4-de3587084d30-scripts" (OuterVolumeSpecName: "scripts") pod "f507e196-94ca-4c4a-91f4-de3587084d30" (UID: "f507e196-94ca-4c4a-91f4-de3587084d30"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.597295 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f507e196-94ca-4c4a-91f4-de3587084d30-kube-api-access-gzl9p" (OuterVolumeSpecName: "kube-api-access-gzl9p") pod "f507e196-94ca-4c4a-91f4-de3587084d30" (UID: "f507e196-94ca-4c4a-91f4-de3587084d30"). InnerVolumeSpecName "kube-api-access-gzl9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.597642 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f507e196-94ca-4c4a-91f4-de3587084d30-config-data" (OuterVolumeSpecName: "config-data") pod "f507e196-94ca-4c4a-91f4-de3587084d30" (UID: "f507e196-94ca-4c4a-91f4-de3587084d30"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.597877 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d42905f-c189-4021-834d-f2a81dae5a4a-kube-api-access-lcdfb" (OuterVolumeSpecName: "kube-api-access-lcdfb") pod "4d42905f-c189-4021-834d-f2a81dae5a4a" (UID: "4d42905f-c189-4021-834d-f2a81dae5a4a"). InnerVolumeSpecName "kube-api-access-lcdfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.599448 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f507e196-94ca-4c4a-91f4-de3587084d30-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f507e196-94ca-4c4a-91f4-de3587084d30" (UID: "f507e196-94ca-4c4a-91f4-de3587084d30"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.619142 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d42905f-c189-4021-834d-f2a81dae5a4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d42905f-c189-4021-834d-f2a81dae5a4a" (UID: "4d42905f-c189-4021-834d-f2a81dae5a4a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.620275 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d42905f-c189-4021-834d-f2a81dae5a4a-config" (OuterVolumeSpecName: "config") pod "4d42905f-c189-4021-834d-f2a81dae5a4a" (UID: "4d42905f-c189-4021-834d-f2a81dae5a4a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.694243 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d42905f-c189-4021-834d-f2a81dae5a4a-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.694310 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f507e196-94ca-4c4a-91f4-de3587084d30-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.694324 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzl9p\" (UniqueName: \"kubernetes.io/projected/f507e196-94ca-4c4a-91f4-de3587084d30-kube-api-access-gzl9p\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.694339 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d42905f-c189-4021-834d-f2a81dae5a4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.694350 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcdfb\" (UniqueName: \"kubernetes.io/projected/4d42905f-c189-4021-834d-f2a81dae5a4a-kube-api-access-lcdfb\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.694358 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f507e196-94ca-4c4a-91f4-de3587084d30-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.694366 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f507e196-94ca-4c4a-91f4-de3587084d30-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:22 crc kubenswrapper[4778]: I0318 09:23:22.694377 4778 reconciler_common.go:293] 
"Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f507e196-94ca-4c4a-91f4-de3587084d30-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.054931 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-568467c8dc-v4vlb" event={"ID":"14a62749-8336-4894-a162-85350096aef4","Type":"ContainerDied","Data":"72adaf6b6c1593294d8954a516b806ad25d5b5421c025d726c18c87806d6bbc1"} Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.054959 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-568467c8dc-v4vlb" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.060721 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67f887b5-9vsz7" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.060734 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67f887b5-9vsz7" event={"ID":"f507e196-94ca-4c4a-91f4-de3587084d30","Type":"ContainerDied","Data":"47668775c33692efd81b7bc7a0bfacb000692e6da118d403a58db7fb6562c979"} Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.070605 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zbghp" event={"ID":"4d42905f-c189-4021-834d-f2a81dae5a4a","Type":"ContainerDied","Data":"0ae8af7be7deced401f81be079f01e4b10cb21029c529a4299ba7b28d3003f40"} Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.070647 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ae8af7be7deced401f81be079f01e4b10cb21029c529a4299ba7b28d3003f40" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.070691 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-zbghp" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.157873 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-568467c8dc-v4vlb"] Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.167065 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-568467c8dc-v4vlb"] Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.187513 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67f887b5-9vsz7"] Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.194569 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-67f887b5-9vsz7"] Mar 18 09:23:23 crc kubenswrapper[4778]: E0318 09:23:23.625525 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 18 09:23:23 crc kubenswrapper[4778]: E0318 09:23:23.626141 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c9z5l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-jb4ss_openstack(fba399d9-71ac-41c3-912f-32ccc7fc6190): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:23:23 crc kubenswrapper[4778]: E0318 09:23:23.627355 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-jb4ss" podUID="fba399d9-71ac-41c3-912f-32ccc7fc6190" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.789744 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-p2knr"] Mar 18 09:23:23 crc kubenswrapper[4778]: E0318 09:23:23.790692 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d42905f-c189-4021-834d-f2a81dae5a4a" containerName="neutron-db-sync" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.790709 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d42905f-c189-4021-834d-f2a81dae5a4a" containerName="neutron-db-sync" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.790915 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d42905f-c189-4021-834d-f2a81dae5a4a" containerName="neutron-db-sync" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.791851 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.815438 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.867701 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-p2knr"] Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.910288 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-cffc84f44-vtx7x"] Mar 18 09:23:23 crc kubenswrapper[4778]: E0318 09:23:23.910762 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a490b75-6853-41f7-b5b3-46243c4c2166" containerName="dnsmasq-dns" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.910776 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a490b75-6853-41f7-b5b3-46243c4c2166" containerName="dnsmasq-dns" Mar 18 09:23:23 crc kubenswrapper[4778]: E0318 09:23:23.910786 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a490b75-6853-41f7-b5b3-46243c4c2166" containerName="init" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.910792 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a490b75-6853-41f7-b5b3-46243c4c2166" containerName="init" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.911004 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a490b75-6853-41f7-b5b3-46243c4c2166" containerName="dnsmasq-dns" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.911901 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.915260 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.915787 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.916035 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-tvhb7" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.920740 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.927267 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cffc84f44-vtx7x"] Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.928316 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-ovsdbserver-sb\") pod \"4a490b75-6853-41f7-b5b3-46243c4c2166\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.928385 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-config\") pod \"4a490b75-6853-41f7-b5b3-46243c4c2166\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.928506 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-dns-svc\") pod \"4a490b75-6853-41f7-b5b3-46243c4c2166\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.928564 4778 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-ovsdbserver-nb\") pod \"4a490b75-6853-41f7-b5b3-46243c4c2166\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.928717 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68vb9\" (UniqueName: \"kubernetes.io/projected/4a490b75-6853-41f7-b5b3-46243c4c2166-kube-api-access-68vb9\") pod \"4a490b75-6853-41f7-b5b3-46243c4c2166\" (UID: \"4a490b75-6853-41f7-b5b3-46243c4c2166\") " Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.932864 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-ovsdbserver-nb\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.932918 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-ovsdbserver-sb\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.933016 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-dns-svc\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.933041 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-config\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.933081 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mdv2\" (UniqueName: \"kubernetes.io/projected/ae2c97b2-c699-443a-b3b3-ecb22de258c2-kube-api-access-8mdv2\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:23 crc kubenswrapper[4778]: I0318 09:23:23.937424 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a490b75-6853-41f7-b5b3-46243c4c2166-kube-api-access-68vb9" (OuterVolumeSpecName: "kube-api-access-68vb9") pod "4a490b75-6853-41f7-b5b3-46243c4c2166" (UID: "4a490b75-6853-41f7-b5b3-46243c4c2166"). InnerVolumeSpecName "kube-api-access-68vb9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.034271 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-ovndb-tls-certs\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.034328 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-dns-svc\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.034354 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-config\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.034373 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-httpd-config\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.034402 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mdv2\" (UniqueName: \"kubernetes.io/projected/ae2c97b2-c699-443a-b3b3-ecb22de258c2-kube-api-access-8mdv2\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 
18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.034457 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-ovsdbserver-nb\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.034491 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-ovsdbserver-sb\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.034510 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-combined-ca-bundle\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.034544 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-config\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.034593 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnk27\" (UniqueName: \"kubernetes.io/projected/3d1d399f-3c89-4aaa-bba1-ce1a91358455-kube-api-access-qnk27\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc 
kubenswrapper[4778]: I0318 09:23:24.034632 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68vb9\" (UniqueName: \"kubernetes.io/projected/4a490b75-6853-41f7-b5b3-46243c4c2166-kube-api-access-68vb9\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.036112 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-dns-svc\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.037028 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-config\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.037062 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-ovsdbserver-nb\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.037082 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-ovsdbserver-sb\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.062067 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mdv2\" (UniqueName: 
\"kubernetes.io/projected/ae2c97b2-c699-443a-b3b3-ecb22de258c2-kube-api-access-8mdv2\") pod \"dnsmasq-dns-5f66db59b9-p2knr\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.082307 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7967bcbb45-bl6c8" event={"ID":"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2","Type":"ContainerStarted","Data":"6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a"} Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.084786 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.085449 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" event={"ID":"4a490b75-6853-41f7-b5b3-46243c4c2166","Type":"ContainerDied","Data":"5ee3b13ece4d9acc78176c498a90576572d6699436ce526238a6b4027ba90016"} Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.085503 4778 scope.go:117] "RemoveContainer" containerID="28ca12d3a58a03f2fd89d9677d86730c400a702b9f62ca98b238e2b7a92046fd" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.091545 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"84af870879787d3b66c88494739078983637b86ff0a5bba3998c98c4487f8853"} Mar 18 09:23:24 crc kubenswrapper[4778]: E0318 09:23:24.097572 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-jb4ss" podUID="fba399d9-71ac-41c3-912f-32ccc7fc6190" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 
09:23:24.110831 4778 scope.go:117] "RemoveContainer" containerID="434cdf2be80022d070ec54085d25f3459b4f9eebfed357f2ab22cbeac663278b" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.136993 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-combined-ca-bundle\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.137058 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-config\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.137120 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnk27\" (UniqueName: \"kubernetes.io/projected/3d1d399f-3c89-4aaa-bba1-ce1a91358455-kube-api-access-qnk27\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.137142 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-ovndb-tls-certs\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.137173 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-httpd-config\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " 
pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.147827 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-ovndb-tls-certs\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.153596 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-combined-ca-bundle\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.156344 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-httpd-config\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.176539 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5drhw"] Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.179148 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnk27\" (UniqueName: \"kubernetes.io/projected/3d1d399f-3c89-4aaa-bba1-ce1a91358455-kube-api-access-qnk27\") pod \"neutron-cffc84f44-vtx7x\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.179515 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-config\") pod \"neutron-cffc84f44-vtx7x\" (UID: 
\"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.190940 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.206658 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4a490b75-6853-41f7-b5b3-46243c4c2166" (UID: "4a490b75-6853-41f7-b5b3-46243c4c2166"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.207171 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4a490b75-6853-41f7-b5b3-46243c4c2166" (UID: "4a490b75-6853-41f7-b5b3-46243c4c2166"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.210377 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4a490b75-6853-41f7-b5b3-46243c4c2166" (UID: "4a490b75-6853-41f7-b5b3-46243c4c2166"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.217136 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.219139 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-config" (OuterVolumeSpecName: "config") pod "4a490b75-6853-41f7-b5b3-46243c4c2166" (UID: "4a490b75-6853-41f7-b5b3-46243c4c2166"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.227655 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14a62749-8336-4894-a162-85350096aef4" path="/var/lib/kubelet/pods/14a62749-8336-4894-a162-85350096aef4/volumes" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.228323 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f507e196-94ca-4c4a-91f4-de3587084d30" path="/var/lib/kubelet/pods/f507e196-94ca-4c4a-91f4-de3587084d30/volumes" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.238251 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.239605 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.239636 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.239651 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.239663 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a490b75-6853-41f7-b5b3-46243c4c2166-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.267393 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-644f48df4-b7jhq"] Mar 18 09:23:24 crc kubenswrapper[4778]: W0318 09:23:24.288129 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0a0a638_c445_4931_861e_d35704487c97.slice/crio-0fbce59dc7e1c7d23f86e5641e55e03088590e21a72f00c1960f4c85cfb15726 WatchSource:0}: Error finding container 0fbce59dc7e1c7d23f86e5641e55e03088590e21a72f00c1960f4c85cfb15726: Status 404 returned error can't find the container with id 0fbce59dc7e1c7d23f86e5641e55e03088590e21a72f00c1960f4c85cfb15726 Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.385438 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-99c8bfc86-rldfg"] Mar 18 09:23:24 crc 
kubenswrapper[4778]: I0318 09:23:24.505076 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-npjfc"] Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.512078 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-npjfc"] Mar 18 09:23:24 crc kubenswrapper[4778]: I0318 09:23:24.798602 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-p2knr"] Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.124866 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" event={"ID":"ae2c97b2-c699-443a-b3b3-ecb22de258c2","Type":"ContainerStarted","Data":"548920e3510d285dcfc8978dccbf1745880c114136f2e4109ae6de9a628d0437"} Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.146268 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5drhw" event={"ID":"0fb26926-fc81-4024-a0fa-2363d8703d72","Type":"ContainerStarted","Data":"a9c5d9789a0793c932c41c72d2058c14c6dc506dbd6a4bb8ed76c0353ce8bcc2"} Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.146337 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5drhw" event={"ID":"0fb26926-fc81-4024-a0fa-2363d8703d72","Type":"ContainerStarted","Data":"2609a43b731af1cf4cb19221492cd2a7688e79a1ecb7ef9c42072e9a7bc39519"} Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.154378 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7967bcbb45-bl6c8" event={"ID":"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2","Type":"ContainerStarted","Data":"42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec"} Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.154534 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7967bcbb45-bl6c8" podUID="56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" containerName="horizon-log" 
containerID="cri-o://6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a" gracePeriod=30 Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.154782 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7967bcbb45-bl6c8" podUID="56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" containerName="horizon" containerID="cri-o://42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec" gracePeriod=30 Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.173023 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-5drhw" podStartSLOduration=15.173000172 podStartE2EDuration="15.173000172s" podCreationTimestamp="2026-03-18 09:23:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:25.167514612 +0000 UTC m=+1271.742259452" watchObservedRunningTime="2026-03-18 09:23:25.173000172 +0000 UTC m=+1271.747745012" Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.182774 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-99c8bfc86-rldfg" event={"ID":"ea1f8a48-d595-4f4e-a740-0af5a26397a5","Type":"ContainerStarted","Data":"d66d99949ad5120620bffa75ea4a3e49aee89967a841972aa06a4a9867cff673"} Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.182822 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-99c8bfc86-rldfg" event={"ID":"ea1f8a48-d595-4f4e-a740-0af5a26397a5","Type":"ContainerStarted","Data":"c77fd4278a90c239273c01a79ef12824477ebf4a1fc89be85a96364b2e982560"} Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.186450 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-644f48df4-b7jhq" event={"ID":"e0a0a638-c445-4931-861e-d35704487c97","Type":"ContainerStarted","Data":"602f232dbc5842b7eda582b514327d9043c8239c9d2c5f6be5f9b82563f5c319"} Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 
09:23:25.186487 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-644f48df4-b7jhq" event={"ID":"e0a0a638-c445-4931-861e-d35704487c97","Type":"ContainerStarted","Data":"0fbce59dc7e1c7d23f86e5641e55e03088590e21a72f00c1960f4c85cfb15726"} Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.188340 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d609db91-e011-4ac4-91a6-a9ba51f3918e","Type":"ContainerStarted","Data":"7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372"} Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.198434 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7967bcbb45-bl6c8" podStartSLOduration=3.13577453 podStartE2EDuration="30.198418414s" podCreationTimestamp="2026-03-18 09:22:55 +0000 UTC" firstStartedPulling="2026-03-18 09:22:56.480425769 +0000 UTC m=+1243.055170609" lastFinishedPulling="2026-03-18 09:23:23.543069613 +0000 UTC m=+1270.117814493" observedRunningTime="2026-03-18 09:23:25.197491689 +0000 UTC m=+1271.772236539" watchObservedRunningTime="2026-03-18 09:23:25.198418414 +0000 UTC m=+1271.773163254" Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.472167 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cffc84f44-vtx7x"] Mar 18 09:23:25 crc kubenswrapper[4778]: I0318 09:23:25.784453 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.208784 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a490b75-6853-41f7-b5b3-46243c4c2166" path="/var/lib/kubelet/pods/4a490b75-6853-41f7-b5b3-46243c4c2166/volumes" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.217569 4778 generic.go:334] "Generic (PLEG): container finished" podID="ae2c97b2-c699-443a-b3b3-ecb22de258c2" 
containerID="a2a49acb877ac3d5291587410f4bdad35a27b8e7dc386fa78d21020a20cbe78c" exitCode=0 Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.217648 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" event={"ID":"ae2c97b2-c699-443a-b3b3-ecb22de258c2","Type":"ContainerDied","Data":"a2a49acb877ac3d5291587410f4bdad35a27b8e7dc386fa78d21020a20cbe78c"} Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.230600 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cffc84f44-vtx7x" event={"ID":"3d1d399f-3c89-4aaa-bba1-ce1a91358455","Type":"ContainerStarted","Data":"e444d25da091179b3622d7408ec0b6e7caa7c81b27414dca1d4252c8b3fb5441"} Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.234214 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-99c8bfc86-rldfg" event={"ID":"ea1f8a48-d595-4f4e-a740-0af5a26397a5","Type":"ContainerStarted","Data":"c97b0574c01538fa1c2084f0fe2463a0e870b56db44f51730acba68dbd0ebf25"} Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.245162 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-644f48df4-b7jhq" event={"ID":"e0a0a638-c445-4931-861e-d35704487c97","Type":"ContainerStarted","Data":"338c9caa4dcb4c9e2071a164b7da01c5e6507793b93b5cb80dcaa9454afdb9e2"} Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.289725 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-644f48df4-b7jhq" podStartSLOduration=24.28970075 podStartE2EDuration="24.28970075s" podCreationTimestamp="2026-03-18 09:23:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:26.27761755 +0000 UTC m=+1272.852362420" watchObservedRunningTime="2026-03-18 09:23:26.28970075 +0000 UTC m=+1272.864445610" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.317378 4778 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/horizon-99c8bfc86-rldfg" podStartSLOduration=24.317355063 podStartE2EDuration="24.317355063s" podCreationTimestamp="2026-03-18 09:23:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:26.30181366 +0000 UTC m=+1272.876558520" watchObservedRunningTime="2026-03-18 09:23:26.317355063 +0000 UTC m=+1272.892099893" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.412332 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-56b9647d87-2qhmh"] Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.413631 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.423496 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.423911 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.437609 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56b9647d87-2qhmh"] Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.519726 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-combined-ca-bundle\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.520115 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-public-tls-certs\") pod 
\"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.520144 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-ovndb-tls-certs\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.520170 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-config\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.520203 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb97b\" (UniqueName: \"kubernetes.io/projected/2b12c496-12d8-47e5-8cb7-134c3860368d-kube-api-access-qb97b\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.520259 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-httpd-config\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.520427 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-internal-tls-certs\") pod 
\"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.622531 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-combined-ca-bundle\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.622595 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-public-tls-certs\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.622633 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-ovndb-tls-certs\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.622667 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-config\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.622695 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb97b\" (UniqueName: \"kubernetes.io/projected/2b12c496-12d8-47e5-8cb7-134c3860368d-kube-api-access-qb97b\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " 
pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.622964 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-httpd-config\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.622993 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-internal-tls-certs\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.629850 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-public-tls-certs\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.630364 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-httpd-config\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.632007 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-ovndb-tls-certs\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.632528 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-internal-tls-certs\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.635000 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-combined-ca-bundle\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.638963 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-config\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.646871 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb97b\" (UniqueName: \"kubernetes.io/projected/2b12c496-12d8-47e5-8cb7-134c3860368d-kube-api-access-qb97b\") pod \"neutron-56b9647d87-2qhmh\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") " pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:26 crc kubenswrapper[4778]: I0318 09:23:26.887859 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:27 crc kubenswrapper[4778]: I0318 09:23:27.260727 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2p9jg" event={"ID":"20eafe8e-c0b9-4463-bc12-8c0cd0359968","Type":"ContainerStarted","Data":"43cd9dd19c1bcb6dd251052b9f5e2f4fd14b11fc891320649bdbfdb44d9ca171"} Mar 18 09:23:27 crc kubenswrapper[4778]: I0318 09:23:27.269241 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cffc84f44-vtx7x" event={"ID":"3d1d399f-3c89-4aaa-bba1-ce1a91358455","Type":"ContainerStarted","Data":"6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3"} Mar 18 09:23:27 crc kubenswrapper[4778]: I0318 09:23:27.269283 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cffc84f44-vtx7x" event={"ID":"3d1d399f-3c89-4aaa-bba1-ce1a91358455","Type":"ContainerStarted","Data":"5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827"} Mar 18 09:23:27 crc kubenswrapper[4778]: I0318 09:23:27.269691 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:27 crc kubenswrapper[4778]: I0318 09:23:27.273674 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d609db91-e011-4ac4-91a6-a9ba51f3918e","Type":"ContainerStarted","Data":"d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053"} Mar 18 09:23:27 crc kubenswrapper[4778]: I0318 09:23:27.283083 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" event={"ID":"ae2c97b2-c699-443a-b3b3-ecb22de258c2","Type":"ContainerStarted","Data":"14bdd0eaf1dbaf2dd412641a551329bb6a350b27bbb5a0fca3b8cd2009883565"} Mar 18 09:23:27 crc kubenswrapper[4778]: I0318 09:23:27.316354 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-2p9jg" podStartSLOduration=3.47383832 
podStartE2EDuration="34.316324664s" podCreationTimestamp="2026-03-18 09:22:53 +0000 UTC" firstStartedPulling="2026-03-18 09:22:55.93637278 +0000 UTC m=+1242.511117620" lastFinishedPulling="2026-03-18 09:23:26.778859124 +0000 UTC m=+1273.353603964" observedRunningTime="2026-03-18 09:23:27.282121272 +0000 UTC m=+1273.856866112" watchObservedRunningTime="2026-03-18 09:23:27.316324664 +0000 UTC m=+1273.891069504" Mar 18 09:23:27 crc kubenswrapper[4778]: I0318 09:23:27.317178 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-cffc84f44-vtx7x" podStartSLOduration=4.317170947 podStartE2EDuration="4.317170947s" podCreationTimestamp="2026-03-18 09:23:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:27.301729206 +0000 UTC m=+1273.876474056" watchObservedRunningTime="2026-03-18 09:23:27.317170947 +0000 UTC m=+1273.891915787" Mar 18 09:23:27 crc kubenswrapper[4778]: I0318 09:23:27.332041 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" podStartSLOduration=4.332020732 podStartE2EDuration="4.332020732s" podCreationTimestamp="2026-03-18 09:23:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:27.32793358 +0000 UTC m=+1273.902678430" watchObservedRunningTime="2026-03-18 09:23:27.332020732 +0000 UTC m=+1273.906765572" Mar 18 09:23:27 crc kubenswrapper[4778]: I0318 09:23:27.761119 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56b9647d87-2qhmh"] Mar 18 09:23:27 crc kubenswrapper[4778]: W0318 09:23:27.791883 4778 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b12c496_12d8_47e5_8cb7_134c3860368d.slice/crio-e8df8ad489699b62f3719756fb074fcb6bc53de4a98faae3059fd18d69bd24b5 WatchSource:0}: Error finding container e8df8ad489699b62f3719756fb074fcb6bc53de4a98faae3059fd18d69bd24b5: Status 404 returned error can't find the container with id e8df8ad489699b62f3719756fb074fcb6bc53de4a98faae3059fd18d69bd24b5 Mar 18 09:23:28 crc kubenswrapper[4778]: I0318 09:23:28.334182 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56b9647d87-2qhmh" event={"ID":"2b12c496-12d8-47e5-8cb7-134c3860368d","Type":"ContainerStarted","Data":"1b28dfe7d2202acd51cd4556f1c17c3749cac234f100a94b65f9772c04f97b62"} Mar 18 09:23:28 crc kubenswrapper[4778]: I0318 09:23:28.334728 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:28 crc kubenswrapper[4778]: I0318 09:23:28.334743 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56b9647d87-2qhmh" event={"ID":"2b12c496-12d8-47e5-8cb7-134c3860368d","Type":"ContainerStarted","Data":"e8df8ad489699b62f3719756fb074fcb6bc53de4a98faae3059fd18d69bd24b5"} Mar 18 09:23:28 crc kubenswrapper[4778]: I0318 09:23:28.566498 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-554567b4f7-npjfc" podUID="4a490b75-6853-41f7-b5b3-46243c4c2166" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: i/o timeout" Mar 18 09:23:29 crc kubenswrapper[4778]: I0318 09:23:29.346254 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56b9647d87-2qhmh" event={"ID":"2b12c496-12d8-47e5-8cb7-134c3860368d","Type":"ContainerStarted","Data":"97b1e09d6053130fb7ed8cd3a7a1b57d88faf0c15aceaa656cf6b72bf7bed517"} Mar 18 09:23:29 crc kubenswrapper[4778]: I0318 09:23:29.346962 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:23:29 crc kubenswrapper[4778]: I0318 09:23:29.378011 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-56b9647d87-2qhmh" podStartSLOduration=3.377991952 podStartE2EDuration="3.377991952s" podCreationTimestamp="2026-03-18 09:23:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:29.370771096 +0000 UTC m=+1275.945515936" watchObservedRunningTime="2026-03-18 09:23:29.377991952 +0000 UTC m=+1275.952736792" Mar 18 09:23:30 crc kubenswrapper[4778]: I0318 09:23:30.358304 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-98prp" event={"ID":"4135fc20-df28-4f8d-b244-aedd5ed57cc2","Type":"ContainerStarted","Data":"f3f45a8c98da9f26aefaed6f12fcf1ccfeb1c540f04357427bf8f13c5c12ad79"} Mar 18 09:23:30 crc kubenswrapper[4778]: I0318 09:23:30.373720 4778 generic.go:334] "Generic (PLEG): container finished" podID="20eafe8e-c0b9-4463-bc12-8c0cd0359968" containerID="43cd9dd19c1bcb6dd251052b9f5e2f4fd14b11fc891320649bdbfdb44d9ca171" exitCode=0 Mar 18 09:23:30 crc kubenswrapper[4778]: I0318 09:23:30.373804 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2p9jg" event={"ID":"20eafe8e-c0b9-4463-bc12-8c0cd0359968","Type":"ContainerDied","Data":"43cd9dd19c1bcb6dd251052b9f5e2f4fd14b11fc891320649bdbfdb44d9ca171"} Mar 18 09:23:30 crc kubenswrapper[4778]: I0318 09:23:30.388262 4778 generic.go:334] "Generic (PLEG): container finished" podID="0fb26926-fc81-4024-a0fa-2363d8703d72" containerID="a9c5d9789a0793c932c41c72d2058c14c6dc506dbd6a4bb8ed76c0353ce8bcc2" exitCode=0 Mar 18 09:23:30 crc kubenswrapper[4778]: I0318 09:23:30.389038 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5drhw" 
event={"ID":"0fb26926-fc81-4024-a0fa-2363d8703d72","Type":"ContainerDied","Data":"a9c5d9789a0793c932c41c72d2058c14c6dc506dbd6a4bb8ed76c0353ce8bcc2"} Mar 18 09:23:30 crc kubenswrapper[4778]: I0318 09:23:30.389697 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-98prp" podStartSLOduration=3.631364581 podStartE2EDuration="37.3896886s" podCreationTimestamp="2026-03-18 09:22:53 +0000 UTC" firstStartedPulling="2026-03-18 09:22:55.961414632 +0000 UTC m=+1242.536159472" lastFinishedPulling="2026-03-18 09:23:29.719738651 +0000 UTC m=+1276.294483491" observedRunningTime="2026-03-18 09:23:30.388655382 +0000 UTC m=+1276.963400242" watchObservedRunningTime="2026-03-18 09:23:30.3896886 +0000 UTC m=+1276.964433440" Mar 18 09:23:32 crc kubenswrapper[4778]: I0318 09:23:32.783286 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:32 crc kubenswrapper[4778]: I0318 09:23:32.784363 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:32 crc kubenswrapper[4778]: I0318 09:23:32.860066 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:32 crc kubenswrapper[4778]: I0318 09:23:32.860137 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:34 crc kubenswrapper[4778]: I0318 09:23:34.205010 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:34 crc kubenswrapper[4778]: I0318 09:23:34.317921 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-nz24n"] Mar 18 09:23:34 crc kubenswrapper[4778]: I0318 09:23:34.318321 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" 
podUID="0ccbef26-8b5f-4b83-885f-3c074207eb73" containerName="dnsmasq-dns" containerID="cri-o://b36c483c2b509d6056dce716a1f9836c549822a2372786d1c313cd9da431b608" gracePeriod=10 Mar 18 09:23:34 crc kubenswrapper[4778]: I0318 09:23:34.450061 4778 generic.go:334] "Generic (PLEG): container finished" podID="4135fc20-df28-4f8d-b244-aedd5ed57cc2" containerID="f3f45a8c98da9f26aefaed6f12fcf1ccfeb1c540f04357427bf8f13c5c12ad79" exitCode=0 Mar 18 09:23:34 crc kubenswrapper[4778]: I0318 09:23:34.450245 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-98prp" event={"ID":"4135fc20-df28-4f8d-b244-aedd5ed57cc2","Type":"ContainerDied","Data":"f3f45a8c98da9f26aefaed6f12fcf1ccfeb1c540f04357427bf8f13c5c12ad79"} Mar 18 09:23:35 crc kubenswrapper[4778]: I0318 09:23:35.465908 4778 generic.go:334] "Generic (PLEG): container finished" podID="0ccbef26-8b5f-4b83-885f-3c074207eb73" containerID="b36c483c2b509d6056dce716a1f9836c549822a2372786d1c313cd9da431b608" exitCode=0 Mar 18 09:23:35 crc kubenswrapper[4778]: I0318 09:23:35.465990 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" event={"ID":"0ccbef26-8b5f-4b83-885f-3c074207eb73","Type":"ContainerDied","Data":"b36c483c2b509d6056dce716a1f9836c549822a2372786d1c313cd9da431b608"} Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.665747 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2p9jg" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.684540 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-98prp" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.731562 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-scripts\") pod \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.731622 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-combined-ca-bundle\") pod \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.731661 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4135fc20-df28-4f8d-b244-aedd5ed57cc2-combined-ca-bundle\") pod \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\" (UID: \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\") " Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.731739 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-config-data\") pod \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.731809 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7f7g\" (UniqueName: \"kubernetes.io/projected/20eafe8e-c0b9-4463-bc12-8c0cd0359968-kube-api-access-s7f7g\") pod \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.731852 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4135fc20-df28-4f8d-b244-aedd5ed57cc2-db-sync-config-data\") pod \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\" (UID: \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\") " Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.731884 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gnxr\" (UniqueName: \"kubernetes.io/projected/4135fc20-df28-4f8d-b244-aedd5ed57cc2-kube-api-access-7gnxr\") pod \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\" (UID: \"4135fc20-df28-4f8d-b244-aedd5ed57cc2\") " Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.731913 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20eafe8e-c0b9-4463-bc12-8c0cd0359968-logs\") pod \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\" (UID: \"20eafe8e-c0b9-4463-bc12-8c0cd0359968\") " Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.733046 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20eafe8e-c0b9-4463-bc12-8c0cd0359968-logs" (OuterVolumeSpecName: "logs") pod "20eafe8e-c0b9-4463-bc12-8c0cd0359968" (UID: "20eafe8e-c0b9-4463-bc12-8c0cd0359968"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.745274 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4135fc20-df28-4f8d-b244-aedd5ed57cc2-kube-api-access-7gnxr" (OuterVolumeSpecName: "kube-api-access-7gnxr") pod "4135fc20-df28-4f8d-b244-aedd5ed57cc2" (UID: "4135fc20-df28-4f8d-b244-aedd5ed57cc2"). InnerVolumeSpecName "kube-api-access-7gnxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.746739 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4135fc20-df28-4f8d-b244-aedd5ed57cc2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4135fc20-df28-4f8d-b244-aedd5ed57cc2" (UID: "4135fc20-df28-4f8d-b244-aedd5ed57cc2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.758422 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-scripts" (OuterVolumeSpecName: "scripts") pod "20eafe8e-c0b9-4463-bc12-8c0cd0359968" (UID: "20eafe8e-c0b9-4463-bc12-8c0cd0359968"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.773189 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20eafe8e-c0b9-4463-bc12-8c0cd0359968-kube-api-access-s7f7g" (OuterVolumeSpecName: "kube-api-access-s7f7g") pod "20eafe8e-c0b9-4463-bc12-8c0cd0359968" (UID: "20eafe8e-c0b9-4463-bc12-8c0cd0359968"). InnerVolumeSpecName "kube-api-access-s7f7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.789092 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4135fc20-df28-4f8d-b244-aedd5ed57cc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4135fc20-df28-4f8d-b244-aedd5ed57cc2" (UID: "4135fc20-df28-4f8d-b244-aedd5ed57cc2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.796407 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-config-data" (OuterVolumeSpecName: "config-data") pod "20eafe8e-c0b9-4463-bc12-8c0cd0359968" (UID: "20eafe8e-c0b9-4463-bc12-8c0cd0359968"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.837515 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.837550 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4135fc20-df28-4f8d-b244-aedd5ed57cc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.837561 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.837570 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7f7g\" (UniqueName: \"kubernetes.io/projected/20eafe8e-c0b9-4463-bc12-8c0cd0359968-kube-api-access-s7f7g\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.837582 4778 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4135fc20-df28-4f8d-b244-aedd5ed57cc2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.837591 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gnxr\" (UniqueName: 
\"kubernetes.io/projected/4135fc20-df28-4f8d-b244-aedd5ed57cc2-kube-api-access-7gnxr\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.837599 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20eafe8e-c0b9-4463-bc12-8c0cd0359968-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.853089 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20eafe8e-c0b9-4463-bc12-8c0cd0359968" (UID: "20eafe8e-c0b9-4463-bc12-8c0cd0359968"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.854842 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.946376 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20eafe8e-c0b9-4463-bc12-8c0cd0359968-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:36 crc kubenswrapper[4778]: I0318 09:23:36.964009 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.052698 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lwww\" (UniqueName: \"kubernetes.io/projected/0ccbef26-8b5f-4b83-885f-3c074207eb73-kube-api-access-4lwww\") pod \"0ccbef26-8b5f-4b83-885f-3c074207eb73\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.052753 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-ovsdbserver-nb\") pod \"0ccbef26-8b5f-4b83-885f-3c074207eb73\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.052794 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-ovsdbserver-sb\") pod \"0ccbef26-8b5f-4b83-885f-3c074207eb73\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.052884 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-credential-keys\") pod \"0fb26926-fc81-4024-a0fa-2363d8703d72\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.052914 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-scripts\") pod \"0fb26926-fc81-4024-a0fa-2363d8703d72\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.053256 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-dns-svc\") pod \"0ccbef26-8b5f-4b83-885f-3c074207eb73\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.053290 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-fernet-keys\") pod \"0fb26926-fc81-4024-a0fa-2363d8703d72\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.053331 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-config-data\") pod \"0fb26926-fc81-4024-a0fa-2363d8703d72\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.053365 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfjxd\" (UniqueName: \"kubernetes.io/projected/0fb26926-fc81-4024-a0fa-2363d8703d72-kube-api-access-cfjxd\") pod \"0fb26926-fc81-4024-a0fa-2363d8703d72\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.053386 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-config\") pod \"0ccbef26-8b5f-4b83-885f-3c074207eb73\" (UID: \"0ccbef26-8b5f-4b83-885f-3c074207eb73\") " Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.053400 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-combined-ca-bundle\") pod \"0fb26926-fc81-4024-a0fa-2363d8703d72\" (UID: \"0fb26926-fc81-4024-a0fa-2363d8703d72\") " Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.060130 
4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-scripts" (OuterVolumeSpecName: "scripts") pod "0fb26926-fc81-4024-a0fa-2363d8703d72" (UID: "0fb26926-fc81-4024-a0fa-2363d8703d72"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.060892 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ccbef26-8b5f-4b83-885f-3c074207eb73-kube-api-access-4lwww" (OuterVolumeSpecName: "kube-api-access-4lwww") pod "0ccbef26-8b5f-4b83-885f-3c074207eb73" (UID: "0ccbef26-8b5f-4b83-885f-3c074207eb73"). InnerVolumeSpecName "kube-api-access-4lwww". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.063979 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fb26926-fc81-4024-a0fa-2363d8703d72-kube-api-access-cfjxd" (OuterVolumeSpecName: "kube-api-access-cfjxd") pod "0fb26926-fc81-4024-a0fa-2363d8703d72" (UID: "0fb26926-fc81-4024-a0fa-2363d8703d72"). InnerVolumeSpecName "kube-api-access-cfjxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.079001 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0fb26926-fc81-4024-a0fa-2363d8703d72" (UID: "0fb26926-fc81-4024-a0fa-2363d8703d72"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.079189 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0fb26926-fc81-4024-a0fa-2363d8703d72" (UID: "0fb26926-fc81-4024-a0fa-2363d8703d72"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.093899 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0fb26926-fc81-4024-a0fa-2363d8703d72" (UID: "0fb26926-fc81-4024-a0fa-2363d8703d72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.094036 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-config-data" (OuterVolumeSpecName: "config-data") pod "0fb26926-fc81-4024-a0fa-2363d8703d72" (UID: "0fb26926-fc81-4024-a0fa-2363d8703d72"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.113384 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0ccbef26-8b5f-4b83-885f-3c074207eb73" (UID: "0ccbef26-8b5f-4b83-885f-3c074207eb73"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.114145 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0ccbef26-8b5f-4b83-885f-3c074207eb73" (UID: "0ccbef26-8b5f-4b83-885f-3c074207eb73"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.114897 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0ccbef26-8b5f-4b83-885f-3c074207eb73" (UID: "0ccbef26-8b5f-4b83-885f-3c074207eb73"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.116369 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-config" (OuterVolumeSpecName: "config") pod "0ccbef26-8b5f-4b83-885f-3c074207eb73" (UID: "0ccbef26-8b5f-4b83-885f-3c074207eb73"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.154486 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.154529 4778 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.154541 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.154552 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfjxd\" (UniqueName: \"kubernetes.io/projected/0fb26926-fc81-4024-a0fa-2363d8703d72-kube-api-access-cfjxd\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.154608 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.154620 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.154632 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lwww\" (UniqueName: \"kubernetes.io/projected/0ccbef26-8b5f-4b83-885f-3c074207eb73-kube-api-access-4lwww\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.154641 4778 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.154649 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ccbef26-8b5f-4b83-885f-3c074207eb73-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.154660 4778 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.154668 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fb26926-fc81-4024-a0fa-2363d8703d72-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.488886 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" event={"ID":"0ccbef26-8b5f-4b83-885f-3c074207eb73","Type":"ContainerDied","Data":"198149700e09863ba5824c3da59cdfa277dbef6c16734a23ecc5c937b970d9dc"} Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.488946 4778 scope.go:117] "RemoveContainer" containerID="b36c483c2b509d6056dce716a1f9836c549822a2372786d1c313cd9da431b608" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.489087 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-nz24n" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.498790 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d609db91-e011-4ac4-91a6-a9ba51f3918e","Type":"ContainerStarted","Data":"19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4"} Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.502523 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2p9jg" event={"ID":"20eafe8e-c0b9-4463-bc12-8c0cd0359968","Type":"ContainerDied","Data":"f3040164c08e605fe47fcb403b7a54ffe6495d3760996218e88d3582673d242b"} Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.502573 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2p9jg" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.502577 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3040164c08e605fe47fcb403b7a54ffe6495d3760996218e88d3582673d242b" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.507664 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5drhw" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.507694 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5drhw" event={"ID":"0fb26926-fc81-4024-a0fa-2363d8703d72","Type":"ContainerDied","Data":"2609a43b731af1cf4cb19221492cd2a7688e79a1ecb7ef9c42072e9a7bc39519"} Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.507739 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2609a43b731af1cf4cb19221492cd2a7688e79a1ecb7ef9c42072e9a7bc39519" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.510183 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-98prp" event={"ID":"4135fc20-df28-4f8d-b244-aedd5ed57cc2","Type":"ContainerDied","Data":"49a7b38747f8562cd7a4e6e050cec3d81202329adfd053459a59103fb40b4766"} Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.510244 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49a7b38747f8562cd7a4e6e050cec3d81202329adfd053459a59103fb40b4766" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.510277 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-98prp" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.514619 4778 scope.go:117] "RemoveContainer" containerID="2b8a3d906ce9aeea7f7d488e85e8f58f0a3291f6ef7ed6a10151c8afd47df1d7" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.536054 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-nz24n"] Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.544564 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-nz24n"] Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.849672 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7b57877776-ssjzt"] Mar 18 09:23:37 crc kubenswrapper[4778]: E0318 09:23:37.852133 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ccbef26-8b5f-4b83-885f-3c074207eb73" containerName="dnsmasq-dns" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.852260 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ccbef26-8b5f-4b83-885f-3c074207eb73" containerName="dnsmasq-dns" Mar 18 09:23:37 crc kubenswrapper[4778]: E0318 09:23:37.852344 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20eafe8e-c0b9-4463-bc12-8c0cd0359968" containerName="placement-db-sync" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.852403 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="20eafe8e-c0b9-4463-bc12-8c0cd0359968" containerName="placement-db-sync" Mar 18 09:23:37 crc kubenswrapper[4778]: E0318 09:23:37.852459 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4135fc20-df28-4f8d-b244-aedd5ed57cc2" containerName="barbican-db-sync" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.852509 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4135fc20-df28-4f8d-b244-aedd5ed57cc2" containerName="barbican-db-sync" Mar 18 09:23:37 crc kubenswrapper[4778]: E0318 09:23:37.852568 4778 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ccbef26-8b5f-4b83-885f-3c074207eb73" containerName="init" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.852617 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ccbef26-8b5f-4b83-885f-3c074207eb73" containerName="init" Mar 18 09:23:37 crc kubenswrapper[4778]: E0318 09:23:37.852670 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fb26926-fc81-4024-a0fa-2363d8703d72" containerName="keystone-bootstrap" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.852716 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fb26926-fc81-4024-a0fa-2363d8703d72" containerName="keystone-bootstrap" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.852941 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ccbef26-8b5f-4b83-885f-3c074207eb73" containerName="dnsmasq-dns" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.853017 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="20eafe8e-c0b9-4463-bc12-8c0cd0359968" containerName="placement-db-sync" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.853085 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="4135fc20-df28-4f8d-b244-aedd5ed57cc2" containerName="barbican-db-sync" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.853147 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fb26926-fc81-4024-a0fa-2363d8703d72" containerName="keystone-bootstrap" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.854424 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.856214 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-4lsnj" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.858949 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.871772 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7b57877776-ssjzt"] Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.873991 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.874362 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.875842 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.880303 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c5e9a8c-649d-4d20-b867-bb0f801d329d-logs\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.880349 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-scripts\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.880388 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-combined-ca-bundle\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.880421 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-internal-tls-certs\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.880443 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg9cl\" (UniqueName: \"kubernetes.io/projected/9c5e9a8c-649d-4d20-b867-bb0f801d329d-kube-api-access-jg9cl\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.880477 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-public-tls-certs\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.880583 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-config-data\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.982090 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-config-data\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.982928 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c5e9a8c-649d-4d20-b867-bb0f801d329d-logs\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.982965 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-scripts\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.983023 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-combined-ca-bundle\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.983067 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-internal-tls-certs\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.983095 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg9cl\" (UniqueName: 
\"kubernetes.io/projected/9c5e9a8c-649d-4d20-b867-bb0f801d329d-kube-api-access-jg9cl\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.983138 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-public-tls-certs\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.986670 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c5e9a8c-649d-4d20-b867-bb0f801d329d-logs\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.987870 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-combined-ca-bundle\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.997925 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-internal-tls-certs\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:37 crc kubenswrapper[4778]: I0318 09:23:37.999831 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-scripts\") pod \"placement-7b57877776-ssjzt\" (UID: 
\"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.000530 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-75996d8fd4-jhtd2"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.002429 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.025776 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-75996d8fd4-jhtd2"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.035174 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-public-tls-certs\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.035684 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.035912 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vcgh4" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.035950 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.036084 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.036138 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-config-data\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 
09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.036826 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.037248 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.040948 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg9cl\" (UniqueName: \"kubernetes.io/projected/9c5e9a8c-649d-4d20-b867-bb0f801d329d-kube-api-access-jg9cl\") pod \"placement-7b57877776-ssjzt\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.094105 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-scripts\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.094155 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-config-data\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.094224 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-fernet-keys\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.094250 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-internal-tls-certs\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.094355 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-combined-ca-bundle\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.094385 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-public-tls-certs\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.094453 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44cx2\" (UniqueName: \"kubernetes.io/projected/4c045639-00d0-4ba6-9d75-c67934521e29-kube-api-access-44cx2\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.094474 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-credential-keys\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 
09:23:38.134663 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-679c749775-5x4dx"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.159696 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-679c749775-5x4dx"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.159885 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.187874 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.188392 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-66r47" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.188567 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.195289 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-scripts\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.195355 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-combined-ca-bundle\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.195377 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-config-data\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.195605 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2290443f-8279-4b62-9d3d-bab0be1d7af5-logs\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.195637 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-fernet-keys\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.197220 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-internal-tls-certs\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.197640 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-config-data\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.198000 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-combined-ca-bundle\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.198027 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-public-tls-certs\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.198262 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksngs\" (UniqueName: \"kubernetes.io/projected/2290443f-8279-4b62-9d3d-bab0be1d7af5-kube-api-access-ksngs\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.198285 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-config-data-custom\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.198501 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44cx2\" (UniqueName: \"kubernetes.io/projected/4c045639-00d0-4ba6-9d75-c67934521e29-kube-api-access-44cx2\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.198756 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-credential-keys\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.214722 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.228102 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-config-data\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.228646 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-internal-tls-certs\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.243185 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-scripts\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.244854 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-public-tls-certs\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.244865 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-credential-keys\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.249338 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-combined-ca-bundle\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.251175 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4c045639-00d0-4ba6-9d75-c67934521e29-fernet-keys\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.259898 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44cx2\" (UniqueName: \"kubernetes.io/projected/4c045639-00d0-4ba6-9d75-c67934521e29-kube-api-access-44cx2\") pod \"keystone-75996d8fd4-jhtd2\" (UID: \"4c045639-00d0-4ba6-9d75-c67934521e29\") " pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.262159 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ccbef26-8b5f-4b83-885f-3c074207eb73" path="/var/lib/kubelet/pods/0ccbef26-8b5f-4b83-885f-3c074207eb73/volumes" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.275346 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.297262 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.298045 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.300102 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-combined-ca-bundle\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.313125 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.300147 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2290443f-8279-4b62-9d3d-bab0be1d7af5-logs\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.313657 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-config-data\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.313850 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksngs\" (UniqueName: \"kubernetes.io/projected/2290443f-8279-4b62-9d3d-bab0be1d7af5-kube-api-access-ksngs\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " 
pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.313887 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-config-data-custom\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.313967 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2290443f-8279-4b62-9d3d-bab0be1d7af5-logs\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.323842 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-config-data\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.333724 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-combined-ca-bundle\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.335333 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-869f779d85-2zrdr"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.335820 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-config-data-custom\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.336909 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.375754 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksngs\" (UniqueName: \"kubernetes.io/projected/2290443f-8279-4b62-9d3d-bab0be1d7af5-kube-api-access-ksngs\") pod \"barbican-worker-679c749775-5x4dx\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.421322 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-2zrdr"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.422024 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-config\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.422058 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-config-data\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.422094 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-ovsdbserver-nb\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.422128 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-ovsdbserver-sb\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.422163 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-combined-ca-bundle\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.422184 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-dns-svc\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.422221 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6bt4\" (UniqueName: \"kubernetes.io/projected/3bdaed74-310d-4589-9a14-8c862f05d378-kube-api-access-h6bt4\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.422255 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-config-data-custom\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.422282 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrb4m\" (UniqueName: \"kubernetes.io/projected/1cc8be70-e875-4307-89a5-9cbb0d105b86-kube-api-access-nrb4m\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.422304 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bdaed74-310d-4589-9a14-8c862f05d378-logs\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.422456 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.520659 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.524916 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-config-data-custom\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.525008 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrb4m\" (UniqueName: \"kubernetes.io/projected/1cc8be70-e875-4307-89a5-9cbb0d105b86-kube-api-access-nrb4m\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.525048 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bdaed74-310d-4589-9a14-8c862f05d378-logs\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.525112 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-config\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.525141 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-config-data\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: 
\"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.525189 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-ovsdbserver-nb\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.525264 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-ovsdbserver-sb\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.525324 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-combined-ca-bundle\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.525358 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-dns-svc\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.525396 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6bt4\" (UniqueName: \"kubernetes.io/projected/3bdaed74-310d-4589-9a14-8c862f05d378-kube-api-access-h6bt4\") pod 
\"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.527819 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-ovsdbserver-nb\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.528508 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-ovsdbserver-sb\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.528538 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bdaed74-310d-4589-9a14-8c862f05d378-logs\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.529342 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-dns-svc\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.529998 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-config\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " 
pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.542278 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5897f75bc4-n8b2b"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.543959 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.552369 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.575689 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-config-data-custom\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.586627 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5897f75bc4-n8b2b"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.590572 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-config-data\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.596567 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-combined-ca-bundle\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc 
kubenswrapper[4778]: I0318 09:23:38.611255 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6bt4\" (UniqueName: \"kubernetes.io/projected/3bdaed74-310d-4589-9a14-8c862f05d378-kube-api-access-h6bt4\") pod \"barbican-keystone-listener-7c8bdb6c9d-78rbg\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.611600 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrb4m\" (UniqueName: \"kubernetes.io/projected/1cc8be70-e875-4307-89a5-9cbb0d105b86-kube-api-access-nrb4m\") pod \"dnsmasq-dns-869f779d85-2zrdr\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.639319 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.710755 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.731249 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-769d964c9f-nxhk2"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.732941 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.752832 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-config-data\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.753145 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-combined-ca-bundle\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.753331 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-logs\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.753511 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbwbt\" (UniqueName: \"kubernetes.io/projected/d5e59fa4-b6e8-4091-aedf-46c624304111-kube-api-access-lbwbt\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.753632 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-config-data-custom\") 
pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.753734 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-config-data\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.753829 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29zfd\" (UniqueName: \"kubernetes.io/projected/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-kube-api-access-29zfd\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.753922 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5e59fa4-b6e8-4091-aedf-46c624304111-logs\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.754038 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-config-data-custom\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.754135 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-combined-ca-bundle\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.803339 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-8bc77f476-tw7vd"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.804912 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.824024 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-769d964c9f-nxhk2"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.862971 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8bc77f476-tw7vd"] Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881024 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a006670-1a48-4421-8471-dd961c0e1d4c-logs\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881095 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-logs\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881175 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsq9c\" (UniqueName: 
\"kubernetes.io/projected/3a006670-1a48-4421-8471-dd961c0e1d4c-kube-api-access-xsq9c\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881218 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbwbt\" (UniqueName: \"kubernetes.io/projected/d5e59fa4-b6e8-4091-aedf-46c624304111-kube-api-access-lbwbt\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881270 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-config-data-custom\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881314 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-config-data\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881349 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29zfd\" (UniqueName: \"kubernetes.io/projected/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-kube-api-access-29zfd\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881375 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5e59fa4-b6e8-4091-aedf-46c624304111-logs\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881432 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-config-data-custom\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881473 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-combined-ca-bundle\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881508 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a006670-1a48-4421-8471-dd961c0e1d4c-combined-ca-bundle\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881571 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-config-data\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881649 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a006670-1a48-4421-8471-dd961c0e1d4c-config-data\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881674 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a006670-1a48-4421-8471-dd961c0e1d4c-config-data-custom\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.881714 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-combined-ca-bundle\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.883513 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-logs\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.887579 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5e59fa4-b6e8-4091-aedf-46c624304111-logs\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.911428 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-config-data-custom\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.912769 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-combined-ca-bundle\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.926865 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-combined-ca-bundle\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.927893 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-config-data-custom\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.929136 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbwbt\" (UniqueName: \"kubernetes.io/projected/d5e59fa4-b6e8-4091-aedf-46c624304111-kube-api-access-lbwbt\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.958792 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-config-data\") pod \"barbican-api-5897f75bc4-n8b2b\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") " pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.961980 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-config-data\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:38 crc kubenswrapper[4778]: I0318 09:23:38.962614 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29zfd\" (UniqueName: \"kubernetes.io/projected/eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b-kube-api-access-29zfd\") pod \"barbican-worker-769d964c9f-nxhk2\" (UID: \"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b\") " pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.000026 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a006670-1a48-4421-8471-dd961c0e1d4c-combined-ca-bundle\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.000333 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a006670-1a48-4421-8471-dd961c0e1d4c-config-data\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.000403 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/3a006670-1a48-4421-8471-dd961c0e1d4c-config-data-custom\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.000586 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a006670-1a48-4421-8471-dd961c0e1d4c-logs\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.000734 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsq9c\" (UniqueName: \"kubernetes.io/projected/3a006670-1a48-4421-8471-dd961c0e1d4c-kube-api-access-xsq9c\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.010842 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a006670-1a48-4421-8471-dd961c0e1d4c-combined-ca-bundle\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.010940 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a006670-1a48-4421-8471-dd961c0e1d4c-logs\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.019325 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a006670-1a48-4421-8471-dd961c0e1d4c-config-data-custom\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.020743 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a006670-1a48-4421-8471-dd961c0e1d4c-config-data\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.023892 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsq9c\" (UniqueName: \"kubernetes.io/projected/3a006670-1a48-4421-8471-dd961c0e1d4c-kube-api-access-xsq9c\") pod \"barbican-keystone-listener-8bc77f476-tw7vd\" (UID: \"3a006670-1a48-4421-8471-dd961c0e1d4c\") " pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.025358 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-df89bff66-xp7n4"] Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.027113 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-df89bff66-xp7n4" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.036665 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-df89bff66-xp7n4"] Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.112571 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.163886 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.164322 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-769d964c9f-nxhk2" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.210143 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-config-data-custom\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.210309 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm7dv\" (UniqueName: \"kubernetes.io/projected/48e51d3c-7f9f-4196-b708-c28c0e7477fa-kube-api-access-qm7dv\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.210623 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-config-data\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.210717 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-combined-ca-bundle\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.210773 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48e51d3c-7f9f-4196-b708-c28c0e7477fa-logs\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.314331 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-config-data\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.314532 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-combined-ca-bundle\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.314600 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48e51d3c-7f9f-4196-b708-c28c0e7477fa-logs\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.314712 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-config-data-custom\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.314782 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-qm7dv\" (UniqueName: \"kubernetes.io/projected/48e51d3c-7f9f-4196-b708-c28c0e7477fa-kube-api-access-qm7dv\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.315087 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48e51d3c-7f9f-4196-b708-c28c0e7477fa-logs\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.335840 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-combined-ca-bundle\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.336934 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-config-data\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.346431 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-config-data-custom\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.352105 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm7dv\" (UniqueName: 
\"kubernetes.io/projected/48e51d3c-7f9f-4196-b708-c28c0e7477fa-kube-api-access-qm7dv\") pod \"barbican-api-df89bff66-xp7n4\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") " pod="openstack/barbican-api-df89bff66-xp7n4" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.368633 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-df89bff66-xp7n4" Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.481217 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7b57877776-ssjzt"] Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.678621 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-75996d8fd4-jhtd2"] Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.770510 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jb4ss" event={"ID":"fba399d9-71ac-41c3-912f-32ccc7fc6190","Type":"ContainerStarted","Data":"b5325ab7bf5fcc801abe0c67c554c5ab72e440e7503aafe03c45a398c6a12432"} Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.778304 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b57877776-ssjzt" event={"ID":"9c5e9a8c-649d-4d20-b867-bb0f801d329d","Type":"ContainerStarted","Data":"dcaa7760f0d0e632c657a22054c5f006fe1f82143f10b41d8d1cb108f3a1621b"} Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.785932 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-75996d8fd4-jhtd2" event={"ID":"4c045639-00d0-4ba6-9d75-c67934521e29","Type":"ContainerStarted","Data":"7b8cbb9769a8a5897515734bd6020fdbe13f12d735c496630fe08f1100290cca"} Mar 18 09:23:39 crc kubenswrapper[4778]: I0318 09:23:39.805945 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-jb4ss" podStartSLOduration=4.984078028 podStartE2EDuration="46.805921381s" podCreationTimestamp="2026-03-18 09:22:53 +0000 UTC" firstStartedPulling="2026-03-18 
09:22:55.883754946 +0000 UTC m=+1242.458499786" lastFinishedPulling="2026-03-18 09:23:37.705598299 +0000 UTC m=+1284.280343139" observedRunningTime="2026-03-18 09:23:39.805404096 +0000 UTC m=+1286.380148936" watchObservedRunningTime="2026-03-18 09:23:39.805921381 +0000 UTC m=+1286.380666221" Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.339383 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-2zrdr"] Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.352279 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-679c749775-5x4dx"] Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.383163 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg"] Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.438997 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5897f75bc4-n8b2b"] Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.463854 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-769d964c9f-nxhk2"] Mar 18 09:23:40 crc kubenswrapper[4778]: W0318 09:23:40.464809 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5e59fa4_b6e8_4091_aedf_46c624304111.slice/crio-59137511add332c1310983740cad93329cdff92e5782f78a0a877c931d4f0027 WatchSource:0}: Error finding container 59137511add332c1310983740cad93329cdff92e5782f78a0a877c931d4f0027: Status 404 returned error can't find the container with id 59137511add332c1310983740cad93329cdff92e5782f78a0a877c931d4f0027 Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.482834 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-df89bff66-xp7n4"] Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.641079 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-keystone-listener-8bc77f476-tw7vd"] Mar 18 09:23:40 crc kubenswrapper[4778]: W0318 09:23:40.679504 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a006670_1a48_4421_8471_dd961c0e1d4c.slice/crio-046678c0e7673a52b9c1716d4f49b4d9a4c9e6c696c20d2a4fec81f855b0e29a WatchSource:0}: Error finding container 046678c0e7673a52b9c1716d4f49b4d9a4c9e6c696c20d2a4fec81f855b0e29a: Status 404 returned error can't find the container with id 046678c0e7673a52b9c1716d4f49b4d9a4c9e6c696c20d2a4fec81f855b0e29a Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.818223 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-679c749775-5x4dx" event={"ID":"2290443f-8279-4b62-9d3d-bab0be1d7af5","Type":"ContainerStarted","Data":"44a3e56960042fea2cef1afef1f593fd3c5991c7f2d6bb676b183bc24c3832e2"} Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.822144 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-75996d8fd4-jhtd2" event={"ID":"4c045639-00d0-4ba6-9d75-c67934521e29","Type":"ContainerStarted","Data":"0243bfbf668a6a163b2df6e7d0a281e7c7cec943948afd3c039de9ffc7f7f5fc"} Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.823565 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.828114 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-df89bff66-xp7n4" event={"ID":"48e51d3c-7f9f-4196-b708-c28c0e7477fa","Type":"ContainerStarted","Data":"b165bcb9da7478d0edb4f2c1e6142d197b4e354bbdd69e9690efc132cc26e106"} Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.835747 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" 
event={"ID":"3bdaed74-310d-4589-9a14-8c862f05d378","Type":"ContainerStarted","Data":"936e07a4b38cfd1b5b28cfe412a7dc4ba1163eceb3ae2617b1ff1b9e372215e3"} Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.852615 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-75996d8fd4-jhtd2" podStartSLOduration=3.852598651 podStartE2EDuration="3.852598651s" podCreationTimestamp="2026-03-18 09:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:40.84964491 +0000 UTC m=+1287.424389750" watchObservedRunningTime="2026-03-18 09:23:40.852598651 +0000 UTC m=+1287.427343501" Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.875952 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-769d964c9f-nxhk2" event={"ID":"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b","Type":"ContainerStarted","Data":"f7fb335f9e8b41b5f6d83a14f4cf791d4cd409aa6464ca1fb53047a01a82dcdb"} Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.881489 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897f75bc4-n8b2b" event={"ID":"d5e59fa4-b6e8-4091-aedf-46c624304111","Type":"ContainerStarted","Data":"59137511add332c1310983740cad93329cdff92e5782f78a0a877c931d4f0027"} Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.884747 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b57877776-ssjzt" event={"ID":"9c5e9a8c-649d-4d20-b867-bb0f801d329d","Type":"ContainerStarted","Data":"2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f"} Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.884799 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b57877776-ssjzt" event={"ID":"9c5e9a8c-649d-4d20-b867-bb0f801d329d","Type":"ContainerStarted","Data":"8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7"} Mar 18 09:23:40 crc 
kubenswrapper[4778]: I0318 09:23:40.884852 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.884910 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.888130 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" event={"ID":"3a006670-1a48-4421-8471-dd961c0e1d4c","Type":"ContainerStarted","Data":"046678c0e7673a52b9c1716d4f49b4d9a4c9e6c696c20d2a4fec81f855b0e29a"} Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.892656 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-2zrdr" event={"ID":"1cc8be70-e875-4307-89a5-9cbb0d105b86","Type":"ContainerStarted","Data":"cc995beb30e43a9c4d675261cc6a1346b30a1e32f69b240f14fdcb175b6ff16e"} Mar 18 09:23:40 crc kubenswrapper[4778]: I0318 09:23:40.915274 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7b57877776-ssjzt" podStartSLOduration=3.915248988 podStartE2EDuration="3.915248988s" podCreationTimestamp="2026-03-18 09:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:40.903438266 +0000 UTC m=+1287.478183106" watchObservedRunningTime="2026-03-18 09:23:40.915248988 +0000 UTC m=+1287.489993828" Mar 18 09:23:41 crc kubenswrapper[4778]: I0318 09:23:41.909571 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897f75bc4-n8b2b" event={"ID":"d5e59fa4-b6e8-4091-aedf-46c624304111","Type":"ContainerStarted","Data":"3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8"} Mar 18 09:23:41 crc kubenswrapper[4778]: I0318 09:23:41.909897 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-5897f75bc4-n8b2b" event={"ID":"d5e59fa4-b6e8-4091-aedf-46c624304111","Type":"ContainerStarted","Data":"edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9"} Mar 18 09:23:41 crc kubenswrapper[4778]: I0318 09:23:41.909940 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:41 crc kubenswrapper[4778]: I0318 09:23:41.909959 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5897f75bc4-n8b2b" Mar 18 09:23:41 crc kubenswrapper[4778]: I0318 09:23:41.916060 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-df89bff66-xp7n4" event={"ID":"48e51d3c-7f9f-4196-b708-c28c0e7477fa","Type":"ContainerStarted","Data":"f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad"} Mar 18 09:23:41 crc kubenswrapper[4778]: I0318 09:23:41.916092 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-df89bff66-xp7n4" event={"ID":"48e51d3c-7f9f-4196-b708-c28c0e7477fa","Type":"ContainerStarted","Data":"3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33"} Mar 18 09:23:41 crc kubenswrapper[4778]: I0318 09:23:41.916884 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-df89bff66-xp7n4" Mar 18 09:23:41 crc kubenswrapper[4778]: I0318 09:23:41.916912 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-df89bff66-xp7n4" Mar 18 09:23:41 crc kubenswrapper[4778]: I0318 09:23:41.929514 4778 generic.go:334] "Generic (PLEG): container finished" podID="1cc8be70-e875-4307-89a5-9cbb0d105b86" containerID="5cebfdfe9a748d2abaa31efb6ca9c6b6deab1fa9c65a1678fd03dbf02573b1a8" exitCode=0 Mar 18 09:23:41 crc kubenswrapper[4778]: I0318 09:23:41.930946 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-2zrdr" 
event={"ID":"1cc8be70-e875-4307-89a5-9cbb0d105b86","Type":"ContainerDied","Data":"5cebfdfe9a748d2abaa31efb6ca9c6b6deab1fa9c65a1678fd03dbf02573b1a8"} Mar 18 09:23:41 crc kubenswrapper[4778]: I0318 09:23:41.992502 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5897f75bc4-n8b2b"] Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.158144 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5897f75bc4-n8b2b" podStartSLOduration=4.158121342 podStartE2EDuration="4.158121342s" podCreationTimestamp="2026-03-18 09:23:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:42.036616503 +0000 UTC m=+1288.611361353" watchObservedRunningTime="2026-03-18 09:23:42.158121342 +0000 UTC m=+1288.732866182" Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.180305 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-84d7458cd-cb86l"] Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.182679 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-84d7458cd-cb86l" Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.188684 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.188893 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.239780 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84d7458cd-cb86l"] Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.281062 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-df89bff66-xp7n4" podStartSLOduration=4.28102277 podStartE2EDuration="4.28102277s" podCreationTimestamp="2026-03-18 09:23:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:42.120830947 +0000 UTC m=+1288.695575797" watchObservedRunningTime="2026-03-18 09:23:42.28102277 +0000 UTC m=+1288.855767610" Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.348716 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-internal-tls-certs\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l" Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.348772 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-config-data-custom\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l" Mar 18 09:23:42 crc 
kubenswrapper[4778]: I0318 09:23:42.348831 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-public-tls-certs\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l" Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.348865 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8wc5\" (UniqueName: \"kubernetes.io/projected/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-kube-api-access-v8wc5\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l" Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.348914 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-combined-ca-bundle\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l" Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.348931 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-config-data\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l" Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.348956 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-logs\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " 
pod="openstack/barbican-api-84d7458cd-cb86l" Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.450490 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8wc5\" (UniqueName: \"kubernetes.io/projected/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-kube-api-access-v8wc5\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l" Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.450562 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-combined-ca-bundle\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l" Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.450581 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-config-data\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l" Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.450608 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-logs\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l" Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.450665 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-internal-tls-certs\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l" Mar 18 
09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.450690 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-config-data-custom\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.450736 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-public-tls-certs\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.452907 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-logs\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.457783 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-internal-tls-certs\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.457990 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-config-data-custom\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.459844 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-public-tls-certs\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.460386 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-combined-ca-bundle\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.472431 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-config-data\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.478371 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8wc5\" (UniqueName: \"kubernetes.io/projected/e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55-kube-api-access-v8wc5\") pod \"barbican-api-84d7458cd-cb86l\" (UID: \"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55\") " pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.541490 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.784570 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-99c8bfc86-rldfg" podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.861305 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-644f48df4-b7jhq" podUID="e0a0a638-c445-4931-861e-d35704487c97" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.948551 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-2zrdr" event={"ID":"1cc8be70-e875-4307-89a5-9cbb0d105b86","Type":"ContainerStarted","Data":"9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed"}
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.949678 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-869f779d85-2zrdr"
Mar 18 09:23:42 crc kubenswrapper[4778]: I0318 09:23:42.986595 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-869f779d85-2zrdr" podStartSLOduration=4.986568529 podStartE2EDuration="4.986568529s" podCreationTimestamp="2026-03-18 09:23:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:42.979717332 +0000 UTC m=+1289.554462192" watchObservedRunningTime="2026-03-18 09:23:42.986568529 +0000 UTC m=+1289.561313369"
Mar 18 09:23:43 crc kubenswrapper[4778]: I0318 09:23:43.908747 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84d7458cd-cb86l"]
Mar 18 09:23:43 crc kubenswrapper[4778]: I0318 09:23:43.991476 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-769d964c9f-nxhk2" event={"ID":"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b","Type":"ContainerStarted","Data":"d1cd21f00ad51ee5e120c6ce0764e4fc97e1b7010e1997b14513affbb8ff8d5a"}
Mar 18 09:23:43 crc kubenswrapper[4778]: I0318 09:23:43.996323 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-679c749775-5x4dx" event={"ID":"2290443f-8279-4b62-9d3d-bab0be1d7af5","Type":"ContainerStarted","Data":"e9f7b0df052a4e1bc6dea1dc19739579966633e37f974824883fab15b09a8f39"}
Mar 18 09:23:43 crc kubenswrapper[4778]: I0318 09:23:43.999374 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84d7458cd-cb86l" event={"ID":"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55","Type":"ContainerStarted","Data":"54f6da0c089b4edf9a7ae0a3b8e1da1139b6c01d35705d878c7cf5873639f7dd"}
Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.002155 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" event={"ID":"3a006670-1a48-4421-8471-dd961c0e1d4c","Type":"ContainerStarted","Data":"5ab76e88dc154ab2acf57f179f9ba99e2ed5d45ca1c01ed21a8ae72223921734"}
Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.004298 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5897f75bc4-n8b2b" podUID="d5e59fa4-b6e8-4091-aedf-46c624304111" containerName="barbican-api-log" containerID="cri-o://edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9" gracePeriod=30
Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.004613 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" event={"ID":"3bdaed74-310d-4589-9a14-8c862f05d378","Type":"ContainerStarted","Data":"358bb2fa0945b8176753e109d73c3f83c6eac33a4e8d6a66d2281628c5bd5f11"}
Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.005853 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5897f75bc4-n8b2b" podUID="d5e59fa4-b6e8-4091-aedf-46c624304111" containerName="barbican-api" containerID="cri-o://3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8" gracePeriod=30
Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.698635 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5897f75bc4-n8b2b"
Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.715030 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbwbt\" (UniqueName: \"kubernetes.io/projected/d5e59fa4-b6e8-4091-aedf-46c624304111-kube-api-access-lbwbt\") pod \"d5e59fa4-b6e8-4091-aedf-46c624304111\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") "
Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.715078 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-config-data-custom\") pod \"d5e59fa4-b6e8-4091-aedf-46c624304111\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") "
Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.715124 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5e59fa4-b6e8-4091-aedf-46c624304111-logs\") pod \"d5e59fa4-b6e8-4091-aedf-46c624304111\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") "
Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.715178 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-config-data\") pod \"d5e59fa4-b6e8-4091-aedf-46c624304111\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") "
Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.715235 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-combined-ca-bundle\") pod \"d5e59fa4-b6e8-4091-aedf-46c624304111\" (UID: \"d5e59fa4-b6e8-4091-aedf-46c624304111\") "
Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.717351 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5e59fa4-b6e8-4091-aedf-46c624304111-logs" (OuterVolumeSpecName: "logs") pod "d5e59fa4-b6e8-4091-aedf-46c624304111" (UID: "d5e59fa4-b6e8-4091-aedf-46c624304111"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.740657 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5e59fa4-b6e8-4091-aedf-46c624304111-kube-api-access-lbwbt" (OuterVolumeSpecName: "kube-api-access-lbwbt") pod "d5e59fa4-b6e8-4091-aedf-46c624304111" (UID: "d5e59fa4-b6e8-4091-aedf-46c624304111"). InnerVolumeSpecName "kube-api-access-lbwbt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.740905 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d5e59fa4-b6e8-4091-aedf-46c624304111" (UID: "d5e59fa4-b6e8-4091-aedf-46c624304111"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.771820 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5e59fa4-b6e8-4091-aedf-46c624304111" (UID: "d5e59fa4-b6e8-4091-aedf-46c624304111"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.817879 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbwbt\" (UniqueName: \"kubernetes.io/projected/d5e59fa4-b6e8-4091-aedf-46c624304111-kube-api-access-lbwbt\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.817909 4778 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.817918 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5e59fa4-b6e8-4091-aedf-46c624304111-logs\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.817927 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.859300 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-config-data" (OuterVolumeSpecName: "config-data") pod "d5e59fa4-b6e8-4091-aedf-46c624304111" (UID: "d5e59fa4-b6e8-4091-aedf-46c624304111"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:23:44 crc kubenswrapper[4778]: I0318 09:23:44.918950 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e59fa4-b6e8-4091-aedf-46c624304111-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.021630 4778 generic.go:334] "Generic (PLEG): container finished" podID="d5e59fa4-b6e8-4091-aedf-46c624304111" containerID="3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8" exitCode=0
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.021677 4778 generic.go:334] "Generic (PLEG): container finished" podID="d5e59fa4-b6e8-4091-aedf-46c624304111" containerID="edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9" exitCode=143
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.021732 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897f75bc4-n8b2b" event={"ID":"d5e59fa4-b6e8-4091-aedf-46c624304111","Type":"ContainerDied","Data":"3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8"}
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.021770 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897f75bc4-n8b2b" event={"ID":"d5e59fa4-b6e8-4091-aedf-46c624304111","Type":"ContainerDied","Data":"edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9"}
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.021783 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897f75bc4-n8b2b" event={"ID":"d5e59fa4-b6e8-4091-aedf-46c624304111","Type":"ContainerDied","Data":"59137511add332c1310983740cad93329cdff92e5782f78a0a877c931d4f0027"}
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.021805 4778 scope.go:117] "RemoveContainer" containerID="3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8"
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.021965 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5897f75bc4-n8b2b"
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.042716 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84d7458cd-cb86l" event={"ID":"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55","Type":"ContainerStarted","Data":"624c4dffba896b6c96ce6f6e7925b0bf9777ecb41e219b64897938bfc4b24acc"}
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.042788 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84d7458cd-cb86l" event={"ID":"e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55","Type":"ContainerStarted","Data":"e27bba0454b6d4077f23c7beda3af3ad09cca815f9a6d526ef38e1f8e940b7d5"}
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.044549 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.044581 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84d7458cd-cb86l"
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.062280 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" event={"ID":"3a006670-1a48-4421-8471-dd961c0e1d4c","Type":"ContainerStarted","Data":"07b54bf16d1a9f269c10f038b41bf94b4dec1c7fb718055d08d33e1f4138193a"}
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.072551 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" event={"ID":"3bdaed74-310d-4589-9a14-8c862f05d378","Type":"ContainerStarted","Data":"e7e9f37ed0458c02870caf56e192acdd710a30f50cce72d6e9f4bfe098996a15"}
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.096555 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-769d964c9f-nxhk2" event={"ID":"eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b","Type":"ContainerStarted","Data":"2bc045235578db48c48f872727668577d24d6ca472d0fd364cb530018321cec2"}
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.097607 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-84d7458cd-cb86l" podStartSLOduration=4.09757719 podStartE2EDuration="4.09757719s" podCreationTimestamp="2026-03-18 09:23:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:45.089800299 +0000 UTC m=+1291.664545139" watchObservedRunningTime="2026-03-18 09:23:45.09757719 +0000 UTC m=+1291.672322030"
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.105555 4778 scope.go:117] "RemoveContainer" containerID="edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9"
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.118879 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-679c749775-5x4dx" event={"ID":"2290443f-8279-4b62-9d3d-bab0be1d7af5","Type":"ContainerStarted","Data":"fa720a6916b9236597d16b19d9dd408cdda0fb22a3b213c1f2335a3530e41d8d"}
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.120181 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" podStartSLOduration=4.060022811 podStartE2EDuration="7.120159817s" podCreationTimestamp="2026-03-18 09:23:38 +0000 UTC" firstStartedPulling="2026-03-18 09:23:40.398234104 +0000 UTC m=+1286.972978944" lastFinishedPulling="2026-03-18 09:23:43.45837111 +0000 UTC m=+1290.033115950" observedRunningTime="2026-03-18 09:23:45.117576206 +0000 UTC m=+1291.692321036" watchObservedRunningTime="2026-03-18 09:23:45.120159817 +0000 UTC m=+1291.694904657"
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.155511 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5897f75bc4-n8b2b"]
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.156720 4778 scope.go:117] "RemoveContainer" containerID="3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8"
Mar 18 09:23:45 crc kubenswrapper[4778]: E0318 09:23:45.160605 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8\": container with ID starting with 3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8 not found: ID does not exist" containerID="3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8"
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.160655 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8"} err="failed to get container status \"3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8\": rpc error: code = NotFound desc = could not find container \"3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8\": container with ID starting with 3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8 not found: ID does not exist"
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.160685 4778 scope.go:117] "RemoveContainer" containerID="edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9"
Mar 18 09:23:45 crc kubenswrapper[4778]: E0318 09:23:45.164283 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9\": container with ID starting with edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9 not found: ID does not exist" containerID="edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9"
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.164313 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9"} err="failed to get container status \"edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9\": rpc error: code = NotFound desc = could not find container \"edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9\": container with ID starting with edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9 not found: ID does not exist"
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.164333 4778 scope.go:117] "RemoveContainer" containerID="3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8"
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.164451 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5897f75bc4-n8b2b"]
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.167544 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8"} err="failed to get container status \"3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8\": rpc error: code = NotFound desc = could not find container \"3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8\": container with ID starting with 3a8945dcf895e367af613877277166e29592cd6406ab3ff307e84fddb8b8e2e8 not found: ID does not exist"
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.167573 4778 scope.go:117] "RemoveContainer" containerID="edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9"
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.173326 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9"} err="failed to get container status \"edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9\": rpc error: code = NotFound desc = could not find container \"edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9\": container with ID starting with edf8c5aea9131f9503598f47b513d5ac97995a120f5cad9fdc5428a66ecb1ce9 not found: ID does not exist"
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.183425 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-8bc77f476-tw7vd" podStartSLOduration=4.417797855 podStartE2EDuration="7.183384228s" podCreationTimestamp="2026-03-18 09:23:38 +0000 UTC" firstStartedPulling="2026-03-18 09:23:40.693562279 +0000 UTC m=+1287.268307119" lastFinishedPulling="2026-03-18 09:23:43.459148652 +0000 UTC m=+1290.033893492" observedRunningTime="2026-03-18 09:23:45.173703895 +0000 UTC m=+1291.748448735" watchObservedRunningTime="2026-03-18 09:23:45.183384228 +0000 UTC m=+1291.758129068"
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.212334 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-679c749775-5x4dx" podStartSLOduration=4.117483525 podStartE2EDuration="7.212299415s" podCreationTimestamp="2026-03-18 09:23:38 +0000 UTC" firstStartedPulling="2026-03-18 09:23:40.363653853 +0000 UTC m=+1286.938398693" lastFinishedPulling="2026-03-18 09:23:43.458469733 +0000 UTC m=+1290.033214583" observedRunningTime="2026-03-18 09:23:45.198338905 +0000 UTC m=+1291.773083745" watchObservedRunningTime="2026-03-18 09:23:45.212299415 +0000 UTC m=+1291.787044255"
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.318451 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg"]
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.335219 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-769d964c9f-nxhk2" podStartSLOduration=4.470206522 podStartE2EDuration="7.335167102s" podCreationTimestamp="2026-03-18 09:23:38 +0000 UTC" firstStartedPulling="2026-03-18 09:23:40.557734399 +0000 UTC m=+1287.132479239" lastFinishedPulling="2026-03-18 09:23:43.422694979 +0000 UTC m=+1289.997439819" observedRunningTime="2026-03-18 09:23:45.247788432 +0000 UTC m=+1291.822533282" watchObservedRunningTime="2026-03-18 09:23:45.335167102 +0000 UTC m=+1291.909911952"
Mar 18 09:23:45 crc kubenswrapper[4778]: I0318 09:23:45.367970 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-679c749775-5x4dx"]
Mar 18 09:23:46 crc kubenswrapper[4778]: I0318 09:23:46.135976 4778 generic.go:334] "Generic (PLEG): container finished" podID="fba399d9-71ac-41c3-912f-32ccc7fc6190" containerID="b5325ab7bf5fcc801abe0c67c554c5ab72e440e7503aafe03c45a398c6a12432" exitCode=0
Mar 18 09:23:46 crc kubenswrapper[4778]: I0318 09:23:46.136304 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jb4ss" event={"ID":"fba399d9-71ac-41c3-912f-32ccc7fc6190","Type":"ContainerDied","Data":"b5325ab7bf5fcc801abe0c67c554c5ab72e440e7503aafe03c45a398c6a12432"}
Mar 18 09:23:46 crc kubenswrapper[4778]: I0318 09:23:46.198973 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5e59fa4-b6e8-4091-aedf-46c624304111" path="/var/lib/kubelet/pods/d5e59fa4-b6e8-4091-aedf-46c624304111/volumes"
Mar 18 09:23:47 crc kubenswrapper[4778]: I0318 09:23:47.147348 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" podUID="3bdaed74-310d-4589-9a14-8c862f05d378" containerName="barbican-keystone-listener-log" containerID="cri-o://358bb2fa0945b8176753e109d73c3f83c6eac33a4e8d6a66d2281628c5bd5f11" gracePeriod=30
Mar 18 09:23:47 crc kubenswrapper[4778]: I0318 09:23:47.147462 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" podUID="3bdaed74-310d-4589-9a14-8c862f05d378" containerName="barbican-keystone-listener" containerID="cri-o://e7e9f37ed0458c02870caf56e192acdd710a30f50cce72d6e9f4bfe098996a15" gracePeriod=30
Mar 18 09:23:47 crc kubenswrapper[4778]: I0318 09:23:47.147852 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-679c749775-5x4dx" podUID="2290443f-8279-4b62-9d3d-bab0be1d7af5" containerName="barbican-worker-log" containerID="cri-o://e9f7b0df052a4e1bc6dea1dc19739579966633e37f974824883fab15b09a8f39" gracePeriod=30
Mar 18 09:23:47 crc kubenswrapper[4778]: I0318 09:23:47.147857 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-679c749775-5x4dx" podUID="2290443f-8279-4b62-9d3d-bab0be1d7af5" containerName="barbican-worker" containerID="cri-o://fa720a6916b9236597d16b19d9dd408cdda0fb22a3b213c1f2335a3530e41d8d" gracePeriod=30
Mar 18 09:23:48 crc kubenswrapper[4778]: I0318 09:23:48.166448 4778 generic.go:334] "Generic (PLEG): container finished" podID="2290443f-8279-4b62-9d3d-bab0be1d7af5" containerID="e9f7b0df052a4e1bc6dea1dc19739579966633e37f974824883fab15b09a8f39" exitCode=143
Mar 18 09:23:48 crc kubenswrapper[4778]: I0318 09:23:48.166495 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-679c749775-5x4dx" event={"ID":"2290443f-8279-4b62-9d3d-bab0be1d7af5","Type":"ContainerDied","Data":"e9f7b0df052a4e1bc6dea1dc19739579966633e37f974824883fab15b09a8f39"}
Mar 18 09:23:48 crc kubenswrapper[4778]: I0318 09:23:48.176610 4778 generic.go:334] "Generic (PLEG): container finished" podID="3bdaed74-310d-4589-9a14-8c862f05d378" containerID="e7e9f37ed0458c02870caf56e192acdd710a30f50cce72d6e9f4bfe098996a15" exitCode=0
Mar 18 09:23:48 crc kubenswrapper[4778]: I0318 09:23:48.176664 4778 generic.go:334] "Generic (PLEG): container finished" podID="3bdaed74-310d-4589-9a14-8c862f05d378" containerID="358bb2fa0945b8176753e109d73c3f83c6eac33a4e8d6a66d2281628c5bd5f11" exitCode=143
Mar 18 09:23:48 crc kubenswrapper[4778]: I0318 09:23:48.176676 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" event={"ID":"3bdaed74-310d-4589-9a14-8c862f05d378","Type":"ContainerDied","Data":"e7e9f37ed0458c02870caf56e192acdd710a30f50cce72d6e9f4bfe098996a15"}
Mar 18 09:23:48 crc kubenswrapper[4778]: I0318 09:23:48.176739 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" event={"ID":"3bdaed74-310d-4589-9a14-8c862f05d378","Type":"ContainerDied","Data":"358bb2fa0945b8176753e109d73c3f83c6eac33a4e8d6a66d2281628c5bd5f11"}
Mar 18 09:23:48 crc kubenswrapper[4778]: I0318 09:23:48.713458 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-869f779d85-2zrdr"
Mar 18 09:23:48 crc kubenswrapper[4778]: I0318 09:23:48.793120 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-p2knr"]
Mar 18 09:23:48 crc kubenswrapper[4778]: I0318 09:23:48.793543 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" podUID="ae2c97b2-c699-443a-b3b3-ecb22de258c2" containerName="dnsmasq-dns" containerID="cri-o://14bdd0eaf1dbaf2dd412641a551329bb6a350b27bbb5a0fca3b8cd2009883565" gracePeriod=10
Mar 18 09:23:49 crc kubenswrapper[4778]: I0318 09:23:49.193633 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" podUID="ae2c97b2-c699-443a-b3b3-ecb22de258c2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.149:5353: connect: connection refused"
Mar 18 09:23:49 crc kubenswrapper[4778]: I0318 09:23:49.197435 4778 generic.go:334] "Generic (PLEG): container finished" podID="2290443f-8279-4b62-9d3d-bab0be1d7af5" containerID="fa720a6916b9236597d16b19d9dd408cdda0fb22a3b213c1f2335a3530e41d8d" exitCode=0
Mar 18 09:23:49 crc kubenswrapper[4778]: I0318 09:23:49.197513 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-679c749775-5x4dx" event={"ID":"2290443f-8279-4b62-9d3d-bab0be1d7af5","Type":"ContainerDied","Data":"fa720a6916b9236597d16b19d9dd408cdda0fb22a3b213c1f2335a3530e41d8d"}
Mar 18 09:23:49 crc kubenswrapper[4778]: I0318 09:23:49.213188 4778 generic.go:334] "Generic (PLEG): container finished" podID="ae2c97b2-c699-443a-b3b3-ecb22de258c2" containerID="14bdd0eaf1dbaf2dd412641a551329bb6a350b27bbb5a0fca3b8cd2009883565" exitCode=0
Mar 18 09:23:49 crc kubenswrapper[4778]: I0318 09:23:49.213294 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" event={"ID":"ae2c97b2-c699-443a-b3b3-ecb22de258c2","Type":"ContainerDied","Data":"14bdd0eaf1dbaf2dd412641a551329bb6a350b27bbb5a0fca3b8cd2009883565"}
Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.137327 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jb4ss"
Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.156838 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-combined-ca-bundle\") pod \"fba399d9-71ac-41c3-912f-32ccc7fc6190\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") "
Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.156942 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9z5l\" (UniqueName: \"kubernetes.io/projected/fba399d9-71ac-41c3-912f-32ccc7fc6190-kube-api-access-c9z5l\") pod \"fba399d9-71ac-41c3-912f-32ccc7fc6190\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") "
Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.157050 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-db-sync-config-data\") pod \"fba399d9-71ac-41c3-912f-32ccc7fc6190\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") "
Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.157132 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fba399d9-71ac-41c3-912f-32ccc7fc6190-etc-machine-id\") pod \"fba399d9-71ac-41c3-912f-32ccc7fc6190\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") "
Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.157241 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-config-data\") pod \"fba399d9-71ac-41c3-912f-32ccc7fc6190\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") "
Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.157300 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-scripts\") pod \"fba399d9-71ac-41c3-912f-32ccc7fc6190\" (UID: \"fba399d9-71ac-41c3-912f-32ccc7fc6190\") "
Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.157618 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fba399d9-71ac-41c3-912f-32ccc7fc6190-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fba399d9-71ac-41c3-912f-32ccc7fc6190" (UID: "fba399d9-71ac-41c3-912f-32ccc7fc6190"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.157873 4778 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fba399d9-71ac-41c3-912f-32ccc7fc6190-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.169241 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "fba399d9-71ac-41c3-912f-32ccc7fc6190" (UID: "fba399d9-71ac-41c3-912f-32ccc7fc6190"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.173391 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fba399d9-71ac-41c3-912f-32ccc7fc6190-kube-api-access-c9z5l" (OuterVolumeSpecName: "kube-api-access-c9z5l") pod "fba399d9-71ac-41c3-912f-32ccc7fc6190" (UID: "fba399d9-71ac-41c3-912f-32ccc7fc6190"). InnerVolumeSpecName "kube-api-access-c9z5l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.182985 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-scripts" (OuterVolumeSpecName: "scripts") pod "fba399d9-71ac-41c3-912f-32ccc7fc6190" (UID: "fba399d9-71ac-41c3-912f-32ccc7fc6190"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.216521 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fba399d9-71ac-41c3-912f-32ccc7fc6190" (UID: "fba399d9-71ac-41c3-912f-32ccc7fc6190"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.232874 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jb4ss"
Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.248843 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-config-data" (OuterVolumeSpecName: "config-data") pod "fba399d9-71ac-41c3-912f-32ccc7fc6190" (UID: "fba399d9-71ac-41c3-912f-32ccc7fc6190"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.260108 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9z5l\" (UniqueName: \"kubernetes.io/projected/fba399d9-71ac-41c3-912f-32ccc7fc6190-kube-api-access-c9z5l\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.260131 4778 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.260142 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.260151 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.260160 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fba399d9-71ac-41c3-912f-32ccc7fc6190-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.318068 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jb4ss" event={"ID":"fba399d9-71ac-41c3-912f-32ccc7fc6190","Type":"ContainerDied","Data":"fef33d3c7f1855b88b205e58a953db1eba3e6979fcc1430ebec66414bb961b50"}
Mar 18 09:23:50 crc kubenswrapper[4778]: I0318 09:23:50.318110 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fef33d3c7f1855b88b205e58a953db1eba3e6979fcc1430ebec66414bb961b50"
Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.381350 4778 kubelet.go:2421]
"SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 09:23:51 crc kubenswrapper[4778]: E0318 09:23:51.381750 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e59fa4-b6e8-4091-aedf-46c624304111" containerName="barbican-api-log" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.381762 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e59fa4-b6e8-4091-aedf-46c624304111" containerName="barbican-api-log" Mar 18 09:23:51 crc kubenswrapper[4778]: E0318 09:23:51.381778 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e59fa4-b6e8-4091-aedf-46c624304111" containerName="barbican-api" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.381784 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e59fa4-b6e8-4091-aedf-46c624304111" containerName="barbican-api" Mar 18 09:23:51 crc kubenswrapper[4778]: E0318 09:23:51.381797 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fba399d9-71ac-41c3-912f-32ccc7fc6190" containerName="cinder-db-sync" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.381802 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="fba399d9-71ac-41c3-912f-32ccc7fc6190" containerName="cinder-db-sync" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.381981 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5e59fa4-b6e8-4091-aedf-46c624304111" containerName="barbican-api-log" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.382010 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5e59fa4-b6e8-4091-aedf-46c624304111" containerName="barbican-api" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.382017 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="fba399d9-71ac-41c3-912f-32ccc7fc6190" containerName="cinder-db-sync" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.382944 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.387439 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.387661 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.388383 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.388516 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-r86tv" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.400540 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.461012 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-df89bff66-xp7n4" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.477853 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-9xln9"] Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.479330 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.491606 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.491653 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.491711 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-scripts\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.491741 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-config-data\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.491764 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25c30dcf-f49d-430b-a240-aefe036afeeb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 
18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.491806 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mnd9\" (UniqueName: \"kubernetes.io/projected/25c30dcf-f49d-430b-a240-aefe036afeeb-kube-api-access-5mnd9\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.498955 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-9xln9"] Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.541053 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-df89bff66-xp7n4" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.595063 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-dns-svc\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.595112 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-config\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.595468 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.595492 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.595517 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-ovsdbserver-sb\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.595559 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-ovsdbserver-nb\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.595582 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-scripts\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.595602 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hck6z\" (UniqueName: \"kubernetes.io/projected/06a6d934-2f47-4628-a328-6ba9cefb8090-kube-api-access-hck6z\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.595627 4778 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-config-data\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.595646 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25c30dcf-f49d-430b-a240-aefe036afeeb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.595676 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mnd9\" (UniqueName: \"kubernetes.io/projected/25c30dcf-f49d-430b-a240-aefe036afeeb-kube-api-access-5mnd9\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.603346 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25c30dcf-f49d-430b-a240-aefe036afeeb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.607042 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.607618 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-combined-ca-bundle\") pod \"cinder-scheduler-0\" 
(UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.613221 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-config-data\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.616857 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mnd9\" (UniqueName: \"kubernetes.io/projected/25c30dcf-f49d-430b-a240-aefe036afeeb-kube-api-access-5mnd9\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.634732 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-scripts\") pod \"cinder-scheduler-0\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.696848 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-ovsdbserver-sb\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.696910 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-ovsdbserver-nb\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.696966 
4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hck6z\" (UniqueName: \"kubernetes.io/projected/06a6d934-2f47-4628-a328-6ba9cefb8090-kube-api-access-hck6z\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.697046 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-dns-svc\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.697072 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-config\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.698967 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-ovsdbserver-sb\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.699557 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-ovsdbserver-nb\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.700349 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-dns-svc\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.703559 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-config\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.711666 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.715754 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.717750 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.730439 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hck6z\" (UniqueName: \"kubernetes.io/projected/06a6d934-2f47-4628-a328-6ba9cefb8090-kube-api-access-hck6z\") pod \"dnsmasq-dns-58db5546cc-9xln9\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.736109 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.761778 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.800314 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.800367 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-config-data-custom\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.800387 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-scripts\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.800404 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ef4b958-769d-43c2-91d4-3a6cb76d3851-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.800435 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vxxv\" (UniqueName: \"kubernetes.io/projected/7ef4b958-769d-43c2-91d4-3a6cb76d3851-kube-api-access-7vxxv\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.800451 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ef4b958-769d-43c2-91d4-3a6cb76d3851-logs\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.800529 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-config-data\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.833807 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.902190 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-scripts\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.902241 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ef4b958-769d-43c2-91d4-3a6cb76d3851-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.902281 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vxxv\" (UniqueName: \"kubernetes.io/projected/7ef4b958-769d-43c2-91d4-3a6cb76d3851-kube-api-access-7vxxv\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.902300 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ef4b958-769d-43c2-91d4-3a6cb76d3851-logs\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.902383 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-config-data\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.902438 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.902462 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-config-data-custom\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.903516 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ef4b958-769d-43c2-91d4-3a6cb76d3851-logs\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.903577 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ef4b958-769d-43c2-91d4-3a6cb76d3851-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.907081 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-config-data-custom\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.907108 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-config-data\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.909776 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-scripts\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.914140 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:51 crc kubenswrapper[4778]: I0318 09:23:51.924770 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vxxv\" (UniqueName: \"kubernetes.io/projected/7ef4b958-769d-43c2-91d4-3a6cb76d3851-kube-api-access-7vxxv\") pod \"cinder-api-0\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " pod="openstack/cinder-api-0" Mar 18 09:23:52 crc kubenswrapper[4778]: I0318 09:23:52.115190 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 09:23:53 crc kubenswrapper[4778]: I0318 09:23:53.647388 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.257444 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.264688 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84d7458cd-cb86l" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.601254 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56b9647d87-2qhmh"] Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.601785 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56b9647d87-2qhmh" podUID="2b12c496-12d8-47e5-8cb7-134c3860368d" containerName="neutron-api" containerID="cri-o://1b28dfe7d2202acd51cd4556f1c17c3749cac234f100a94b65f9772c04f97b62" gracePeriod=30 Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.602036 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56b9647d87-2qhmh" podUID="2b12c496-12d8-47e5-8cb7-134c3860368d" containerName="neutron-httpd" containerID="cri-o://97b1e09d6053130fb7ed8cd3a7a1b57d88faf0c15aceaa656cf6b72bf7bed517" gracePeriod=30 Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.617835 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-56b9647d87-2qhmh" podUID="2b12c496-12d8-47e5-8cb7-134c3860368d" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.151:9696/\": read tcp 10.217.0.2:37418->10.217.0.151:9696: read: connection reset by peer" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.646022 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6d979499f7-4flxt"] Mar 18 09:23:54 crc 
kubenswrapper[4778]: I0318 09:23:54.647884 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.698095 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d979499f7-4flxt"] Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.745374 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84d7458cd-cb86l" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.763334 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-config\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.763421 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-public-tls-certs\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.763691 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nntv4\" (UniqueName: \"kubernetes.io/projected/da263057-3652-4ae8-8435-4f80e4b13804-kube-api-access-nntv4\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.764835 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-httpd-config\") pod \"neutron-6d979499f7-4flxt\" 
(UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.765942 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-combined-ca-bundle\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.766093 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-ovndb-tls-certs\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.766185 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-internal-tls-certs\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.869146 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-config\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.869219 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-public-tls-certs\") pod \"neutron-6d979499f7-4flxt\" (UID: 
\"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.869304 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nntv4\" (UniqueName: \"kubernetes.io/projected/da263057-3652-4ae8-8435-4f80e4b13804-kube-api-access-nntv4\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.869326 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-httpd-config\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.869396 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-combined-ca-bundle\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.869416 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-ovndb-tls-certs\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.869437 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-internal-tls-certs\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " 
pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.889276 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-config\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.913163 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-httpd-config\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.914324 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-public-tls-certs\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.914819 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-internal-tls-certs\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.915049 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-ovndb-tls-certs\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.915587 4778 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/barbican-api-df89bff66-xp7n4"] Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.915770 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-df89bff66-xp7n4" podUID="48e51d3c-7f9f-4196-b708-c28c0e7477fa" containerName="barbican-api-log" containerID="cri-o://3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33" gracePeriod=30 Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.916158 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-df89bff66-xp7n4" podUID="48e51d3c-7f9f-4196-b708-c28c0e7477fa" containerName="barbican-api" containerID="cri-o://f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad" gracePeriod=30 Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.932127 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da263057-3652-4ae8-8435-4f80e4b13804-combined-ca-bundle\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:54 crc kubenswrapper[4778]: I0318 09:23:54.966784 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nntv4\" (UniqueName: \"kubernetes.io/projected/da263057-3652-4ae8-8435-4f80e4b13804-kube-api-access-nntv4\") pod \"neutron-6d979499f7-4flxt\" (UID: \"da263057-3652-4ae8-8435-4f80e4b13804\") " pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.329141 4778 generic.go:334] "Generic (PLEG): container finished" podID="48e51d3c-7f9f-4196-b708-c28c0e7477fa" containerID="3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33" exitCode=143 Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.329756 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-df89bff66-xp7n4" 
event={"ID":"48e51d3c-7f9f-4196-b708-c28c0e7477fa","Type":"ContainerDied","Data":"3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33"} Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.355230 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" event={"ID":"ae2c97b2-c699-443a-b3b3-ecb22de258c2","Type":"ContainerDied","Data":"548920e3510d285dcfc8978dccbf1745880c114136f2e4109ae6de9a628d0437"} Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.355274 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="548920e3510d285dcfc8978dccbf1745880c114136f2e4109ae6de9a628d0437" Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.356854 4778 generic.go:334] "Generic (PLEG): container finished" podID="56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" containerID="6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a" exitCode=137 Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.356874 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7967bcbb45-bl6c8" event={"ID":"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2","Type":"ContainerDied","Data":"6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a"} Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.662459 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6d979499f7-4flxt" Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.666957 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.680493 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.713160 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.805752 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-dns-svc\") pod \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.806329 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-ovsdbserver-sb\") pod \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.806419 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-config\") pod \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.806610 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mdv2\" (UniqueName: \"kubernetes.io/projected/ae2c97b2-c699-443a-b3b3-ecb22de258c2-kube-api-access-8mdv2\") pod \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.806643 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-ovsdbserver-nb\") pod \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\" (UID: \"ae2c97b2-c699-443a-b3b3-ecb22de258c2\") " Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.829472 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ae2c97b2-c699-443a-b3b3-ecb22de258c2-kube-api-access-8mdv2" (OuterVolumeSpecName: "kube-api-access-8mdv2") pod "ae2c97b2-c699-443a-b3b3-ecb22de258c2" (UID: "ae2c97b2-c699-443a-b3b3-ecb22de258c2"). InnerVolumeSpecName "kube-api-access-8mdv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.908340 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bdaed74-310d-4589-9a14-8c862f05d378-logs\") pod \"3bdaed74-310d-4589-9a14-8c862f05d378\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.908481 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-config-data\") pod \"3bdaed74-310d-4589-9a14-8c862f05d378\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.908855 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bdaed74-310d-4589-9a14-8c862f05d378-logs" (OuterVolumeSpecName: "logs") pod "3bdaed74-310d-4589-9a14-8c862f05d378" (UID: "3bdaed74-310d-4589-9a14-8c862f05d378"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.908973 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-combined-ca-bundle\") pod \"3bdaed74-310d-4589-9a14-8c862f05d378\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.909437 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-config-data-custom\") pod \"3bdaed74-310d-4589-9a14-8c862f05d378\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.910212 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6bt4\" (UniqueName: \"kubernetes.io/projected/3bdaed74-310d-4589-9a14-8c862f05d378-kube-api-access-h6bt4\") pod \"3bdaed74-310d-4589-9a14-8c862f05d378\" (UID: \"3bdaed74-310d-4589-9a14-8c862f05d378\") " Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.912171 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mdv2\" (UniqueName: \"kubernetes.io/projected/ae2c97b2-c699-443a-b3b3-ecb22de258c2-kube-api-access-8mdv2\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.912455 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bdaed74-310d-4589-9a14-8c862f05d378-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.933519 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3bdaed74-310d-4589-9a14-8c862f05d378" (UID: 
"3bdaed74-310d-4589-9a14-8c862f05d378"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:55 crc kubenswrapper[4778]: I0318 09:23:55.989957 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bdaed74-310d-4589-9a14-8c862f05d378-kube-api-access-h6bt4" (OuterVolumeSpecName: "kube-api-access-h6bt4") pod "3bdaed74-310d-4589-9a14-8c862f05d378" (UID: "3bdaed74-310d-4589-9a14-8c862f05d378"). InnerVolumeSpecName "kube-api-access-h6bt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.016547 4778 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.016870 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6bt4\" (UniqueName: \"kubernetes.io/projected/3bdaed74-310d-4589-9a14-8c862f05d378-kube-api-access-h6bt4\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.045180 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ae2c97b2-c699-443a-b3b3-ecb22de258c2" (UID: "ae2c97b2-c699-443a-b3b3-ecb22de258c2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.048740 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ae2c97b2-c699-443a-b3b3-ecb22de258c2" (UID: "ae2c97b2-c699-443a-b3b3-ecb22de258c2"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.050781 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ae2c97b2-c699-443a-b3b3-ecb22de258c2" (UID: "ae2c97b2-c699-443a-b3b3-ecb22de258c2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.053162 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-9xln9"] Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.061365 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-config" (OuterVolumeSpecName: "config") pod "ae2c97b2-c699-443a-b3b3-ecb22de258c2" (UID: "ae2c97b2-c699-443a-b3b3-ecb22de258c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.071327 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-config-data" (OuterVolumeSpecName: "config-data") pod "3bdaed74-310d-4589-9a14-8c862f05d378" (UID: "3bdaed74-310d-4589-9a14-8c862f05d378"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.103776 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3bdaed74-310d-4589-9a14-8c862f05d378" (UID: "3bdaed74-310d-4589-9a14-8c862f05d378"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.111331 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.121736 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-config-data\") pod \"2290443f-8279-4b62-9d3d-bab0be1d7af5\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.121832 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksngs\" (UniqueName: \"kubernetes.io/projected/2290443f-8279-4b62-9d3d-bab0be1d7af5-kube-api-access-ksngs\") pod \"2290443f-8279-4b62-9d3d-bab0be1d7af5\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.121878 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-combined-ca-bundle\") pod \"2290443f-8279-4b62-9d3d-bab0be1d7af5\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.121901 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2290443f-8279-4b62-9d3d-bab0be1d7af5-logs\") pod \"2290443f-8279-4b62-9d3d-bab0be1d7af5\" (UID: \"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.121936 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-config-data-custom\") pod \"2290443f-8279-4b62-9d3d-bab0be1d7af5\" (UID: 
\"2290443f-8279-4b62-9d3d-bab0be1d7af5\") " Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.122183 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.122212 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.122223 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.122232 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bdaed74-310d-4589-9a14-8c862f05d378-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.122241 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.122251 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae2c97b2-c699-443a-b3b3-ecb22de258c2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.125477 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2290443f-8279-4b62-9d3d-bab0be1d7af5-logs" (OuterVolumeSpecName: "logs") pod "2290443f-8279-4b62-9d3d-bab0be1d7af5" (UID: "2290443f-8279-4b62-9d3d-bab0be1d7af5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.149080 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2290443f-8279-4b62-9d3d-bab0be1d7af5" (UID: "2290443f-8279-4b62-9d3d-bab0be1d7af5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.164815 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2290443f-8279-4b62-9d3d-bab0be1d7af5-kube-api-access-ksngs" (OuterVolumeSpecName: "kube-api-access-ksngs") pod "2290443f-8279-4b62-9d3d-bab0be1d7af5" (UID: "2290443f-8279-4b62-9d3d-bab0be1d7af5"). InnerVolumeSpecName "kube-api-access-ksngs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.179613 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.231599 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksngs\" (UniqueName: \"kubernetes.io/projected/2290443f-8279-4b62-9d3d-bab0be1d7af5-kube-api-access-ksngs\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.231636 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2290443f-8279-4b62-9d3d-bab0be1d7af5-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.231646 4778 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.233481 4778 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.253150 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-644f48df4-b7jhq" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.318270 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2290443f-8279-4b62-9d3d-bab0be1d7af5" (UID: "2290443f-8279-4b62-9d3d-bab0be1d7af5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.338340 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.349523 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.376456 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-config-data" (OuterVolumeSpecName: "config-data") pod "2290443f-8279-4b62-9d3d-bab0be1d7af5" (UID: "2290443f-8279-4b62-9d3d-bab0be1d7af5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.407714 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.407961 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg" event={"ID":"3bdaed74-310d-4589-9a14-8c862f05d378","Type":"ContainerDied","Data":"936e07a4b38cfd1b5b28cfe412a7dc4ba1163eceb3ae2617b1ff1b9e372215e3"} Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.408102 4778 scope.go:117] "RemoveContainer" containerID="e7e9f37ed0458c02870caf56e192acdd710a30f50cce72d6e9f4bfe098996a15" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.409873 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"25c30dcf-f49d-430b-a240-aefe036afeeb","Type":"ContainerStarted","Data":"3dd48f50ba7628f9f7d698453fd23e9975283871dd516f7a6292391921e1577b"} Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.423810 4778 generic.go:334] "Generic (PLEG): container finished" podID="56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" containerID="42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec" exitCode=137 Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.423868 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7967bcbb45-bl6c8" event={"ID":"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2","Type":"ContainerDied","Data":"42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec"} Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.423889 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7967bcbb45-bl6c8" event={"ID":"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2","Type":"ContainerDied","Data":"0cbfa7c707107fef7fba1585919c6375feaacdc7b31da820c5b759e556860c71"} Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.424266 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7967bcbb45-bl6c8" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.427020 4778 generic.go:334] "Generic (PLEG): container finished" podID="2b12c496-12d8-47e5-8cb7-134c3860368d" containerID="97b1e09d6053130fb7ed8cd3a7a1b57d88faf0c15aceaa656cf6b72bf7bed517" exitCode=0 Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.427348 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56b9647d87-2qhmh" event={"ID":"2b12c496-12d8-47e5-8cb7-134c3860368d","Type":"ContainerDied","Data":"97b1e09d6053130fb7ed8cd3a7a1b57d88faf0c15aceaa656cf6b72bf7bed517"} Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.438743 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-679c749775-5x4dx" event={"ID":"2290443f-8279-4b62-9d3d-bab0be1d7af5","Type":"ContainerDied","Data":"44a3e56960042fea2cef1afef1f593fd3c5991c7f2d6bb676b183bc24c3832e2"} Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.439087 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-679c749775-5x4dx" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.455729 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d609db91-e011-4ac4-91a6-a9ba51f3918e","Type":"ContainerStarted","Data":"e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9"} Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.455914 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="ceilometer-central-agent" containerID="cri-o://7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372" gracePeriod=30 Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.456185 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.456446 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="proxy-httpd" containerID="cri-o://e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9" gracePeriod=30 Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.456490 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="sg-core" containerID="cri-o://19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4" gracePeriod=30 Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.456525 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="ceilometer-notification-agent" containerID="cri-o://d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053" gracePeriod=30 Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.466688 
4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2290443f-8279-4b62-9d3d-bab0be1d7af5-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.468024 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7ef4b958-769d-43c2-91d4-3a6cb76d3851","Type":"ContainerStarted","Data":"158c8e2542c97971367812456984ca4e3f98182f67d2f9c5b6c2354ec14b4a85"}
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.470220 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr"
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.470943 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-9xln9" event={"ID":"06a6d934-2f47-4628-a328-6ba9cefb8090","Type":"ContainerStarted","Data":"ff4d7228da7d0afd7376861dde18209e6f8e60848f76f48755825c7cbc2d2227"}
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.477487 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg"]
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.479774 4778 scope.go:117] "RemoveContainer" containerID="358bb2fa0945b8176753e109d73c3f83c6eac33a4e8d6a66d2281628c5bd5f11"
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.504273 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-7c8bdb6c9d-78rbg"]
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.528936 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.274576901 podStartE2EDuration="1m3.528907091s" podCreationTimestamp="2026-03-18 09:22:53 +0000 UTC" firstStartedPulling="2026-03-18 09:22:55.961141974 +0000 UTC m=+1242.535886814" lastFinishedPulling="2026-03-18 09:23:55.215472164 +0000 UTC m=+1301.790217004" observedRunningTime="2026-03-18 09:23:56.481066688 +0000 UTC m=+1303.055811578" watchObservedRunningTime="2026-03-18 09:23:56.528907091 +0000 UTC m=+1303.103651931"
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.538143 4778 scope.go:117] "RemoveContainer" containerID="42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec"
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.575901 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-679c749775-5x4dx"]
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.578383 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-config-data\") pod \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") "
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.578603 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-logs\") pod \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") "
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.578805 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-scripts\") pod \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") "
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.578873 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-horizon-secret-key\") pod \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") "
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.578905 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8l9tz\" (UniqueName: \"kubernetes.io/projected/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-kube-api-access-8l9tz\") pod \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") "
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.579081 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-logs" (OuterVolumeSpecName: "logs") pod "56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" (UID: "56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.579431 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-logs\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.585409 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" (UID: "56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.612453 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-kube-api-access-8l9tz" (OuterVolumeSpecName: "kube-api-access-8l9tz") pod "56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" (UID: "56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2"). InnerVolumeSpecName "kube-api-access-8l9tz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.657138 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-679c749775-5x4dx"]
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.680894 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-config-data" (OuterVolumeSpecName: "config-data") pod "56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" (UID: "56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.681931 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-config-data\") pod \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\" (UID: \"56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2\") "
Mar 18 09:23:56 crc kubenswrapper[4778]: W0318 09:23:56.683481 4778 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2/volumes/kubernetes.io~configmap/config-data
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.683518 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-config-data" (OuterVolumeSpecName: "config-data") pod "56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" (UID: "56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.686218 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-scripts" (OuterVolumeSpecName: "scripts") pod "56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" (UID: "56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.689303 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.689339 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.689353 4778 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.689369 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l9tz\" (UniqueName: \"kubernetes.io/projected/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2-kube-api-access-8l9tz\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.694249 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-p2knr"]
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.723550 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-p2knr"]
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.727917 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d979499f7-4flxt"]
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.785933 4778 scope.go:117] "RemoveContainer" containerID="6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a"
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.894348 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-56b9647d87-2qhmh" podUID="2b12c496-12d8-47e5-8cb7-134c3860368d" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.151:9696/\": dial tcp 10.217.0.151:9696: connect: connection refused"
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.967108 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7967bcbb45-bl6c8"]
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.978373 4778 scope.go:117] "RemoveContainer" containerID="42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec"
Mar 18 09:23:56 crc kubenswrapper[4778]: E0318 09:23:56.980923 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec\": container with ID starting with 42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec not found: ID does not exist" containerID="42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec"
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.980964 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec"} err="failed to get container status \"42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec\": rpc error: code = NotFound desc = could not find container \"42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec\": container with ID starting with 42133d09cf6c414e848760029a4c8dfa2c8938ed8795dcc6e60aafeb5e5434ec not found: ID does not exist"
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.980989 4778 scope.go:117] "RemoveContainer" containerID="6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a"
Mar 18 09:23:56 crc kubenswrapper[4778]: E0318 09:23:56.981470 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a\": container with ID starting with 6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a not found: ID does not exist" containerID="6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a"
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.981501 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a"} err="failed to get container status \"6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a\": rpc error: code = NotFound desc = could not find container \"6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a\": container with ID starting with 6cc9eca806b1a23bc1d1b294f18078aee42407c96258fd0c073d024bd427b05a not found: ID does not exist"
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.981518 4778 scope.go:117] "RemoveContainer" containerID="fa720a6916b9236597d16b19d9dd408cdda0fb22a3b213c1f2335a3530e41d8d"
Mar 18 09:23:56 crc kubenswrapper[4778]: I0318 09:23:56.992521 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7967bcbb45-bl6c8"]
Mar 18 09:23:57 crc kubenswrapper[4778]: I0318 09:23:57.191486 4778 scope.go:117] "RemoveContainer" containerID="e9f7b0df052a4e1bc6dea1dc19739579966633e37f974824883fab15b09a8f39"
Mar 18 09:23:57 crc kubenswrapper[4778]: I0318 09:23:57.553551 4778 generic.go:334] "Generic (PLEG): container finished" podID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerID="19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4" exitCode=2
Mar 18 09:23:57 crc kubenswrapper[4778]: I0318 09:23:57.553777 4778 generic.go:334] "Generic (PLEG): container finished" podID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerID="7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372" exitCode=0
Mar 18 09:23:57 crc kubenswrapper[4778]: I0318 09:23:57.553818 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d609db91-e011-4ac4-91a6-a9ba51f3918e","Type":"ContainerDied","Data":"19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4"}
Mar 18 09:23:57 crc kubenswrapper[4778]: I0318 09:23:57.553845 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d609db91-e011-4ac4-91a6-a9ba51f3918e","Type":"ContainerDied","Data":"7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372"}
Mar 18 09:23:57 crc kubenswrapper[4778]: I0318 09:23:57.561114 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7ef4b958-769d-43c2-91d4-3a6cb76d3851","Type":"ContainerStarted","Data":"825b48041531c5b0c01f329609e8aff59582830809964a4de416fba30a539f48"}
Mar 18 09:23:57 crc kubenswrapper[4778]: I0318 09:23:57.563404 4778 generic.go:334] "Generic (PLEG): container finished" podID="06a6d934-2f47-4628-a328-6ba9cefb8090" containerID="9fbf810ce724087a2a375ddcba5620405d186ece5213d114d3c1ef2c5e5fc4e6" exitCode=0
Mar 18 09:23:57 crc kubenswrapper[4778]: I0318 09:23:57.563467 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-9xln9" event={"ID":"06a6d934-2f47-4628-a328-6ba9cefb8090","Type":"ContainerDied","Data":"9fbf810ce724087a2a375ddcba5620405d186ece5213d114d3c1ef2c5e5fc4e6"}
Mar 18 09:23:57 crc kubenswrapper[4778]: I0318 09:23:57.580022 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d979499f7-4flxt" event={"ID":"da263057-3652-4ae8-8435-4f80e4b13804","Type":"ContainerStarted","Data":"3d0673c880513eabd90c660c0aca8f21e04bcaa48ea7b76a58c69bafee70f7d5"}
Mar 18 09:23:57 crc kubenswrapper[4778]: I0318 09:23:57.580256 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d979499f7-4flxt" event={"ID":"da263057-3652-4ae8-8435-4f80e4b13804","Type":"ContainerStarted","Data":"be750524dac8eb7fa21fef767ce16f355c71ddf39141f72d801e5166941cb488"}
Mar 18 09:23:57 crc kubenswrapper[4778]: I0318 09:23:57.580389 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6d979499f7-4flxt"
Mar 18 09:23:57 crc kubenswrapper[4778]: I0318 09:23:57.629349 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6d979499f7-4flxt" podStartSLOduration=3.629325946 podStartE2EDuration="3.629325946s" podCreationTimestamp="2026-03-18 09:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:57.614679297 +0000 UTC m=+1304.189424137" watchObservedRunningTime="2026-03-18 09:23:57.629325946 +0000 UTC m=+1304.204070786"
Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.231351 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2290443f-8279-4b62-9d3d-bab0be1d7af5" path="/var/lib/kubelet/pods/2290443f-8279-4b62-9d3d-bab0be1d7af5/volumes"
Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.241502 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bdaed74-310d-4589-9a14-8c862f05d378" path="/var/lib/kubelet/pods/3bdaed74-310d-4589-9a14-8c862f05d378/volumes"
Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.242412 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" path="/var/lib/kubelet/pods/56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2/volumes"
Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.242978 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae2c97b2-c699-443a-b3b3-ecb22de258c2" path="/var/lib/kubelet/pods/ae2c97b2-c699-443a-b3b3-ecb22de258c2/volumes"
Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.533680 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-99c8bfc86-rldfg"
Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.635414 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7ef4b958-769d-43c2-91d4-3a6cb76d3851","Type":"ContainerStarted","Data":"c901fa502e83a3c43053baeb3fceb6b85a02d87dc36f812db5898d2cee7e9935"}
Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.635746 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7ef4b958-769d-43c2-91d4-3a6cb76d3851" containerName="cinder-api-log" containerID="cri-o://825b48041531c5b0c01f329609e8aff59582830809964a4de416fba30a539f48" gracePeriod=30
Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.638563 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7ef4b958-769d-43c2-91d4-3a6cb76d3851" containerName="cinder-api" containerID="cri-o://c901fa502e83a3c43053baeb3fceb6b85a02d87dc36f812db5898d2cee7e9935" gracePeriod=30
Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.638778 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.644405 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-9xln9" event={"ID":"06a6d934-2f47-4628-a328-6ba9cefb8090","Type":"ContainerStarted","Data":"623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42"}
Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.645632 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58db5546cc-9xln9"
Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.649881 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d979499f7-4flxt" event={"ID":"da263057-3652-4ae8-8435-4f80e4b13804","Type":"ContainerStarted","Data":"4303b578c6acd544585106fa3ebfc66574b0ed2f544b92af5efb19b9d55f68b4"}
Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.654484 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"25c30dcf-f49d-430b-a240-aefe036afeeb","Type":"ContainerStarted","Data":"33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785"}
Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.683407 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.683371457 podStartE2EDuration="7.683371457s" podCreationTimestamp="2026-03-18 09:23:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:58.681653761 +0000 UTC m=+1305.256398621" watchObservedRunningTime="2026-03-18 09:23:58.683371457 +0000 UTC m=+1305.258116297"
Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.725953 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58db5546cc-9xln9" podStartSLOduration=7.725930876 podStartE2EDuration="7.725930876s" podCreationTimestamp="2026-03-18 09:23:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:23:58.711634476 +0000 UTC m=+1305.286379316" watchObservedRunningTime="2026-03-18 09:23:58.725930876 +0000 UTC m=+1305.300675716"
Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.865758 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-644f48df4-b7jhq"
Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.931210 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-99c8bfc86-rldfg"]
Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.931543 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-99c8bfc86-rldfg" podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerName="horizon-log" containerID="cri-o://d66d99949ad5120620bffa75ea4a3e49aee89967a841972aa06a4a9867cff673" gracePeriod=30
Mar 18 09:23:58 crc kubenswrapper[4778]: I0318 09:23:58.931720 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-99c8bfc86-rldfg" podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerName="horizon" containerID="cri-o://c97b0574c01538fa1c2084f0fe2463a0e870b56db44f51730acba68dbd0ebf25" gracePeriod=30
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.194283 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f66db59b9-p2knr" podUID="ae2c97b2-c699-443a-b3b3-ecb22de258c2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.149:5353: i/o timeout"
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.361884 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-df89bff66-xp7n4"
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.463359 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm7dv\" (UniqueName: \"kubernetes.io/projected/48e51d3c-7f9f-4196-b708-c28c0e7477fa-kube-api-access-qm7dv\") pod \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") "
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.464799 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-config-data-custom\") pod \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") "
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.464850 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-combined-ca-bundle\") pod \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") "
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.464906 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-config-data\") pod \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") "
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.464971 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48e51d3c-7f9f-4196-b708-c28c0e7477fa-logs\") pod \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\" (UID: \"48e51d3c-7f9f-4196-b708-c28c0e7477fa\") "
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.465769 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48e51d3c-7f9f-4196-b708-c28c0e7477fa-logs" (OuterVolumeSpecName: "logs") pod "48e51d3c-7f9f-4196-b708-c28c0e7477fa" (UID: "48e51d3c-7f9f-4196-b708-c28c0e7477fa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.475298 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "48e51d3c-7f9f-4196-b708-c28c0e7477fa" (UID: "48e51d3c-7f9f-4196-b708-c28c0e7477fa"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.475508 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48e51d3c-7f9f-4196-b708-c28c0e7477fa-kube-api-access-qm7dv" (OuterVolumeSpecName: "kube-api-access-qm7dv") pod "48e51d3c-7f9f-4196-b708-c28c0e7477fa" (UID: "48e51d3c-7f9f-4196-b708-c28c0e7477fa"). InnerVolumeSpecName "kube-api-access-qm7dv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.515506 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48e51d3c-7f9f-4196-b708-c28c0e7477fa" (UID: "48e51d3c-7f9f-4196-b708-c28c0e7477fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.544425 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-config-data" (OuterVolumeSpecName: "config-data") pod "48e51d3c-7f9f-4196-b708-c28c0e7477fa" (UID: "48e51d3c-7f9f-4196-b708-c28c0e7477fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.567145 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm7dv\" (UniqueName: \"kubernetes.io/projected/48e51d3c-7f9f-4196-b708-c28c0e7477fa-kube-api-access-qm7dv\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.567182 4778 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.567225 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.567239 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48e51d3c-7f9f-4196-b708-c28c0e7477fa-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.567250 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48e51d3c-7f9f-4196-b708-c28c0e7477fa-logs\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.670912 4778 generic.go:334] "Generic (PLEG): container finished" podID="7ef4b958-769d-43c2-91d4-3a6cb76d3851" containerID="825b48041531c5b0c01f329609e8aff59582830809964a4de416fba30a539f48" exitCode=143
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.670997 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7ef4b958-769d-43c2-91d4-3a6cb76d3851","Type":"ContainerDied","Data":"825b48041531c5b0c01f329609e8aff59582830809964a4de416fba30a539f48"}
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.673959 4778 generic.go:334] "Generic (PLEG): container finished" podID="48e51d3c-7f9f-4196-b708-c28c0e7477fa" containerID="f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad" exitCode=0
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.674021 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-df89bff66-xp7n4" event={"ID":"48e51d3c-7f9f-4196-b708-c28c0e7477fa","Type":"ContainerDied","Data":"f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad"}
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.674049 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-df89bff66-xp7n4" event={"ID":"48e51d3c-7f9f-4196-b708-c28c0e7477fa","Type":"ContainerDied","Data":"b165bcb9da7478d0edb4f2c1e6142d197b4e354bbdd69e9690efc132cc26e106"}
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.674068 4778 scope.go:117] "RemoveContainer" containerID="f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad"
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.674242 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-df89bff66-xp7n4"
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.679403 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"25c30dcf-f49d-430b-a240-aefe036afeeb","Type":"ContainerStarted","Data":"7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78"}
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.692699 4778 generic.go:334] "Generic (PLEG): container finished" podID="2b12c496-12d8-47e5-8cb7-134c3860368d" containerID="1b28dfe7d2202acd51cd4556f1c17c3749cac234f100a94b65f9772c04f97b62" exitCode=0
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.693576 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56b9647d87-2qhmh" event={"ID":"2b12c496-12d8-47e5-8cb7-134c3860368d","Type":"ContainerDied","Data":"1b28dfe7d2202acd51cd4556f1c17c3749cac234f100a94b65f9772c04f97b62"}
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.717769 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=7.71291037 podStartE2EDuration="8.717752322s" podCreationTimestamp="2026-03-18 09:23:51 +0000 UTC" firstStartedPulling="2026-03-18 09:23:56.252563733 +0000 UTC m=+1302.827308573" lastFinishedPulling="2026-03-18 09:23:57.257405685 +0000 UTC m=+1303.832150525" observedRunningTime="2026-03-18 09:23:59.707092463 +0000 UTC m=+1306.281837323" watchObservedRunningTime="2026-03-18 09:23:59.717752322 +0000 UTC m=+1306.292497162"
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.735973 4778 scope.go:117] "RemoveContainer" containerID="3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33"
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.741694 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-df89bff66-xp7n4"]
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.762336 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-df89bff66-xp7n4"]
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.824273 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56b9647d87-2qhmh"
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.833729 4778 scope.go:117] "RemoveContainer" containerID="f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad"
Mar 18 09:23:59 crc kubenswrapper[4778]: E0318 09:23:59.834262 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad\": container with ID starting with f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad not found: ID does not exist" containerID="f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad"
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.834309 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad"} err="failed to get container status \"f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad\": rpc error: code = NotFound desc = could not find container \"f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad\": container with ID starting with f30b63fd1fdc15aa3c6037c7434036ae380963be528cd71b9127115381dc61ad not found: ID does not exist"
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.834335 4778 scope.go:117] "RemoveContainer" containerID="3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33"
Mar 18 09:23:59 crc kubenswrapper[4778]: E0318 09:23:59.834701 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33\": container with ID starting with 3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33 not found: ID does not exist" containerID="3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33"
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.834733 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33"} err="failed to get container status \"3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33\": rpc error: code = NotFound desc = could not find container \"3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33\": container with ID starting with 3edcc9f4f9357bab5eee053a665bdedaa54dbb3bf17d99933f4ed64fa544df33 not found: ID does not exist"
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.880624 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-combined-ca-bundle\") pod \"2b12c496-12d8-47e5-8cb7-134c3860368d\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") "
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.880707 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-internal-tls-certs\") pod \"2b12c496-12d8-47e5-8cb7-134c3860368d\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") "
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.880756 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-config\") pod \"2b12c496-12d8-47e5-8cb7-134c3860368d\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") "
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.880891 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-public-tls-certs\") pod \"2b12c496-12d8-47e5-8cb7-134c3860368d\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") "
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.880962 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-httpd-config\") pod \"2b12c496-12d8-47e5-8cb7-134c3860368d\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") "
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.881006 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-ovndb-tls-certs\") pod \"2b12c496-12d8-47e5-8cb7-134c3860368d\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") "
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.881054 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb97b\" (UniqueName: \"kubernetes.io/projected/2b12c496-12d8-47e5-8cb7-134c3860368d-kube-api-access-qb97b\") pod \"2b12c496-12d8-47e5-8cb7-134c3860368d\" (UID: \"2b12c496-12d8-47e5-8cb7-134c3860368d\") "
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.892469 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b12c496-12d8-47e5-8cb7-134c3860368d-kube-api-access-qb97b" (OuterVolumeSpecName: "kube-api-access-qb97b") pod "2b12c496-12d8-47e5-8cb7-134c3860368d" (UID: "2b12c496-12d8-47e5-8cb7-134c3860368d"). InnerVolumeSpecName "kube-api-access-qb97b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.900757 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "2b12c496-12d8-47e5-8cb7-134c3860368d" (UID: "2b12c496-12d8-47e5-8cb7-134c3860368d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.936895 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b12c496-12d8-47e5-8cb7-134c3860368d" (UID: "2b12c496-12d8-47e5-8cb7-134c3860368d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.957040 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-config" (OuterVolumeSpecName: "config") pod "2b12c496-12d8-47e5-8cb7-134c3860368d" (UID: "2b12c496-12d8-47e5-8cb7-134c3860368d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.959772 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2b12c496-12d8-47e5-8cb7-134c3860368d" (UID: "2b12c496-12d8-47e5-8cb7-134c3860368d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.968419 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2b12c496-12d8-47e5-8cb7-134c3860368d" (UID: "2b12c496-12d8-47e5-8cb7-134c3860368d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.983808 4778 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.983840 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb97b\" (UniqueName: \"kubernetes.io/projected/2b12c496-12d8-47e5-8cb7-134c3860368d-kube-api-access-qb97b\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.983851 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.983860 4778 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.983869 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-config\") on node \"crc\" DevicePath \"\""
Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.983878 4778 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName:
\"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:23:59 crc kubenswrapper[4778]: I0318 09:23:59.992443 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "2b12c496-12d8-47e5-8cb7-134c3860368d" (UID: "2b12c496-12d8-47e5-8cb7-134c3860368d"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.085222 4778 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b12c496-12d8-47e5-8cb7-134c3860368d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.145513 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563764-s4crc"] Mar 18 09:24:00 crc kubenswrapper[4778]: E0318 09:24:00.145840 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" containerName="horizon" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.145857 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" containerName="horizon" Mar 18 09:24:00 crc kubenswrapper[4778]: E0318 09:24:00.145870 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b12c496-12d8-47e5-8cb7-134c3860368d" containerName="neutron-api" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.145876 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b12c496-12d8-47e5-8cb7-134c3860368d" containerName="neutron-api" Mar 18 09:24:00 crc kubenswrapper[4778]: E0318 09:24:00.145888 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae2c97b2-c699-443a-b3b3-ecb22de258c2" containerName="dnsmasq-dns" Mar 18 09:24:00 crc 
kubenswrapper[4778]: I0318 09:24:00.145894 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae2c97b2-c699-443a-b3b3-ecb22de258c2" containerName="dnsmasq-dns" Mar 18 09:24:00 crc kubenswrapper[4778]: E0318 09:24:00.145908 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b12c496-12d8-47e5-8cb7-134c3860368d" containerName="neutron-httpd" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.145915 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b12c496-12d8-47e5-8cb7-134c3860368d" containerName="neutron-httpd" Mar 18 09:24:00 crc kubenswrapper[4778]: E0318 09:24:00.145928 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" containerName="horizon-log" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.145933 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" containerName="horizon-log" Mar 18 09:24:00 crc kubenswrapper[4778]: E0318 09:24:00.145944 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48e51d3c-7f9f-4196-b708-c28c0e7477fa" containerName="barbican-api" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.145979 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="48e51d3c-7f9f-4196-b708-c28c0e7477fa" containerName="barbican-api" Mar 18 09:24:00 crc kubenswrapper[4778]: E0318 09:24:00.145992 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bdaed74-310d-4589-9a14-8c862f05d378" containerName="barbican-keystone-listener" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.145998 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bdaed74-310d-4589-9a14-8c862f05d378" containerName="barbican-keystone-listener" Mar 18 09:24:00 crc kubenswrapper[4778]: E0318 09:24:00.146011 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae2c97b2-c699-443a-b3b3-ecb22de258c2" containerName="init" Mar 18 09:24:00 crc kubenswrapper[4778]: 
I0318 09:24:00.146018 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae2c97b2-c699-443a-b3b3-ecb22de258c2" containerName="init" Mar 18 09:24:00 crc kubenswrapper[4778]: E0318 09:24:00.146028 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bdaed74-310d-4589-9a14-8c862f05d378" containerName="barbican-keystone-listener-log" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146034 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bdaed74-310d-4589-9a14-8c862f05d378" containerName="barbican-keystone-listener-log" Mar 18 09:24:00 crc kubenswrapper[4778]: E0318 09:24:00.146043 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48e51d3c-7f9f-4196-b708-c28c0e7477fa" containerName="barbican-api-log" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146049 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="48e51d3c-7f9f-4196-b708-c28c0e7477fa" containerName="barbican-api-log" Mar 18 09:24:00 crc kubenswrapper[4778]: E0318 09:24:00.146059 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2290443f-8279-4b62-9d3d-bab0be1d7af5" containerName="barbican-worker" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146066 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2290443f-8279-4b62-9d3d-bab0be1d7af5" containerName="barbican-worker" Mar 18 09:24:00 crc kubenswrapper[4778]: E0318 09:24:00.146074 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2290443f-8279-4b62-9d3d-bab0be1d7af5" containerName="barbican-worker-log" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146079 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2290443f-8279-4b62-9d3d-bab0be1d7af5" containerName="barbican-worker-log" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146278 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" containerName="horizon" Mar 18 09:24:00 crc 
kubenswrapper[4778]: I0318 09:24:00.146293 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae2c97b2-c699-443a-b3b3-ecb22de258c2" containerName="dnsmasq-dns" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146303 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bdaed74-310d-4589-9a14-8c862f05d378" containerName="barbican-keystone-listener-log" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146313 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="48e51d3c-7f9f-4196-b708-c28c0e7477fa" containerName="barbican-api" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146321 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b12c496-12d8-47e5-8cb7-134c3860368d" containerName="neutron-httpd" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146330 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2290443f-8279-4b62-9d3d-bab0be1d7af5" containerName="barbican-worker-log" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146342 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b12c496-12d8-47e5-8cb7-134c3860368d" containerName="neutron-api" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146353 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="48e51d3c-7f9f-4196-b708-c28c0e7477fa" containerName="barbican-api-log" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146364 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bdaed74-310d-4589-9a14-8c862f05d378" containerName="barbican-keystone-listener" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146374 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="56a6e416-3b49-4f07-a2ac-7fd1a4f58fe2" containerName="horizon-log" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146383 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2290443f-8279-4b62-9d3d-bab0be1d7af5" 
containerName="barbican-worker" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.146916 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563764-s4crc" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.149438 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.151571 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.151924 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.170471 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563764-s4crc"] Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.187111 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlmf5\" (UniqueName: \"kubernetes.io/projected/bb3b2d75-fc85-48dc-8533-18ecd8c75187-kube-api-access-xlmf5\") pod \"auto-csr-approver-29563764-s4crc\" (UID: \"bb3b2d75-fc85-48dc-8533-18ecd8c75187\") " pod="openshift-infra/auto-csr-approver-29563764-s4crc" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.200410 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48e51d3c-7f9f-4196-b708-c28c0e7477fa" path="/var/lib/kubelet/pods/48e51d3c-7f9f-4196-b708-c28c0e7477fa/volumes" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.289347 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlmf5\" (UniqueName: \"kubernetes.io/projected/bb3b2d75-fc85-48dc-8533-18ecd8c75187-kube-api-access-xlmf5\") pod \"auto-csr-approver-29563764-s4crc\" (UID: \"bb3b2d75-fc85-48dc-8533-18ecd8c75187\") " 
pod="openshift-infra/auto-csr-approver-29563764-s4crc" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.307841 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlmf5\" (UniqueName: \"kubernetes.io/projected/bb3b2d75-fc85-48dc-8533-18ecd8c75187-kube-api-access-xlmf5\") pod \"auto-csr-approver-29563764-s4crc\" (UID: \"bb3b2d75-fc85-48dc-8533-18ecd8c75187\") " pod="openshift-infra/auto-csr-approver-29563764-s4crc" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.465531 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563764-s4crc" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.718907 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56b9647d87-2qhmh" event={"ID":"2b12c496-12d8-47e5-8cb7-134c3860368d","Type":"ContainerDied","Data":"e8df8ad489699b62f3719756fb074fcb6bc53de4a98faae3059fd18d69bd24b5"} Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.719393 4778 scope.go:117] "RemoveContainer" containerID="97b1e09d6053130fb7ed8cd3a7a1b57d88faf0c15aceaa656cf6b72bf7bed517" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.720273 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56b9647d87-2qhmh" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.784503 4778 scope.go:117] "RemoveContainer" containerID="1b28dfe7d2202acd51cd4556f1c17c3749cac234f100a94b65f9772c04f97b62" Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.785893 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56b9647d87-2qhmh"] Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.794089 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-56b9647d87-2qhmh"] Mar 18 09:24:00 crc kubenswrapper[4778]: I0318 09:24:00.928380 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563764-s4crc"] Mar 18 09:24:00 crc kubenswrapper[4778]: W0318 09:24:00.936383 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb3b2d75_fc85_48dc_8533_18ecd8c75187.slice/crio-fbfc405090c6b9b3a4cbae242297821fa7e208a6f4adf0dc492b8aad2c4c1a95 WatchSource:0}: Error finding container fbfc405090c6b9b3a4cbae242297821fa7e208a6f4adf0dc492b8aad2c4c1a95: Status 404 returned error can't find the container with id fbfc405090c6b9b3a4cbae242297821fa7e208a6f4adf0dc492b8aad2c4c1a95 Mar 18 09:24:01 crc kubenswrapper[4778]: I0318 09:24:01.712677 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 09:24:01 crc kubenswrapper[4778]: I0318 09:24:01.731210 4778 generic.go:334] "Generic (PLEG): container finished" podID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerID="d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053" exitCode=0 Mar 18 09:24:01 crc kubenswrapper[4778]: I0318 09:24:01.731300 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d609db91-e011-4ac4-91a6-a9ba51f3918e","Type":"ContainerDied","Data":"d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053"} Mar 18 09:24:01 crc kubenswrapper[4778]: I0318 09:24:01.732491 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563764-s4crc" event={"ID":"bb3b2d75-fc85-48dc-8533-18ecd8c75187","Type":"ContainerStarted","Data":"fbfc405090c6b9b3a4cbae242297821fa7e208a6f4adf0dc492b8aad2c4c1a95"} Mar 18 09:24:02 crc kubenswrapper[4778]: I0318 09:24:02.198972 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b12c496-12d8-47e5-8cb7-134c3860368d" path="/var/lib/kubelet/pods/2b12c496-12d8-47e5-8cb7-134c3860368d/volumes" Mar 18 09:24:02 crc kubenswrapper[4778]: I0318 09:24:02.744305 4778 generic.go:334] "Generic (PLEG): container finished" podID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerID="c97b0574c01538fa1c2084f0fe2463a0e870b56db44f51730acba68dbd0ebf25" exitCode=0 Mar 18 09:24:02 crc kubenswrapper[4778]: I0318 09:24:02.744656 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-99c8bfc86-rldfg" event={"ID":"ea1f8a48-d595-4f4e-a740-0af5a26397a5","Type":"ContainerDied","Data":"c97b0574c01538fa1c2084f0fe2463a0e870b56db44f51730acba68dbd0ebf25"} Mar 18 09:24:02 crc kubenswrapper[4778]: I0318 09:24:02.746729 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563764-s4crc" event={"ID":"bb3b2d75-fc85-48dc-8533-18ecd8c75187","Type":"ContainerStarted","Data":"2ec42f2618fc279e3b11e295de9307609aec937968481eb47ae40bf89eeec176"} Mar 18 09:24:02 crc kubenswrapper[4778]: I0318 09:24:02.780544 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563764-s4crc" podStartSLOduration=1.514659989 podStartE2EDuration="2.78051597s" podCreationTimestamp="2026-03-18 09:24:00 +0000 UTC" firstStartedPulling="2026-03-18 09:24:00.940768897 +0000 UTC 
m=+1307.515513737" lastFinishedPulling="2026-03-18 09:24:02.206624878 +0000 UTC m=+1308.781369718" observedRunningTime="2026-03-18 09:24:02.764146574 +0000 UTC m=+1309.338891454" watchObservedRunningTime="2026-03-18 09:24:02.78051597 +0000 UTC m=+1309.355260840" Mar 18 09:24:02 crc kubenswrapper[4778]: I0318 09:24:02.783345 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-99c8bfc86-rldfg" podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Mar 18 09:24:03 crc kubenswrapper[4778]: I0318 09:24:03.756730 4778 generic.go:334] "Generic (PLEG): container finished" podID="bb3b2d75-fc85-48dc-8533-18ecd8c75187" containerID="2ec42f2618fc279e3b11e295de9307609aec937968481eb47ae40bf89eeec176" exitCode=0 Mar 18 09:24:03 crc kubenswrapper[4778]: I0318 09:24:03.756837 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563764-s4crc" event={"ID":"bb3b2d75-fc85-48dc-8533-18ecd8c75187","Type":"ContainerDied","Data":"2ec42f2618fc279e3b11e295de9307609aec937968481eb47ae40bf89eeec176"} Mar 18 09:24:05 crc kubenswrapper[4778]: I0318 09:24:05.208292 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563764-s4crc" Mar 18 09:24:05 crc kubenswrapper[4778]: I0318 09:24:05.301160 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlmf5\" (UniqueName: \"kubernetes.io/projected/bb3b2d75-fc85-48dc-8533-18ecd8c75187-kube-api-access-xlmf5\") pod \"bb3b2d75-fc85-48dc-8533-18ecd8c75187\" (UID: \"bb3b2d75-fc85-48dc-8533-18ecd8c75187\") " Mar 18 09:24:05 crc kubenswrapper[4778]: I0318 09:24:05.306628 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb3b2d75-fc85-48dc-8533-18ecd8c75187-kube-api-access-xlmf5" (OuterVolumeSpecName: "kube-api-access-xlmf5") pod "bb3b2d75-fc85-48dc-8533-18ecd8c75187" (UID: "bb3b2d75-fc85-48dc-8533-18ecd8c75187"). InnerVolumeSpecName "kube-api-access-xlmf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:05 crc kubenswrapper[4778]: I0318 09:24:05.403649 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlmf5\" (UniqueName: \"kubernetes.io/projected/bb3b2d75-fc85-48dc-8533-18ecd8c75187-kube-api-access-xlmf5\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:05 crc kubenswrapper[4778]: I0318 09:24:05.780815 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563764-s4crc" event={"ID":"bb3b2d75-fc85-48dc-8533-18ecd8c75187","Type":"ContainerDied","Data":"fbfc405090c6b9b3a4cbae242297821fa7e208a6f4adf0dc492b8aad2c4c1a95"} Mar 18 09:24:05 crc kubenswrapper[4778]: I0318 09:24:05.780851 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbfc405090c6b9b3a4cbae242297821fa7e208a6f4adf0dc492b8aad2c4c1a95" Mar 18 09:24:05 crc kubenswrapper[4778]: I0318 09:24:05.781185 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563764-s4crc" Mar 18 09:24:05 crc kubenswrapper[4778]: I0318 09:24:05.866479 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563758-zslfz"] Mar 18 09:24:05 crc kubenswrapper[4778]: I0318 09:24:05.883822 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563758-zslfz"] Mar 18 09:24:06 crc kubenswrapper[4778]: I0318 09:24:06.205839 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c70faab0-9f07-4452-a873-bcb59d28b7a8" path="/var/lib/kubelet/pods/c70faab0-9f07-4452-a873-bcb59d28b7a8/volumes" Mar 18 09:24:06 crc kubenswrapper[4778]: I0318 09:24:06.836375 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:24:06 crc kubenswrapper[4778]: I0318 09:24:06.903005 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-2zrdr"] Mar 18 09:24:06 crc kubenswrapper[4778]: I0318 09:24:06.903295 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-869f779d85-2zrdr" podUID="1cc8be70-e875-4307-89a5-9cbb0d105b86" containerName="dnsmasq-dns" containerID="cri-o://9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed" gracePeriod=10 Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.045273 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.096428 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.487262 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.546833 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-ovsdbserver-nb\") pod \"1cc8be70-e875-4307-89a5-9cbb0d105b86\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.547043 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-ovsdbserver-sb\") pod \"1cc8be70-e875-4307-89a5-9cbb0d105b86\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.547190 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-config\") pod \"1cc8be70-e875-4307-89a5-9cbb0d105b86\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.547276 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrb4m\" (UniqueName: \"kubernetes.io/projected/1cc8be70-e875-4307-89a5-9cbb0d105b86-kube-api-access-nrb4m\") pod \"1cc8be70-e875-4307-89a5-9cbb0d105b86\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.548277 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-dns-svc\") pod \"1cc8be70-e875-4307-89a5-9cbb0d105b86\" (UID: \"1cc8be70-e875-4307-89a5-9cbb0d105b86\") " Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.558881 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/1cc8be70-e875-4307-89a5-9cbb0d105b86-kube-api-access-nrb4m" (OuterVolumeSpecName: "kube-api-access-nrb4m") pod "1cc8be70-e875-4307-89a5-9cbb0d105b86" (UID: "1cc8be70-e875-4307-89a5-9cbb0d105b86"). InnerVolumeSpecName "kube-api-access-nrb4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.650958 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrb4m\" (UniqueName: \"kubernetes.io/projected/1cc8be70-e875-4307-89a5-9cbb0d105b86-kube-api-access-nrb4m\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.735829 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1cc8be70-e875-4307-89a5-9cbb0d105b86" (UID: "1cc8be70-e875-4307-89a5-9cbb0d105b86"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.735851 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1cc8be70-e875-4307-89a5-9cbb0d105b86" (UID: "1cc8be70-e875-4307-89a5-9cbb0d105b86"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.736275 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-config" (OuterVolumeSpecName: "config") pod "1cc8be70-e875-4307-89a5-9cbb0d105b86" (UID: "1cc8be70-e875-4307-89a5-9cbb0d105b86"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.736934 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1cc8be70-e875-4307-89a5-9cbb0d105b86" (UID: "1cc8be70-e875-4307-89a5-9cbb0d105b86"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.752901 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.753158 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.753291 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.753387 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cc8be70-e875-4307-89a5-9cbb0d105b86-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.802089 4778 generic.go:334] "Generic (PLEG): container finished" podID="1cc8be70-e875-4307-89a5-9cbb0d105b86" containerID="9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed" exitCode=0 Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.802450 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="25c30dcf-f49d-430b-a240-aefe036afeeb" 
containerName="cinder-scheduler" containerID="cri-o://33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785" gracePeriod=30 Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.802879 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-2zrdr" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.805295 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-2zrdr" event={"ID":"1cc8be70-e875-4307-89a5-9cbb0d105b86","Type":"ContainerDied","Data":"9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed"} Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.805352 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-2zrdr" event={"ID":"1cc8be70-e875-4307-89a5-9cbb0d105b86","Type":"ContainerDied","Data":"cc995beb30e43a9c4d675261cc6a1346b30a1e32f69b240f14fdcb175b6ff16e"} Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.805381 4778 scope.go:117] "RemoveContainer" containerID="9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.805933 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="25c30dcf-f49d-430b-a240-aefe036afeeb" containerName="probe" containerID="cri-o://7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78" gracePeriod=30 Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.828985 4778 scope.go:117] "RemoveContainer" containerID="5cebfdfe9a748d2abaa31efb6ca9c6b6deab1fa9c65a1678fd03dbf02573b1a8" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.837895 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-2zrdr"] Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.847189 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-2zrdr"] Mar 18 09:24:07 crc 
kubenswrapper[4778]: I0318 09:24:07.863566 4778 scope.go:117] "RemoveContainer" containerID="9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed" Mar 18 09:24:07 crc kubenswrapper[4778]: E0318 09:24:07.864029 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed\": container with ID starting with 9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed not found: ID does not exist" containerID="9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.864067 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed"} err="failed to get container status \"9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed\": rpc error: code = NotFound desc = could not find container \"9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed\": container with ID starting with 9732a15f6cfa53664a150348a7bbb90cea76e3636e83f66175a5fa374e8313ed not found: ID does not exist" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.864094 4778 scope.go:117] "RemoveContainer" containerID="5cebfdfe9a748d2abaa31efb6ca9c6b6deab1fa9c65a1678fd03dbf02573b1a8" Mar 18 09:24:07 crc kubenswrapper[4778]: E0318 09:24:07.864470 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cebfdfe9a748d2abaa31efb6ca9c6b6deab1fa9c65a1678fd03dbf02573b1a8\": container with ID starting with 5cebfdfe9a748d2abaa31efb6ca9c6b6deab1fa9c65a1678fd03dbf02573b1a8 not found: ID does not exist" containerID="5cebfdfe9a748d2abaa31efb6ca9c6b6deab1fa9c65a1678fd03dbf02573b1a8" Mar 18 09:24:07 crc kubenswrapper[4778]: I0318 09:24:07.864489 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5cebfdfe9a748d2abaa31efb6ca9c6b6deab1fa9c65a1678fd03dbf02573b1a8"} err="failed to get container status \"5cebfdfe9a748d2abaa31efb6ca9c6b6deab1fa9c65a1678fd03dbf02573b1a8\": rpc error: code = NotFound desc = could not find container \"5cebfdfe9a748d2abaa31efb6ca9c6b6deab1fa9c65a1678fd03dbf02573b1a8\": container with ID starting with 5cebfdfe9a748d2abaa31efb6ca9c6b6deab1fa9c65a1678fd03dbf02573b1a8 not found: ID does not exist" Mar 18 09:24:08 crc kubenswrapper[4778]: I0318 09:24:08.198629 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cc8be70-e875-4307-89a5-9cbb0d105b86" path="/var/lib/kubelet/pods/1cc8be70-e875-4307-89a5-9cbb0d105b86/volumes" Mar 18 09:24:08 crc kubenswrapper[4778]: I0318 09:24:08.815995 4778 generic.go:334] "Generic (PLEG): container finished" podID="25c30dcf-f49d-430b-a240-aefe036afeeb" containerID="7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78" exitCode=0 Mar 18 09:24:08 crc kubenswrapper[4778]: I0318 09:24:08.816064 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"25c30dcf-f49d-430b-a240-aefe036afeeb","Type":"ContainerDied","Data":"7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78"} Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.091032 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.407784 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.462612 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.699424 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7588d8786-t6x7l"] Mar 18 09:24:09 crc kubenswrapper[4778]: 
E0318 09:24:09.699782 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb3b2d75-fc85-48dc-8533-18ecd8c75187" containerName="oc" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.699798 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb3b2d75-fc85-48dc-8533-18ecd8c75187" containerName="oc" Mar 18 09:24:09 crc kubenswrapper[4778]: E0318 09:24:09.699812 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cc8be70-e875-4307-89a5-9cbb0d105b86" containerName="init" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.699820 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cc8be70-e875-4307-89a5-9cbb0d105b86" containerName="init" Mar 18 09:24:09 crc kubenswrapper[4778]: E0318 09:24:09.699835 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cc8be70-e875-4307-89a5-9cbb0d105b86" containerName="dnsmasq-dns" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.699843 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cc8be70-e875-4307-89a5-9cbb0d105b86" containerName="dnsmasq-dns" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.699992 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cc8be70-e875-4307-89a5-9cbb0d105b86" containerName="dnsmasq-dns" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.700007 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb3b2d75-fc85-48dc-8533-18ecd8c75187" containerName="oc" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.700833 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.774091 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7588d8786-t6x7l"] Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.794234 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqf7m\" (UniqueName: \"kubernetes.io/projected/fe0de426-6927-42ea-8b29-8bc01c27fe69-kube-api-access-qqf7m\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.794290 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-internal-tls-certs\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.794413 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe0de426-6927-42ea-8b29-8bc01c27fe69-logs\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.794447 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-combined-ca-bundle\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.794483 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-config-data\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.794505 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-public-tls-certs\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.794543 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-scripts\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.896602 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-config-data\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.896669 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-public-tls-certs\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.896745 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-scripts\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.896809 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqf7m\" (UniqueName: \"kubernetes.io/projected/fe0de426-6927-42ea-8b29-8bc01c27fe69-kube-api-access-qqf7m\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.896845 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-internal-tls-certs\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.898340 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe0de426-6927-42ea-8b29-8bc01c27fe69-logs\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.898451 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-combined-ca-bundle\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.898732 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe0de426-6927-42ea-8b29-8bc01c27fe69-logs\") pod \"placement-7588d8786-t6x7l\" 
(UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.903482 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-internal-tls-certs\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.903951 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-public-tls-certs\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.904836 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-combined-ca-bundle\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.910070 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-config-data\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 09:24:09.910069 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe0de426-6927-42ea-8b29-8bc01c27fe69-scripts\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:09 crc kubenswrapper[4778]: I0318 
09:24:09.922085 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqf7m\" (UniqueName: \"kubernetes.io/projected/fe0de426-6927-42ea-8b29-8bc01c27fe69-kube-api-access-qqf7m\") pod \"placement-7588d8786-t6x7l\" (UID: \"fe0de426-6927-42ea-8b29-8bc01c27fe69\") " pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.036851 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.259667 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-75996d8fd4-jhtd2" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.623028 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7588d8786-t6x7l"] Mar 18 09:24:10 crc kubenswrapper[4778]: W0318 09:24:10.631471 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe0de426_6927_42ea_8b29_8bc01c27fe69.slice/crio-e3cc921221ed90cdcc3c47fb2bda02a8ea0bdcdd4808d3e2a3081aecc5ad2388 WatchSource:0}: Error finding container e3cc921221ed90cdcc3c47fb2bda02a8ea0bdcdd4808d3e2a3081aecc5ad2388: Status 404 returned error can't find the container with id e3cc921221ed90cdcc3c47fb2bda02a8ea0bdcdd4808d3e2a3081aecc5ad2388 Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.635275 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.719837 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-config-data\") pod \"25c30dcf-f49d-430b-a240-aefe036afeeb\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.719896 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mnd9\" (UniqueName: \"kubernetes.io/projected/25c30dcf-f49d-430b-a240-aefe036afeeb-kube-api-access-5mnd9\") pod \"25c30dcf-f49d-430b-a240-aefe036afeeb\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.719954 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-scripts\") pod \"25c30dcf-f49d-430b-a240-aefe036afeeb\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.719979 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-combined-ca-bundle\") pod \"25c30dcf-f49d-430b-a240-aefe036afeeb\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.720122 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-config-data-custom\") pod \"25c30dcf-f49d-430b-a240-aefe036afeeb\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.720155 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/25c30dcf-f49d-430b-a240-aefe036afeeb-etc-machine-id\") pod \"25c30dcf-f49d-430b-a240-aefe036afeeb\" (UID: \"25c30dcf-f49d-430b-a240-aefe036afeeb\") " Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.720662 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25c30dcf-f49d-430b-a240-aefe036afeeb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "25c30dcf-f49d-430b-a240-aefe036afeeb" (UID: "25c30dcf-f49d-430b-a240-aefe036afeeb"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.725139 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-scripts" (OuterVolumeSpecName: "scripts") pod "25c30dcf-f49d-430b-a240-aefe036afeeb" (UID: "25c30dcf-f49d-430b-a240-aefe036afeeb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.725246 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "25c30dcf-f49d-430b-a240-aefe036afeeb" (UID: "25c30dcf-f49d-430b-a240-aefe036afeeb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.725714 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25c30dcf-f49d-430b-a240-aefe036afeeb-kube-api-access-5mnd9" (OuterVolumeSpecName: "kube-api-access-5mnd9") pod "25c30dcf-f49d-430b-a240-aefe036afeeb" (UID: "25c30dcf-f49d-430b-a240-aefe036afeeb"). InnerVolumeSpecName "kube-api-access-5mnd9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.771067 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25c30dcf-f49d-430b-a240-aefe036afeeb" (UID: "25c30dcf-f49d-430b-a240-aefe036afeeb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.822176 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mnd9\" (UniqueName: \"kubernetes.io/projected/25c30dcf-f49d-430b-a240-aefe036afeeb-kube-api-access-5mnd9\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.822223 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.822234 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.822248 4778 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.822258 4778 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25c30dcf-f49d-430b-a240-aefe036afeeb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.824100 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-config-data" (OuterVolumeSpecName: "config-data") pod "25c30dcf-f49d-430b-a240-aefe036afeeb" (UID: "25c30dcf-f49d-430b-a240-aefe036afeeb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.834160 4778 generic.go:334] "Generic (PLEG): container finished" podID="25c30dcf-f49d-430b-a240-aefe036afeeb" containerID="33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785" exitCode=0 Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.834252 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"25c30dcf-f49d-430b-a240-aefe036afeeb","Type":"ContainerDied","Data":"33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785"} Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.834277 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.834286 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"25c30dcf-f49d-430b-a240-aefe036afeeb","Type":"ContainerDied","Data":"3dd48f50ba7628f9f7d698453fd23e9975283871dd516f7a6292391921e1577b"} Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.834307 4778 scope.go:117] "RemoveContainer" containerID="7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.839358 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7588d8786-t6x7l" event={"ID":"fe0de426-6927-42ea-8b29-8bc01c27fe69","Type":"ContainerStarted","Data":"e3cc921221ed90cdcc3c47fb2bda02a8ea0bdcdd4808d3e2a3081aecc5ad2388"} Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.865541 4778 scope.go:117] "RemoveContainer" containerID="33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785" 
Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.870109 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.880722 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.896547 4778 scope.go:117] "RemoveContainer" containerID="7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78" Mar 18 09:24:10 crc kubenswrapper[4778]: E0318 09:24:10.897179 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78\": container with ID starting with 7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78 not found: ID does not exist" containerID="7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.897237 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78"} err="failed to get container status \"7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78\": rpc error: code = NotFound desc = could not find container \"7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78\": container with ID starting with 7894ead3a987d3ec160c1170d9dcd2de9b3ee0bbfc989cdbac0f12d90d8bcb78 not found: ID does not exist" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.897264 4778 scope.go:117] "RemoveContainer" containerID="33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785" Mar 18 09:24:10 crc kubenswrapper[4778]: E0318 09:24:10.897555 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785\": container with 
ID starting with 33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785 not found: ID does not exist" containerID="33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.897586 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785"} err="failed to get container status \"33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785\": rpc error: code = NotFound desc = could not find container \"33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785\": container with ID starting with 33166af78a795095ea023f48794e4d162e1003fc0e1bbb5cef5817e3cc899785 not found: ID does not exist" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.903452 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 09:24:10 crc kubenswrapper[4778]: E0318 09:24:10.904344 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25c30dcf-f49d-430b-a240-aefe036afeeb" containerName="probe" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.904374 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="25c30dcf-f49d-430b-a240-aefe036afeeb" containerName="probe" Mar 18 09:24:10 crc kubenswrapper[4778]: E0318 09:24:10.904404 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25c30dcf-f49d-430b-a240-aefe036afeeb" containerName="cinder-scheduler" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.904414 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="25c30dcf-f49d-430b-a240-aefe036afeeb" containerName="cinder-scheduler" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.904666 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="25c30dcf-f49d-430b-a240-aefe036afeeb" containerName="probe" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.904730 4778 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="25c30dcf-f49d-430b-a240-aefe036afeeb" containerName="cinder-scheduler" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.906410 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.909436 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.924941 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.924966 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bbde13ad-dacc-4f17-8da3-109ede6972c0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.925437 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbde13ad-dacc-4f17-8da3-109ede6972c0-scripts\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.925640 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbde13ad-dacc-4f17-8da3-109ede6972c0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.925736 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/bbde13ad-dacc-4f17-8da3-109ede6972c0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.925842 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6vkn\" (UniqueName: \"kubernetes.io/projected/bbde13ad-dacc-4f17-8da3-109ede6972c0-kube-api-access-p6vkn\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.925919 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbde13ad-dacc-4f17-8da3-109ede6972c0-config-data\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:10 crc kubenswrapper[4778]: I0318 09:24:10.926039 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25c30dcf-f49d-430b-a240-aefe036afeeb-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.027593 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbde13ad-dacc-4f17-8da3-109ede6972c0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.028137 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bbde13ad-dacc-4f17-8da3-109ede6972c0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0" Mar 18 09:24:11 crc 
kubenswrapper[4778]: I0318 09:24:11.028233 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6vkn\" (UniqueName: \"kubernetes.io/projected/bbde13ad-dacc-4f17-8da3-109ede6972c0-kube-api-access-p6vkn\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0"
Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.028267 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbde13ad-dacc-4f17-8da3-109ede6972c0-config-data\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0"
Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.028345 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bbde13ad-dacc-4f17-8da3-109ede6972c0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0"
Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.028387 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bbde13ad-dacc-4f17-8da3-109ede6972c0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0"
Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.028399 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbde13ad-dacc-4f17-8da3-109ede6972c0-scripts\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0"
Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.031326 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbde13ad-dacc-4f17-8da3-109ede6972c0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0"
Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.032516 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbde13ad-dacc-4f17-8da3-109ede6972c0-scripts\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0"
Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.033861 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbde13ad-dacc-4f17-8da3-109ede6972c0-config-data\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0"
Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.034066 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bbde13ad-dacc-4f17-8da3-109ede6972c0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0"
Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.048750 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6vkn\" (UniqueName: \"kubernetes.io/projected/bbde13ad-dacc-4f17-8da3-109ede6972c0-kube-api-access-p6vkn\") pod \"cinder-scheduler-0\" (UID: \"bbde13ad-dacc-4f17-8da3-109ede6972c0\") " pod="openstack/cinder-scheduler-0"
Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.154686 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.455259 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.853957 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bbde13ad-dacc-4f17-8da3-109ede6972c0","Type":"ContainerStarted","Data":"b68a2f1177e8618f5371c42c181f2a2a72c3ec262ce9c4255d1091a4422376d1"}
Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.859820 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7588d8786-t6x7l" event={"ID":"fe0de426-6927-42ea-8b29-8bc01c27fe69","Type":"ContainerStarted","Data":"70f41166643e5c57059949763b954cffb1995b94972c432b8661eaffd305357c"}
Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.859850 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7588d8786-t6x7l" event={"ID":"fe0de426-6927-42ea-8b29-8bc01c27fe69","Type":"ContainerStarted","Data":"e42aa40ce87ef108a835e400ee7d0af65765c83a28c90ec93577dc2879e7039c"}
Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.860424 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7588d8786-t6x7l"
Mar 18 09:24:11 crc kubenswrapper[4778]: I0318 09:24:11.890570 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7588d8786-t6x7l" podStartSLOduration=2.890442807 podStartE2EDuration="2.890442807s" podCreationTimestamp="2026-03-18 09:24:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:24:11.883673602 +0000 UTC m=+1318.458418452" watchObservedRunningTime="2026-03-18 09:24:11.890442807 +0000 UTC m=+1318.465187657"
Mar 18 09:24:12 crc kubenswrapper[4778]: I0318 09:24:12.219684 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25c30dcf-f49d-430b-a240-aefe036afeeb" path="/var/lib/kubelet/pods/25c30dcf-f49d-430b-a240-aefe036afeeb/volumes"
Mar 18 09:24:12 crc kubenswrapper[4778]: I0318 09:24:12.783288 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-99c8bfc86-rldfg" podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused"
Mar 18 09:24:12 crc kubenswrapper[4778]: I0318 09:24:12.885970 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bbde13ad-dacc-4f17-8da3-109ede6972c0","Type":"ContainerStarted","Data":"b1277af22eaff22c034718b40d19fb0722f73ea78fc1e330fcacc9be3f0bbfc1"}
Mar 18 09:24:12 crc kubenswrapper[4778]: I0318 09:24:12.886037 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bbde13ad-dacc-4f17-8da3-109ede6972c0","Type":"ContainerStarted","Data":"5bc9419baaf6a9eba429b9f4b0e19c573040ff85ad0dec56617879e900848eec"}
Mar 18 09:24:12 crc kubenswrapper[4778]: I0318 09:24:12.886120 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7588d8786-t6x7l"
Mar 18 09:24:12 crc kubenswrapper[4778]: I0318 09:24:12.902769 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.902749161 podStartE2EDuration="2.902749161s" podCreationTimestamp="2026-03-18 09:24:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:24:12.90091239 +0000 UTC m=+1319.475657230" watchObservedRunningTime="2026-03-18 09:24:12.902749161 +0000 UTC m=+1319.477494001"
Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.280131 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.282410 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.287913 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.288114 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-vqzw8"
Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.288284 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.299186 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.318251 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl5jq\" (UniqueName: \"kubernetes.io/projected/fec302c3-e5fc-4019-b4f5-50de6bdde59f-kube-api-access-hl5jq\") pod \"openstackclient\" (UID: \"fec302c3-e5fc-4019-b4f5-50de6bdde59f\") " pod="openstack/openstackclient"
Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.318299 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fec302c3-e5fc-4019-b4f5-50de6bdde59f-openstack-config\") pod \"openstackclient\" (UID: \"fec302c3-e5fc-4019-b4f5-50de6bdde59f\") " pod="openstack/openstackclient"
Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.318335 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec302c3-e5fc-4019-b4f5-50de6bdde59f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fec302c3-e5fc-4019-b4f5-50de6bdde59f\") " pod="openstack/openstackclient"
Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.318413 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fec302c3-e5fc-4019-b4f5-50de6bdde59f-openstack-config-secret\") pod \"openstackclient\" (UID: \"fec302c3-e5fc-4019-b4f5-50de6bdde59f\") " pod="openstack/openstackclient"
Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.420487 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl5jq\" (UniqueName: \"kubernetes.io/projected/fec302c3-e5fc-4019-b4f5-50de6bdde59f-kube-api-access-hl5jq\") pod \"openstackclient\" (UID: \"fec302c3-e5fc-4019-b4f5-50de6bdde59f\") " pod="openstack/openstackclient"
Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.420547 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fec302c3-e5fc-4019-b4f5-50de6bdde59f-openstack-config\") pod \"openstackclient\" (UID: \"fec302c3-e5fc-4019-b4f5-50de6bdde59f\") " pod="openstack/openstackclient"
Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.420596 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec302c3-e5fc-4019-b4f5-50de6bdde59f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fec302c3-e5fc-4019-b4f5-50de6bdde59f\") " pod="openstack/openstackclient"
Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.420697 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fec302c3-e5fc-4019-b4f5-50de6bdde59f-openstack-config-secret\") pod \"openstackclient\" (UID: \"fec302c3-e5fc-4019-b4f5-50de6bdde59f\") " pod="openstack/openstackclient"
Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.421845 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fec302c3-e5fc-4019-b4f5-50de6bdde59f-openstack-config\") pod \"openstackclient\" (UID: \"fec302c3-e5fc-4019-b4f5-50de6bdde59f\") " pod="openstack/openstackclient"
Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.429019 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec302c3-e5fc-4019-b4f5-50de6bdde59f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fec302c3-e5fc-4019-b4f5-50de6bdde59f\") " pod="openstack/openstackclient"
Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.429893 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fec302c3-e5fc-4019-b4f5-50de6bdde59f-openstack-config-secret\") pod \"openstackclient\" (UID: \"fec302c3-e5fc-4019-b4f5-50de6bdde59f\") " pod="openstack/openstackclient"
Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.439765 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl5jq\" (UniqueName: \"kubernetes.io/projected/fec302c3-e5fc-4019-b4f5-50de6bdde59f-kube-api-access-hl5jq\") pod \"openstackclient\" (UID: \"fec302c3-e5fc-4019-b4f5-50de6bdde59f\") " pod="openstack/openstackclient"
Mar 18 09:24:14 crc kubenswrapper[4778]: I0318 09:24:14.632832 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 18 09:24:15 crc kubenswrapper[4778]: I0318 09:24:15.131693 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 18 09:24:15 crc kubenswrapper[4778]: I0318 09:24:15.933675 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"fec302c3-e5fc-4019-b4f5-50de6bdde59f","Type":"ContainerStarted","Data":"374277d6bb2e47d21523d2e4fedd07b943036d6c14091bdefdbcc052923e0497"}
Mar 18 09:24:16 crc kubenswrapper[4778]: I0318 09:24:16.155289 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 18 09:24:21 crc kubenswrapper[4778]: I0318 09:24:21.381528 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 18 09:24:22 crc kubenswrapper[4778]: I0318 09:24:22.783056 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-99c8bfc86-rldfg" podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused"
Mar 18 09:24:22 crc kubenswrapper[4778]: I0318 09:24:22.783151 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-99c8bfc86-rldfg"
Mar 18 09:24:24 crc kubenswrapper[4778]: I0318 09:24:24.298770 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 09:24:25 crc kubenswrapper[4778]: I0318 09:24:25.677886 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6d979499f7-4flxt"
Mar 18 09:24:25 crc kubenswrapper[4778]: I0318 09:24:25.762475 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-cffc84f44-vtx7x"]
Mar 18 09:24:25 crc kubenswrapper[4778]: I0318 09:24:25.762700 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-cffc84f44-vtx7x" podUID="3d1d399f-3c89-4aaa-bba1-ce1a91358455" containerName="neutron-api" containerID="cri-o://5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827" gracePeriod=30
Mar 18 09:24:25 crc kubenswrapper[4778]: I0318 09:24:25.763108 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-cffc84f44-vtx7x" podUID="3d1d399f-3c89-4aaa-bba1-ce1a91358455" containerName="neutron-httpd" containerID="cri-o://6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3" gracePeriod=30
Mar 18 09:24:26 crc kubenswrapper[4778]: I0318 09:24:26.035870 4778 generic.go:334] "Generic (PLEG): container finished" podID="3d1d399f-3c89-4aaa-bba1-ce1a91358455" containerID="6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3" exitCode=0
Mar 18 09:24:26 crc kubenswrapper[4778]: I0318 09:24:26.035924 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cffc84f44-vtx7x" event={"ID":"3d1d399f-3c89-4aaa-bba1-ce1a91358455","Type":"ContainerDied","Data":"6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3"}
Mar 18 09:24:26 crc kubenswrapper[4778]: I0318 09:24:26.037481 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"fec302c3-e5fc-4019-b4f5-50de6bdde59f","Type":"ContainerStarted","Data":"f1cac91237d6d6c46aeb8af41803efc6b4b3f1b77c65947a4636db5680599973"}
Mar 18 09:24:26 crc kubenswrapper[4778]: I0318 09:24:26.060695 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.9306582479999999 podStartE2EDuration="12.060679382s" podCreationTimestamp="2026-03-18 09:24:14 +0000 UTC" firstStartedPulling="2026-03-18 09:24:15.141494592 +0000 UTC m=+1321.716239452" lastFinishedPulling="2026-03-18 09:24:25.271515746 +0000 UTC m=+1331.846260586" observedRunningTime="2026-03-18 09:24:26.053251609 +0000 UTC m=+1332.627996459" watchObservedRunningTime="2026-03-18 09:24:26.060679382 +0000 UTC m=+1332.635424222"
Mar 18 09:24:26 crc kubenswrapper[4778]: I0318 09:24:26.966985 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.053746 4778 generic.go:334] "Generic (PLEG): container finished" podID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerID="e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9" exitCode=137
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.053818 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.053856 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d609db91-e011-4ac4-91a6-a9ba51f3918e","Type":"ContainerDied","Data":"e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9"}
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.053922 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d609db91-e011-4ac4-91a6-a9ba51f3918e","Type":"ContainerDied","Data":"08fbe88aeb204ddd782e3073f280061d837a707d2c10f9b95b4eb6828823ed41"}
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.053948 4778 scope.go:117] "RemoveContainer" containerID="e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.105515 4778 scope.go:117] "RemoveContainer" containerID="19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.116469 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-config-data\") pod \"d609db91-e011-4ac4-91a6-a9ba51f3918e\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") "
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.116658 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-scripts\") pod \"d609db91-e011-4ac4-91a6-a9ba51f3918e\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") "
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.116734 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d609db91-e011-4ac4-91a6-a9ba51f3918e-run-httpd\") pod \"d609db91-e011-4ac4-91a6-a9ba51f3918e\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") "
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.117289 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-combined-ca-bundle\") pod \"d609db91-e011-4ac4-91a6-a9ba51f3918e\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") "
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.117367 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t77rr\" (UniqueName: \"kubernetes.io/projected/d609db91-e011-4ac4-91a6-a9ba51f3918e-kube-api-access-t77rr\") pod \"d609db91-e011-4ac4-91a6-a9ba51f3918e\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") "
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.117418 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-sg-core-conf-yaml\") pod \"d609db91-e011-4ac4-91a6-a9ba51f3918e\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") "
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.117456 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d609db91-e011-4ac4-91a6-a9ba51f3918e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d609db91-e011-4ac4-91a6-a9ba51f3918e" (UID: "d609db91-e011-4ac4-91a6-a9ba51f3918e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.117468 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d609db91-e011-4ac4-91a6-a9ba51f3918e-log-httpd\") pod \"d609db91-e011-4ac4-91a6-a9ba51f3918e\" (UID: \"d609db91-e011-4ac4-91a6-a9ba51f3918e\") "
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.118282 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d609db91-e011-4ac4-91a6-a9ba51f3918e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d609db91-e011-4ac4-91a6-a9ba51f3918e" (UID: "d609db91-e011-4ac4-91a6-a9ba51f3918e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.118326 4778 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d609db91-e011-4ac4-91a6-a9ba51f3918e-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.124466 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-scripts" (OuterVolumeSpecName: "scripts") pod "d609db91-e011-4ac4-91a6-a9ba51f3918e" (UID: "d609db91-e011-4ac4-91a6-a9ba51f3918e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.129932 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d609db91-e011-4ac4-91a6-a9ba51f3918e-kube-api-access-t77rr" (OuterVolumeSpecName: "kube-api-access-t77rr") pod "d609db91-e011-4ac4-91a6-a9ba51f3918e" (UID: "d609db91-e011-4ac4-91a6-a9ba51f3918e"). InnerVolumeSpecName "kube-api-access-t77rr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.130286 4778 scope.go:117] "RemoveContainer" containerID="d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.152916 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d609db91-e011-4ac4-91a6-a9ba51f3918e" (UID: "d609db91-e011-4ac4-91a6-a9ba51f3918e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.213775 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d609db91-e011-4ac4-91a6-a9ba51f3918e" (UID: "d609db91-e011-4ac4-91a6-a9ba51f3918e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.220510 4778 scope.go:117] "RemoveContainer" containerID="7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.220850 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.220881 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.220894 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t77rr\" (UniqueName: \"kubernetes.io/projected/d609db91-e011-4ac4-91a6-a9ba51f3918e-kube-api-access-t77rr\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.220904 4778 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.220916 4778 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d609db91-e011-4ac4-91a6-a9ba51f3918e-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.241727 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-config-data" (OuterVolumeSpecName: "config-data") pod "d609db91-e011-4ac4-91a6-a9ba51f3918e" (UID: "d609db91-e011-4ac4-91a6-a9ba51f3918e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.260688 4778 scope.go:117] "RemoveContainer" containerID="e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9"
Mar 18 09:24:27 crc kubenswrapper[4778]: E0318 09:24:27.261217 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9\": container with ID starting with e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9 not found: ID does not exist" containerID="e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.261313 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9"} err="failed to get container status \"e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9\": rpc error: code = NotFound desc = could not find container \"e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9\": container with ID starting with e800b4adae76623faed5a41717ca45a53efd658941b96af96baf8ea1395faeb9 not found: ID does not exist"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.261404 4778 scope.go:117] "RemoveContainer" containerID="19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4"
Mar 18 09:24:27 crc kubenswrapper[4778]: E0318 09:24:27.262022 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4\": container with ID starting with 19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4 not found: ID does not exist" containerID="19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.262087 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4"} err="failed to get container status \"19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4\": rpc error: code = NotFound desc = could not find container \"19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4\": container with ID starting with 19e48159f453a10a5bc7a5bc0a544f38338e355d3917ed942cd126923f35e2f4 not found: ID does not exist"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.262123 4778 scope.go:117] "RemoveContainer" containerID="d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.262322 4778 scope.go:117] "RemoveContainer" containerID="392df7bab826882632f84664710c42da5399ea661a1dbfcd15aec0ad5d248553"
Mar 18 09:24:27 crc kubenswrapper[4778]: E0318 09:24:27.262590 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053\": container with ID starting with d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053 not found: ID does not exist" containerID="d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.262646 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053"} err="failed to get container status \"d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053\": rpc error: code = NotFound desc = could not find container \"d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053\": container with ID starting with d248d3aec4de6a9e266d95877378f418b89371166e1ecbbbc9ee9fa7d9a3e053 not found: ID does not exist"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.262677 4778 scope.go:117] "RemoveContainer" containerID="7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372"
Mar 18 09:24:27 crc kubenswrapper[4778]: E0318 09:24:27.263107 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372\": container with ID starting with 7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372 not found: ID does not exist" containerID="7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.263189 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372"} err="failed to get container status \"7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372\": rpc error: code = NotFound desc = could not find container \"7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372\": container with ID starting with 7d8b2b44fc0045925889a25abff5d633edd32b3c5a330ff0f3838f2bea29e372 not found: ID does not exist"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.322849 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d609db91-e011-4ac4-91a6-a9ba51f3918e-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.405889 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.416837 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.433358 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 18 09:24:27 crc kubenswrapper[4778]: E0318 09:24:27.434111 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="ceilometer-notification-agent"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.434234 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="ceilometer-notification-agent"
Mar 18 09:24:27 crc kubenswrapper[4778]: E0318 09:24:27.434332 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="proxy-httpd"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.434410 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="proxy-httpd"
Mar 18 09:24:27 crc kubenswrapper[4778]: E0318 09:24:27.434487 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="ceilometer-central-agent"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.434536 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="ceilometer-central-agent"
Mar 18 09:24:27 crc kubenswrapper[4778]: E0318 09:24:27.434601 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="sg-core"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.434725 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="sg-core"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.434977 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="ceilometer-central-agent"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.435038 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="sg-core"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.435097 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="proxy-httpd"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.435161 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" containerName="ceilometer-notification-agent"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.436891 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.439943 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.440833 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.450326 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.629430 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-scripts\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.629712 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-config-data\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.629850 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1285385-097b-434c-8e95-dc27069185e1-run-httpd\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.630040 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.630170 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66lc2\" (UniqueName: \"kubernetes.io/projected/d1285385-097b-434c-8e95-dc27069185e1-kube-api-access-66lc2\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.630285 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1285385-097b-434c-8e95-dc27069185e1-log-httpd\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.630399 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.732429 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.732722 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66lc2\" (UniqueName: \"kubernetes.io/projected/d1285385-097b-434c-8e95-dc27069185e1-kube-api-access-66lc2\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.732826 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1285385-097b-434c-8e95-dc27069185e1-log-httpd\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.732895 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.733002 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-scripts\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.733072 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-config-data\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0"
Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.733153 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1285385-097b-434c-8e95-dc27069185e1-run-httpd\") pod
\"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.733221 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1285385-097b-434c-8e95-dc27069185e1-log-httpd\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.733503 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1285385-097b-434c-8e95-dc27069185e1-run-httpd\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.736504 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.736813 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.736881 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-scripts\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.738000 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-config-data\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.752217 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66lc2\" (UniqueName: \"kubernetes.io/projected/d1285385-097b-434c-8e95-dc27069185e1-kube-api-access-66lc2\") pod \"ceilometer-0\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " pod="openstack/ceilometer-0" Mar 18 09:24:27 crc kubenswrapper[4778]: I0318 09:24:27.773240 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:24:28 crc kubenswrapper[4778]: I0318 09:24:28.196111 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d609db91-e011-4ac4-91a6-a9ba51f3918e" path="/var/lib/kubelet/pods/d609db91-e011-4ac4-91a6-a9ba51f3918e/volumes" Mar 18 09:24:28 crc kubenswrapper[4778]: I0318 09:24:28.238963 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:24:28 crc kubenswrapper[4778]: I0318 09:24:28.798509 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.094937 4778 generic.go:334] "Generic (PLEG): container finished" podID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerID="d66d99949ad5120620bffa75ea4a3e49aee89967a841972aa06a4a9867cff673" exitCode=137 Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.095139 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-99c8bfc86-rldfg" event={"ID":"ea1f8a48-d595-4f4e-a740-0af5a26397a5","Type":"ContainerDied","Data":"d66d99949ad5120620bffa75ea4a3e49aee89967a841972aa06a4a9867cff673"} Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.115661 4778 generic.go:334] "Generic (PLEG): container finished" podID="7ef4b958-769d-43c2-91d4-3a6cb76d3851" 
containerID="c901fa502e83a3c43053baeb3fceb6b85a02d87dc36f812db5898d2cee7e9935" exitCode=137 Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.115722 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7ef4b958-769d-43c2-91d4-3a6cb76d3851","Type":"ContainerDied","Data":"c901fa502e83a3c43053baeb3fceb6b85a02d87dc36f812db5898d2cee7e9935"} Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.118909 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1285385-097b-434c-8e95-dc27069185e1","Type":"ContainerStarted","Data":"f276f5ce69a40ba1466b6ac8bed74eb71e6d22c683684e8887ac3793fd97f1ea"} Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.233462 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.362906 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ef4b958-769d-43c2-91d4-3a6cb76d3851-logs\") pod \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.363318 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-config-data\") pod \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.363359 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-config-data-custom\") pod \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.363416 4778 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-scripts\") pod \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.363438 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ef4b958-769d-43c2-91d4-3a6cb76d3851-etc-machine-id\") pod \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.363469 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-combined-ca-bundle\") pod \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.363499 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vxxv\" (UniqueName: \"kubernetes.io/projected/7ef4b958-769d-43c2-91d4-3a6cb76d3851-kube-api-access-7vxxv\") pod \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\" (UID: \"7ef4b958-769d-43c2-91d4-3a6cb76d3851\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.363771 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ef4b958-769d-43c2-91d4-3a6cb76d3851-logs" (OuterVolumeSpecName: "logs") pod "7ef4b958-769d-43c2-91d4-3a6cb76d3851" (UID: "7ef4b958-769d-43c2-91d4-3a6cb76d3851"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.365379 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ef4b958-769d-43c2-91d4-3a6cb76d3851-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7ef4b958-769d-43c2-91d4-3a6cb76d3851" (UID: "7ef4b958-769d-43c2-91d4-3a6cb76d3851"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.373095 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7ef4b958-769d-43c2-91d4-3a6cb76d3851" (UID: "7ef4b958-769d-43c2-91d4-3a6cb76d3851"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.373442 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ef4b958-769d-43c2-91d4-3a6cb76d3851-kube-api-access-7vxxv" (OuterVolumeSpecName: "kube-api-access-7vxxv") pod "7ef4b958-769d-43c2-91d4-3a6cb76d3851" (UID: "7ef4b958-769d-43c2-91d4-3a6cb76d3851"). InnerVolumeSpecName "kube-api-access-7vxxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.379235 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-scripts" (OuterVolumeSpecName: "scripts") pod "7ef4b958-769d-43c2-91d4-3a6cb76d3851" (UID: "7ef4b958-769d-43c2-91d4-3a6cb76d3851"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.402812 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ef4b958-769d-43c2-91d4-3a6cb76d3851" (UID: "7ef4b958-769d-43c2-91d4-3a6cb76d3851"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.437118 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-config-data" (OuterVolumeSpecName: "config-data") pod "7ef4b958-769d-43c2-91d4-3a6cb76d3851" (UID: "7ef4b958-769d-43c2-91d4-3a6cb76d3851"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.467064 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ef4b958-769d-43c2-91d4-3a6cb76d3851-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.467090 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.467103 4778 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.467114 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:29 crc kubenswrapper[4778]: 
I0318 09:24:29.467125 4778 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ef4b958-769d-43c2-91d4-3a6cb76d3851-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.467134 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef4b958-769d-43c2-91d4-3a6cb76d3851-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.467143 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vxxv\" (UniqueName: \"kubernetes.io/projected/7ef4b958-769d-43c2-91d4-3a6cb76d3851-kube-api-access-7vxxv\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.811493 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.994038 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpxvx\" (UniqueName: \"kubernetes.io/projected/ea1f8a48-d595-4f4e-a740-0af5a26397a5-kube-api-access-wpxvx\") pod \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.994216 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea1f8a48-d595-4f4e-a740-0af5a26397a5-logs\") pod \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.994286 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-combined-ca-bundle\") pod \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\" (UID: 
\"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.994387 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea1f8a48-d595-4f4e-a740-0af5a26397a5-scripts\") pod \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.994412 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea1f8a48-d595-4f4e-a740-0af5a26397a5-config-data\") pod \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.994469 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-horizon-tls-certs\") pod \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.994497 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-horizon-secret-key\") pod \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\" (UID: \"ea1f8a48-d595-4f4e-a740-0af5a26397a5\") " Mar 18 09:24:29 crc kubenswrapper[4778]: I0318 09:24:29.994966 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea1f8a48-d595-4f4e-a740-0af5a26397a5-logs" (OuterVolumeSpecName: "logs") pod "ea1f8a48-d595-4f4e-a740-0af5a26397a5" (UID: "ea1f8a48-d595-4f4e-a740-0af5a26397a5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.000415 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ea1f8a48-d595-4f4e-a740-0af5a26397a5" (UID: "ea1f8a48-d595-4f4e-a740-0af5a26397a5"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.002791 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea1f8a48-d595-4f4e-a740-0af5a26397a5-kube-api-access-wpxvx" (OuterVolumeSpecName: "kube-api-access-wpxvx") pod "ea1f8a48-d595-4f4e-a740-0af5a26397a5" (UID: "ea1f8a48-d595-4f4e-a740-0af5a26397a5"). InnerVolumeSpecName "kube-api-access-wpxvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.020794 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea1f8a48-d595-4f4e-a740-0af5a26397a5-scripts" (OuterVolumeSpecName: "scripts") pod "ea1f8a48-d595-4f4e-a740-0af5a26397a5" (UID: "ea1f8a48-d595-4f4e-a740-0af5a26397a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.077956 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea1f8a48-d595-4f4e-a740-0af5a26397a5-config-data" (OuterVolumeSpecName: "config-data") pod "ea1f8a48-d595-4f4e-a740-0af5a26397a5" (UID: "ea1f8a48-d595-4f4e-a740-0af5a26397a5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.080702 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea1f8a48-d595-4f4e-a740-0af5a26397a5" (UID: "ea1f8a48-d595-4f4e-a740-0af5a26397a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.098830 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea1f8a48-d595-4f4e-a740-0af5a26397a5-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.098876 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea1f8a48-d595-4f4e-a740-0af5a26397a5-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.098891 4778 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.098907 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpxvx\" (UniqueName: \"kubernetes.io/projected/ea1f8a48-d595-4f4e-a740-0af5a26397a5-kube-api-access-wpxvx\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.098919 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea1f8a48-d595-4f4e-a740-0af5a26397a5-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.098928 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.124543 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "ea1f8a48-d595-4f4e-a740-0af5a26397a5" (UID: "ea1f8a48-d595-4f4e-a740-0af5a26397a5"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.158189 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1285385-097b-434c-8e95-dc27069185e1","Type":"ContainerStarted","Data":"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e"} Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.168278 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-99c8bfc86-rldfg" event={"ID":"ea1f8a48-d595-4f4e-a740-0af5a26397a5","Type":"ContainerDied","Data":"c77fd4278a90c239273c01a79ef12824477ebf4a1fc89be85a96364b2e982560"} Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.168335 4778 scope.go:117] "RemoveContainer" containerID="c97b0574c01538fa1c2084f0fe2463a0e870b56db44f51730acba68dbd0ebf25" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.168433 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-99c8bfc86-rldfg" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.177866 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7ef4b958-769d-43c2-91d4-3a6cb76d3851","Type":"ContainerDied","Data":"158c8e2542c97971367812456984ca4e3f98182f67d2f9c5b6c2354ec14b4a85"} Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.177974 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.200422 4778 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea1f8a48-d595-4f4e-a740-0af5a26397a5-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.274030 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-99c8bfc86-rldfg"] Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.305880 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-99c8bfc86-rldfg"] Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.325676 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.339413 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.349331 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 09:24:30 crc kubenswrapper[4778]: E0318 09:24:30.349873 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef4b958-769d-43c2-91d4-3a6cb76d3851" containerName="cinder-api-log" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.349889 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef4b958-769d-43c2-91d4-3a6cb76d3851" containerName="cinder-api-log" Mar 18 09:24:30 crc kubenswrapper[4778]: E0318 09:24:30.349914 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerName="horizon" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.349921 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerName="horizon" Mar 18 09:24:30 crc kubenswrapper[4778]: E0318 09:24:30.349930 4778 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerName="horizon-log" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.349937 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerName="horizon-log" Mar 18 09:24:30 crc kubenswrapper[4778]: E0318 09:24:30.349950 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef4b958-769d-43c2-91d4-3a6cb76d3851" containerName="cinder-api" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.349956 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef4b958-769d-43c2-91d4-3a6cb76d3851" containerName="cinder-api" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.350165 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ef4b958-769d-43c2-91d4-3a6cb76d3851" containerName="cinder-api" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.350178 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerName="horizon-log" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.350213 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ef4b958-769d-43c2-91d4-3a6cb76d3851" containerName="cinder-api-log" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.350227 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" containerName="horizon" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.351322 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.356415 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.356567 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.357218 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.363226 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.388529 4778 scope.go:117] "RemoveContainer" containerID="d66d99949ad5120620bffa75ea4a3e49aee89967a841972aa06a4a9867cff673" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.410986 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-etc-machine-id\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.411043 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-config-data\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.411122 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-logs\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 
crc kubenswrapper[4778]: I0318 09:24:30.411154 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-config-data-custom\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.411206 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wmzv\" (UniqueName: \"kubernetes.io/projected/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-kube-api-access-7wmzv\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.411241 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.411268 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-scripts\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.411287 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-public-tls-certs\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.411308 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.430321 4778 scope.go:117] "RemoveContainer" containerID="c901fa502e83a3c43053baeb3fceb6b85a02d87dc36f812db5898d2cee7e9935" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.463356 4778 scope.go:117] "RemoveContainer" containerID="825b48041531c5b0c01f329609e8aff59582830809964a4de416fba30a539f48" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.512309 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-config-data-custom\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.512394 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wmzv\" (UniqueName: \"kubernetes.io/projected/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-kube-api-access-7wmzv\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.512439 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.512473 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-scripts\") pod 
\"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.512495 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-public-tls-certs\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.512520 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.513240 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-etc-machine-id\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.513268 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-config-data\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.513431 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-etc-machine-id\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.514082 4778 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-logs\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.514388 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-logs\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.517010 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-public-tls-certs\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.517431 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.517599 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-scripts\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.519568 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.519619 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-config-data-custom\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.522243 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-config-data\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.534877 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wmzv\" (UniqueName: \"kubernetes.io/projected/924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51-kube-api-access-7wmzv\") pod \"cinder-api-0\" (UID: \"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51\") " pod="openstack/cinder-api-0" Mar 18 09:24:30 crc kubenswrapper[4778]: I0318 09:24:30.677893 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.128568 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.204097 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1285385-097b-434c-8e95-dc27069185e1","Type":"ContainerStarted","Data":"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46"} Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.223157 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.225841 4778 generic.go:334] "Generic (PLEG): container finished" podID="3d1d399f-3c89-4aaa-bba1-ce1a91358455" containerID="5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827" exitCode=0 Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.225882 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cffc84f44-vtx7x" event={"ID":"3d1d399f-3c89-4aaa-bba1-ce1a91358455","Type":"ContainerDied","Data":"5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827"} Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.225911 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cffc84f44-vtx7x" event={"ID":"3d1d399f-3c89-4aaa-bba1-ce1a91358455","Type":"ContainerDied","Data":"e444d25da091179b3622d7408ec0b6e7caa7c81b27414dca1d4252c8b3fb5441"} Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.225928 4778 scope.go:117] "RemoveContainer" containerID="6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3" Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.226050 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-cffc84f44-vtx7x" Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.231863 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-config\") pod \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.231917 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-httpd-config\") pod \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.231976 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnk27\" (UniqueName: \"kubernetes.io/projected/3d1d399f-3c89-4aaa-bba1-ce1a91358455-kube-api-access-qnk27\") pod \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.232003 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-combined-ca-bundle\") pod \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.232149 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-ovndb-tls-certs\") pod \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\" (UID: \"3d1d399f-3c89-4aaa-bba1-ce1a91358455\") " Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.238319 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "3d1d399f-3c89-4aaa-bba1-ce1a91358455" (UID: "3d1d399f-3c89-4aaa-bba1-ce1a91358455"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.245358 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d1d399f-3c89-4aaa-bba1-ce1a91358455-kube-api-access-qnk27" (OuterVolumeSpecName: "kube-api-access-qnk27") pod "3d1d399f-3c89-4aaa-bba1-ce1a91358455" (UID: "3d1d399f-3c89-4aaa-bba1-ce1a91358455"). InnerVolumeSpecName "kube-api-access-qnk27". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.299987 4778 scope.go:117] "RemoveContainer" containerID="5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827" Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.334508 4778 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.334568 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnk27\" (UniqueName: \"kubernetes.io/projected/3d1d399f-3c89-4aaa-bba1-ce1a91358455-kube-api-access-qnk27\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.338046 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-config" (OuterVolumeSpecName: "config") pod "3d1d399f-3c89-4aaa-bba1-ce1a91358455" (UID: "3d1d399f-3c89-4aaa-bba1-ce1a91358455"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.363850 4778 scope.go:117] "RemoveContainer" containerID="6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3" Mar 18 09:24:31 crc kubenswrapper[4778]: E0318 09:24:31.364886 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3\": container with ID starting with 6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3 not found: ID does not exist" containerID="6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3" Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.364930 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3"} err="failed to get container status \"6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3\": rpc error: code = NotFound desc = could not find container \"6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3\": container with ID starting with 6961f9b4f171ec7ff88f12c033e19d8ffc73764e840a08b184d683f4cf1cfab3 not found: ID does not exist" Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.364962 4778 scope.go:117] "RemoveContainer" containerID="5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827" Mar 18 09:24:31 crc kubenswrapper[4778]: E0318 09:24:31.365406 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827\": container with ID starting with 5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827 not found: ID does not exist" containerID="5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827" Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.365440 
4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827"} err="failed to get container status \"5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827\": rpc error: code = NotFound desc = could not find container \"5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827\": container with ID starting with 5a231d6ea0c8dcd4e5d2466aa09d9a7722c5c48772262052edfae0c719509827 not found: ID does not exist" Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.369555 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d1d399f-3c89-4aaa-bba1-ce1a91358455" (UID: "3d1d399f-3c89-4aaa-bba1-ce1a91358455"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.389387 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "3d1d399f-3c89-4aaa-bba1-ce1a91358455" (UID: "3d1d399f-3c89-4aaa-bba1-ce1a91358455"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.436466 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.436516 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.436531 4778 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d1d399f-3c89-4aaa-bba1-ce1a91358455-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.562446 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-cffc84f44-vtx7x"] Mar 18 09:24:31 crc kubenswrapper[4778]: I0318 09:24:31.570997 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-cffc84f44-vtx7x"] Mar 18 09:24:32 crc kubenswrapper[4778]: I0318 09:24:32.198239 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d1d399f-3c89-4aaa-bba1-ce1a91358455" path="/var/lib/kubelet/pods/3d1d399f-3c89-4aaa-bba1-ce1a91358455/volumes" Mar 18 09:24:32 crc kubenswrapper[4778]: I0318 09:24:32.199326 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ef4b958-769d-43c2-91d4-3a6cb76d3851" path="/var/lib/kubelet/pods/7ef4b958-769d-43c2-91d4-3a6cb76d3851/volumes" Mar 18 09:24:32 crc kubenswrapper[4778]: I0318 09:24:32.199987 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea1f8a48-d595-4f4e-a740-0af5a26397a5" path="/var/lib/kubelet/pods/ea1f8a48-d595-4f4e-a740-0af5a26397a5/volumes" Mar 18 09:24:32 crc kubenswrapper[4778]: I0318 09:24:32.237982 4778 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1285385-097b-434c-8e95-dc27069185e1","Type":"ContainerStarted","Data":"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8"} Mar 18 09:24:32 crc kubenswrapper[4778]: I0318 09:24:32.241720 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51","Type":"ContainerStarted","Data":"f62b79dc1c13ddbdcdf1ff9979e4795dab48a8b05a7a40c92da85226bd3dfb12"} Mar 18 09:24:32 crc kubenswrapper[4778]: I0318 09:24:32.241764 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51","Type":"ContainerStarted","Data":"7ad59e74d5727c6448c429922f9615f73c95156d0fab49db22041f221c72e6d7"} Mar 18 09:24:33 crc kubenswrapper[4778]: I0318 09:24:33.265620 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51","Type":"ContainerStarted","Data":"cf14eeba4375e3a27d3011a3de7d5dc3a5ef6779728ba976290726eeff2d4ccd"} Mar 18 09:24:33 crc kubenswrapper[4778]: I0318 09:24:33.266386 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 18 09:24:34 crc kubenswrapper[4778]: I0318 09:24:34.233693 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.233625846 podStartE2EDuration="4.233625846s" podCreationTimestamp="2026-03-18 09:24:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:24:33.295937015 +0000 UTC m=+1339.870682055" watchObservedRunningTime="2026-03-18 09:24:34.233625846 +0000 UTC m=+1340.808370696" Mar 18 09:24:34 crc kubenswrapper[4778]: I0318 09:24:34.296723 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="ceilometer-central-agent" containerID="cri-o://1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e" gracePeriod=30 Mar 18 09:24:34 crc kubenswrapper[4778]: I0318 09:24:34.297115 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1285385-097b-434c-8e95-dc27069185e1","Type":"ContainerStarted","Data":"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64"} Mar 18 09:24:34 crc kubenswrapper[4778]: I0318 09:24:34.297185 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 09:24:34 crc kubenswrapper[4778]: I0318 09:24:34.297645 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="proxy-httpd" containerID="cri-o://1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64" gracePeriod=30 Mar 18 09:24:34 crc kubenswrapper[4778]: I0318 09:24:34.297708 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="sg-core" containerID="cri-o://d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8" gracePeriod=30 Mar 18 09:24:34 crc kubenswrapper[4778]: I0318 09:24:34.297764 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="ceilometer-notification-agent" containerID="cri-o://98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46" gracePeriod=30 Mar 18 09:24:34 crc kubenswrapper[4778]: I0318 09:24:34.331037 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.42741177 podStartE2EDuration="7.331007079s" podCreationTimestamp="2026-03-18 09:24:27 +0000 UTC" 
firstStartedPulling="2026-03-18 09:24:28.248808565 +0000 UTC m=+1334.823553405" lastFinishedPulling="2026-03-18 09:24:33.152403874 +0000 UTC m=+1339.727148714" observedRunningTime="2026-03-18 09:24:34.322263801 +0000 UTC m=+1340.897008661" watchObservedRunningTime="2026-03-18 09:24:34.331007079 +0000 UTC m=+1340.905751959" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.147670 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.307747 4778 generic.go:334] "Generic (PLEG): container finished" podID="d1285385-097b-434c-8e95-dc27069185e1" containerID="1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64" exitCode=0 Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.307777 4778 generic.go:334] "Generic (PLEG): container finished" podID="d1285385-097b-434c-8e95-dc27069185e1" containerID="d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8" exitCode=2 Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.307788 4778 generic.go:334] "Generic (PLEG): container finished" podID="d1285385-097b-434c-8e95-dc27069185e1" containerID="98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46" exitCode=0 Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.307796 4778 generic.go:334] "Generic (PLEG): container finished" podID="d1285385-097b-434c-8e95-dc27069185e1" containerID="1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e" exitCode=0 Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.307790 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1285385-097b-434c-8e95-dc27069185e1","Type":"ContainerDied","Data":"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64"} Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.307839 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d1285385-097b-434c-8e95-dc27069185e1","Type":"ContainerDied","Data":"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8"} Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.307850 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1285385-097b-434c-8e95-dc27069185e1","Type":"ContainerDied","Data":"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46"} Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.307846 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.307871 4778 scope.go:117] "RemoveContainer" containerID="1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.307860 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1285385-097b-434c-8e95-dc27069185e1","Type":"ContainerDied","Data":"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e"} Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.307987 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1285385-097b-434c-8e95-dc27069185e1","Type":"ContainerDied","Data":"f276f5ce69a40ba1466b6ac8bed74eb71e6d22c683684e8887ac3793fd97f1ea"} Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.311920 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66lc2\" (UniqueName: \"kubernetes.io/projected/d1285385-097b-434c-8e95-dc27069185e1-kube-api-access-66lc2\") pod \"d1285385-097b-434c-8e95-dc27069185e1\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.312013 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d1285385-097b-434c-8e95-dc27069185e1-run-httpd\") pod \"d1285385-097b-434c-8e95-dc27069185e1\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.312045 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-sg-core-conf-yaml\") pod \"d1285385-097b-434c-8e95-dc27069185e1\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.312155 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-config-data\") pod \"d1285385-097b-434c-8e95-dc27069185e1\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.312256 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-scripts\") pod \"d1285385-097b-434c-8e95-dc27069185e1\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.312289 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1285385-097b-434c-8e95-dc27069185e1-log-httpd\") pod \"d1285385-097b-434c-8e95-dc27069185e1\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.312322 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-combined-ca-bundle\") pod \"d1285385-097b-434c-8e95-dc27069185e1\" (UID: \"d1285385-097b-434c-8e95-dc27069185e1\") " Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.312616 4778 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1285385-097b-434c-8e95-dc27069185e1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d1285385-097b-434c-8e95-dc27069185e1" (UID: "d1285385-097b-434c-8e95-dc27069185e1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.312648 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1285385-097b-434c-8e95-dc27069185e1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d1285385-097b-434c-8e95-dc27069185e1" (UID: "d1285385-097b-434c-8e95-dc27069185e1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.313763 4778 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1285385-097b-434c-8e95-dc27069185e1-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.313947 4778 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1285385-097b-434c-8e95-dc27069185e1-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.316967 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1285385-097b-434c-8e95-dc27069185e1-kube-api-access-66lc2" (OuterVolumeSpecName: "kube-api-access-66lc2") pod "d1285385-097b-434c-8e95-dc27069185e1" (UID: "d1285385-097b-434c-8e95-dc27069185e1"). InnerVolumeSpecName "kube-api-access-66lc2". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.317359 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-scripts" (OuterVolumeSpecName: "scripts") pod "d1285385-097b-434c-8e95-dc27069185e1" (UID: "d1285385-097b-434c-8e95-dc27069185e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.328886 4778 scope.go:117] "RemoveContainer" containerID="d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.336940 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d1285385-097b-434c-8e95-dc27069185e1" (UID: "d1285385-097b-434c-8e95-dc27069185e1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.349365 4778 scope.go:117] "RemoveContainer" containerID="98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.379018 4778 scope.go:117] "RemoveContainer" containerID="1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.387631 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1285385-097b-434c-8e95-dc27069185e1" (UID: "d1285385-097b-434c-8e95-dc27069185e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.408955 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-config-data" (OuterVolumeSpecName: "config-data") pod "d1285385-097b-434c-8e95-dc27069185e1" (UID: "d1285385-097b-434c-8e95-dc27069185e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.411465 4778 scope.go:117] "RemoveContainer" containerID="1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64"
Mar 18 09:24:35 crc kubenswrapper[4778]: E0318 09:24:35.412150 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64\": container with ID starting with 1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64 not found: ID does not exist" containerID="1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.412273 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64"} err="failed to get container status \"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64\": rpc error: code = NotFound desc = could not find container \"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64\": container with ID starting with 1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64 not found: ID does not exist"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.412318 4778 scope.go:117] "RemoveContainer" containerID="d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8"
Mar 18 09:24:35 crc kubenswrapper[4778]: E0318 09:24:35.412844 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8\": container with ID starting with d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8 not found: ID does not exist" containerID="d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.412884 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8"} err="failed to get container status \"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8\": rpc error: code = NotFound desc = could not find container \"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8\": container with ID starting with d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8 not found: ID does not exist"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.412906 4778 scope.go:117] "RemoveContainer" containerID="98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46"
Mar 18 09:24:35 crc kubenswrapper[4778]: E0318 09:24:35.413400 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46\": container with ID starting with 98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46 not found: ID does not exist" containerID="98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.413447 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46"} err="failed to get container status \"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46\": rpc error: code = NotFound desc = could not find container \"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46\": container with ID starting with 98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46 not found: ID does not exist"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.413471 4778 scope.go:117] "RemoveContainer" containerID="1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e"
Mar 18 09:24:35 crc kubenswrapper[4778]: E0318 09:24:35.413904 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e\": container with ID starting with 1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e not found: ID does not exist" containerID="1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.413934 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e"} err="failed to get container status \"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e\": rpc error: code = NotFound desc = could not find container \"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e\": container with ID starting with 1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e not found: ID does not exist"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.413954 4778 scope.go:117] "RemoveContainer" containerID="1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.414497 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64"} err="failed to get container status \"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64\": rpc error: code = NotFound desc = could not find container \"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64\": container with ID starting with 1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64 not found: ID does not exist"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.414549 4778 scope.go:117] "RemoveContainer" containerID="d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.414895 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8"} err="failed to get container status \"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8\": rpc error: code = NotFound desc = could not find container \"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8\": container with ID starting with d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8 not found: ID does not exist"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.414919 4778 scope.go:117] "RemoveContainer" containerID="98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.415287 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46"} err="failed to get container status \"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46\": rpc error: code = NotFound desc = could not find container \"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46\": container with ID starting with 98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46 not found: ID does not exist"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.415391 4778 scope.go:117] "RemoveContainer" containerID="1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.415818 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e"} err="failed to get container status \"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e\": rpc error: code = NotFound desc = could not find container \"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e\": container with ID starting with 1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e not found: ID does not exist"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.415841 4778 scope.go:117] "RemoveContainer" containerID="1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.415855 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.415887 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66lc2\" (UniqueName: \"kubernetes.io/projected/d1285385-097b-434c-8e95-dc27069185e1-kube-api-access-66lc2\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.416107 4778 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.416123 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.416135 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1285385-097b-434c-8e95-dc27069185e1-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.416316 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64"} err="failed to get container status \"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64\": rpc error: code = NotFound desc = could not find container \"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64\": container with ID starting with 1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64 not found: ID does not exist"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.416336 4778 scope.go:117] "RemoveContainer" containerID="d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.416679 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8"} err="failed to get container status \"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8\": rpc error: code = NotFound desc = could not find container \"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8\": container with ID starting with d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8 not found: ID does not exist"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.416698 4778 scope.go:117] "RemoveContainer" containerID="98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.417019 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46"} err="failed to get container status \"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46\": rpc error: code = NotFound desc = could not find container \"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46\": container with ID starting with 98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46 not found: ID does not exist"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.417038 4778 scope.go:117] "RemoveContainer" containerID="1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.417333 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e"} err="failed to get container status \"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e\": rpc error: code = NotFound desc = could not find container \"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e\": container with ID starting with 1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e not found: ID does not exist"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.417351 4778 scope.go:117] "RemoveContainer" containerID="1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.417655 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64"} err="failed to get container status \"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64\": rpc error: code = NotFound desc = could not find container \"1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64\": container with ID starting with 1ccf034c6a21a7747fcaacc2f79e55d7a08e8d7ef35b5ff4e2cddd1eae5f2d64 not found: ID does not exist"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.417675 4778 scope.go:117] "RemoveContainer" containerID="d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.417981 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8"} err="failed to get container status \"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8\": rpc error: code = NotFound desc = could not find container \"d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8\": container with ID starting with d7227aa35eab95bdc509ce358d3a783f4bceffb5f6ff72421e46707da6ab0da8 not found: ID does not exist"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.418000 4778 scope.go:117] "RemoveContainer" containerID="98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.418369 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46"} err="failed to get container status \"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46\": rpc error: code = NotFound desc = could not find container \"98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46\": container with ID starting with 98171290e2b3b99388647847ed4509a2b11f643f1cf00f8c64958fa23bd47b46 not found: ID does not exist"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.418414 4778 scope.go:117] "RemoveContainer" containerID="1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.418756 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e"} err="failed to get container status \"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e\": rpc error: code = NotFound desc = could not find container \"1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e\": container with ID starting with 1d9fc9f682325c8351620d3e791c32910b810d023603b8a3795e384e7983773e not found: ID does not exist"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.640064 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.649269 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.678309 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 18 09:24:35 crc kubenswrapper[4778]: E0318 09:24:35.678753 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="sg-core"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.678774 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="sg-core"
Mar 18 09:24:35 crc kubenswrapper[4778]: E0318 09:24:35.678804 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="ceilometer-notification-agent"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.678812 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="ceilometer-notification-agent"
Mar 18 09:24:35 crc kubenswrapper[4778]: E0318 09:24:35.678830 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="proxy-httpd"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.678838 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="proxy-httpd"
Mar 18 09:24:35 crc kubenswrapper[4778]: E0318 09:24:35.678853 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="ceilometer-central-agent"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.678861 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="ceilometer-central-agent"
Mar 18 09:24:35 crc kubenswrapper[4778]: E0318 09:24:35.678878 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d1d399f-3c89-4aaa-bba1-ce1a91358455" containerName="neutron-httpd"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.678886 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d1d399f-3c89-4aaa-bba1-ce1a91358455" containerName="neutron-httpd"
Mar 18 09:24:35 crc kubenswrapper[4778]: E0318 09:24:35.678909 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d1d399f-3c89-4aaa-bba1-ce1a91358455" containerName="neutron-api"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.678917 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d1d399f-3c89-4aaa-bba1-ce1a91358455" containerName="neutron-api"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.679113 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="ceilometer-notification-agent"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.679130 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d1d399f-3c89-4aaa-bba1-ce1a91358455" containerName="neutron-api"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.679145 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="sg-core"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.679160 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d1d399f-3c89-4aaa-bba1-ce1a91358455" containerName="neutron-httpd"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.679174 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="ceilometer-central-agent"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.679191 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1285385-097b-434c-8e95-dc27069185e1" containerName="proxy-httpd"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.681702 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.684802 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.688354 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.702854 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.824073 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-config-data\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.824234 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-scripts\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.824268 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.824301 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbbb56d-564a-45df-b50c-7bc4ba290812-log-httpd\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.824325 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbbb56d-564a-45df-b50c-7bc4ba290812-run-httpd\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.824350 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.824376 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bstbn\" (UniqueName: \"kubernetes.io/projected/7bbbb56d-564a-45df-b50c-7bc4ba290812-kube-api-access-bstbn\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.926533 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-config-data\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.926703 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-scripts\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.926764 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.927376 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbbb56d-564a-45df-b50c-7bc4ba290812-log-httpd\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.927410 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbbb56d-564a-45df-b50c-7bc4ba290812-run-httpd\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.927429 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.927444 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bstbn\" (UniqueName: \"kubernetes.io/projected/7bbbb56d-564a-45df-b50c-7bc4ba290812-kube-api-access-bstbn\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.927713 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbbb56d-564a-45df-b50c-7bc4ba290812-run-httpd\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.927877 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbbb56d-564a-45df-b50c-7bc4ba290812-log-httpd\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.931101 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.932905 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.935021 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-scripts\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.947349 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-config-data\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0"
Mar 18 09:24:35 crc kubenswrapper[4778]: I0318 09:24:35.952488 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bstbn\" (UniqueName: \"kubernetes.io/projected/7bbbb56d-564a-45df-b50c-7bc4ba290812-kube-api-access-bstbn\") pod \"ceilometer-0\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") " pod="openstack/ceilometer-0"
Mar 18 09:24:36 crc kubenswrapper[4778]: I0318 09:24:36.025729 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 09:24:36 crc kubenswrapper[4778]: I0318 09:24:36.212706 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1285385-097b-434c-8e95-dc27069185e1" path="/var/lib/kubelet/pods/d1285385-097b-434c-8e95-dc27069185e1/volumes"
Mar 18 09:24:36 crc kubenswrapper[4778]: I0318 09:24:36.486097 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 09:24:37 crc kubenswrapper[4778]: I0318 09:24:37.337857 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bbbb56d-564a-45df-b50c-7bc4ba290812","Type":"ContainerStarted","Data":"4c0e3af08cf9b31b7b8c65fe63263681dd011db44c530c1549fcf2d8b00c430f"}
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.115469 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-nl2dg"]
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.116820 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-nl2dg"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.123857 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-nl2dg"]
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.221597 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-9hlqk"]
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.223172 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9hlqk"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.237067 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9hlqk"]
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.273099 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjp78\" (UniqueName: \"kubernetes.io/projected/8341ceba-13e0-410f-a7d2-23190a07d914-kube-api-access-xjp78\") pod \"nova-api-db-create-nl2dg\" (UID: \"8341ceba-13e0-410f-a7d2-23190a07d914\") " pod="openstack/nova-api-db-create-nl2dg"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.273379 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8341ceba-13e0-410f-a7d2-23190a07d914-operator-scripts\") pod \"nova-api-db-create-nl2dg\" (UID: \"8341ceba-13e0-410f-a7d2-23190a07d914\") " pod="openstack/nova-api-db-create-nl2dg"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.303916 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-t5x58"]
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.304938 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-t5x58"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.314382 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-t5x58"]
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.326242 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-922c-account-create-update-6z2xf"]
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.327388 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-922c-account-create-update-6z2xf"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.329040 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.336576 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-922c-account-create-update-6z2xf"]
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.376422 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjp78\" (UniqueName: \"kubernetes.io/projected/8341ceba-13e0-410f-a7d2-23190a07d914-kube-api-access-xjp78\") pod \"nova-api-db-create-nl2dg\" (UID: \"8341ceba-13e0-410f-a7d2-23190a07d914\") " pod="openstack/nova-api-db-create-nl2dg"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.376556 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8341ceba-13e0-410f-a7d2-23190a07d914-operator-scripts\") pod \"nova-api-db-create-nl2dg\" (UID: \"8341ceba-13e0-410f-a7d2-23190a07d914\") " pod="openstack/nova-api-db-create-nl2dg"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.376625 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ccbt\" (UniqueName: \"kubernetes.io/projected/92444732-2d3e-4065-a336-74b37b711530-kube-api-access-5ccbt\") pod \"nova-cell0-db-create-9hlqk\" (UID: \"92444732-2d3e-4065-a336-74b37b711530\") " pod="openstack/nova-cell0-db-create-9hlqk"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.376660 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92444732-2d3e-4065-a336-74b37b711530-operator-scripts\") pod \"nova-cell0-db-create-9hlqk\" (UID: \"92444732-2d3e-4065-a336-74b37b711530\") " pod="openstack/nova-cell0-db-create-9hlqk"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.379932 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8341ceba-13e0-410f-a7d2-23190a07d914-operator-scripts\") pod \"nova-api-db-create-nl2dg\" (UID: \"8341ceba-13e0-410f-a7d2-23190a07d914\") " pod="openstack/nova-api-db-create-nl2dg"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.402097 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjp78\" (UniqueName: \"kubernetes.io/projected/8341ceba-13e0-410f-a7d2-23190a07d914-kube-api-access-xjp78\") pod \"nova-api-db-create-nl2dg\" (UID: \"8341ceba-13e0-410f-a7d2-23190a07d914\") " pod="openstack/nova-api-db-create-nl2dg"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.441070 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-nl2dg"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.478735 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e-operator-scripts\") pod \"nova-api-922c-account-create-update-6z2xf\" (UID: \"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e\") " pod="openstack/nova-api-922c-account-create-update-6z2xf"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.478823 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b380dfb3-b55b-4db2-bd8f-a90b4470345d-operator-scripts\") pod \"nova-cell1-db-create-t5x58\" (UID: \"b380dfb3-b55b-4db2-bd8f-a90b4470345d\") " pod="openstack/nova-cell1-db-create-t5x58"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.478859 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xcpw\" (UniqueName: \"kubernetes.io/projected/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e-kube-api-access-4xcpw\") pod \"nova-api-922c-account-create-update-6z2xf\" (UID: \"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e\") " pod="openstack/nova-api-922c-account-create-update-6z2xf"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.478904 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t86pm\" (UniqueName: \"kubernetes.io/projected/b380dfb3-b55b-4db2-bd8f-a90b4470345d-kube-api-access-t86pm\") pod \"nova-cell1-db-create-t5x58\" (UID: \"b380dfb3-b55b-4db2-bd8f-a90b4470345d\") " pod="openstack/nova-cell1-db-create-t5x58"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.478933 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ccbt\" (UniqueName: \"kubernetes.io/projected/92444732-2d3e-4065-a336-74b37b711530-kube-api-access-5ccbt\") pod \"nova-cell0-db-create-9hlqk\" (UID: \"92444732-2d3e-4065-a336-74b37b711530\") " pod="openstack/nova-cell0-db-create-9hlqk"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.478961 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92444732-2d3e-4065-a336-74b37b711530-operator-scripts\") pod \"nova-cell0-db-create-9hlqk\" (UID: \"92444732-2d3e-4065-a336-74b37b711530\") " pod="openstack/nova-cell0-db-create-9hlqk"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.479832 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92444732-2d3e-4065-a336-74b37b711530-operator-scripts\") pod \"nova-cell0-db-create-9hlqk\" (UID: \"92444732-2d3e-4065-a336-74b37b711530\") " pod="openstack/nova-cell0-db-create-9hlqk"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.498256 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ccbt\" (UniqueName: \"kubernetes.io/projected/92444732-2d3e-4065-a336-74b37b711530-kube-api-access-5ccbt\") pod \"nova-cell0-db-create-9hlqk\" (UID: \"92444732-2d3e-4065-a336-74b37b711530\") " pod="openstack/nova-cell0-db-create-9hlqk"
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.519639 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-722f-account-create-update-slwd5"]
Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.520644 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-722f-account-create-update-slwd5" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.522757 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.539330 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9hlqk" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.546145 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-722f-account-create-update-slwd5"] Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.601270 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e-operator-scripts\") pod \"nova-api-922c-account-create-update-6z2xf\" (UID: \"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e\") " pod="openstack/nova-api-922c-account-create-update-6z2xf" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.601525 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b380dfb3-b55b-4db2-bd8f-a90b4470345d-operator-scripts\") pod \"nova-cell1-db-create-t5x58\" (UID: \"b380dfb3-b55b-4db2-bd8f-a90b4470345d\") " pod="openstack/nova-cell1-db-create-t5x58" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.601572 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xcpw\" (UniqueName: \"kubernetes.io/projected/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e-kube-api-access-4xcpw\") pod \"nova-api-922c-account-create-update-6z2xf\" (UID: \"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e\") " pod="openstack/nova-api-922c-account-create-update-6z2xf" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.601640 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-t86pm\" (UniqueName: \"kubernetes.io/projected/b380dfb3-b55b-4db2-bd8f-a90b4470345d-kube-api-access-t86pm\") pod \"nova-cell1-db-create-t5x58\" (UID: \"b380dfb3-b55b-4db2-bd8f-a90b4470345d\") " pod="openstack/nova-cell1-db-create-t5x58" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.604058 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b380dfb3-b55b-4db2-bd8f-a90b4470345d-operator-scripts\") pod \"nova-cell1-db-create-t5x58\" (UID: \"b380dfb3-b55b-4db2-bd8f-a90b4470345d\") " pod="openstack/nova-cell1-db-create-t5x58" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.633932 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e-operator-scripts\") pod \"nova-api-922c-account-create-update-6z2xf\" (UID: \"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e\") " pod="openstack/nova-api-922c-account-create-update-6z2xf" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.637572 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xcpw\" (UniqueName: \"kubernetes.io/projected/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e-kube-api-access-4xcpw\") pod \"nova-api-922c-account-create-update-6z2xf\" (UID: \"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e\") " pod="openstack/nova-api-922c-account-create-update-6z2xf" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.642572 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t86pm\" (UniqueName: \"kubernetes.io/projected/b380dfb3-b55b-4db2-bd8f-a90b4470345d-kube-api-access-t86pm\") pod \"nova-cell1-db-create-t5x58\" (UID: \"b380dfb3-b55b-4db2-bd8f-a90b4470345d\") " pod="openstack/nova-cell1-db-create-t5x58" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.649773 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-922c-account-create-update-6z2xf" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.671515 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-b9f9-account-create-update-w7q2n"] Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.672943 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.676328 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.692730 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b9f9-account-create-update-w7q2n"] Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.704084 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f06b776-36bc-45ba-88d4-69608f9665e6-operator-scripts\") pod \"nova-cell0-722f-account-create-update-slwd5\" (UID: \"2f06b776-36bc-45ba-88d4-69608f9665e6\") " pod="openstack/nova-cell0-722f-account-create-update-slwd5" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.704177 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7plt5\" (UniqueName: \"kubernetes.io/projected/2f06b776-36bc-45ba-88d4-69608f9665e6-kube-api-access-7plt5\") pod \"nova-cell0-722f-account-create-update-slwd5\" (UID: \"2f06b776-36bc-45ba-88d4-69608f9665e6\") " pod="openstack/nova-cell0-722f-account-create-update-slwd5" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.805366 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f06b776-36bc-45ba-88d4-69608f9665e6-operator-scripts\") pod 
\"nova-cell0-722f-account-create-update-slwd5\" (UID: \"2f06b776-36bc-45ba-88d4-69608f9665e6\") " pod="openstack/nova-cell0-722f-account-create-update-slwd5" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.805449 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r58f4\" (UniqueName: \"kubernetes.io/projected/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d-kube-api-access-r58f4\") pod \"nova-cell1-b9f9-account-create-update-w7q2n\" (UID: \"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d\") " pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.805478 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7plt5\" (UniqueName: \"kubernetes.io/projected/2f06b776-36bc-45ba-88d4-69608f9665e6-kube-api-access-7plt5\") pod \"nova-cell0-722f-account-create-update-slwd5\" (UID: \"2f06b776-36bc-45ba-88d4-69608f9665e6\") " pod="openstack/nova-cell0-722f-account-create-update-slwd5" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.805540 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d-operator-scripts\") pod \"nova-cell1-b9f9-account-create-update-w7q2n\" (UID: \"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d\") " pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.807511 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f06b776-36bc-45ba-88d4-69608f9665e6-operator-scripts\") pod \"nova-cell0-722f-account-create-update-slwd5\" (UID: \"2f06b776-36bc-45ba-88d4-69608f9665e6\") " pod="openstack/nova-cell0-722f-account-create-update-slwd5" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.833875 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7plt5\" (UniqueName: \"kubernetes.io/projected/2f06b776-36bc-45ba-88d4-69608f9665e6-kube-api-access-7plt5\") pod \"nova-cell0-722f-account-create-update-slwd5\" (UID: \"2f06b776-36bc-45ba-88d4-69608f9665e6\") " pod="openstack/nova-cell0-722f-account-create-update-slwd5" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.906680 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r58f4\" (UniqueName: \"kubernetes.io/projected/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d-kube-api-access-r58f4\") pod \"nova-cell1-b9f9-account-create-update-w7q2n\" (UID: \"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d\") " pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.906773 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d-operator-scripts\") pod \"nova-cell1-b9f9-account-create-update-w7q2n\" (UID: \"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d\") " pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.908735 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d-operator-scripts\") pod \"nova-cell1-b9f9-account-create-update-w7q2n\" (UID: \"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d\") " pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.920681 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-t5x58" Mar 18 09:24:38 crc kubenswrapper[4778]: I0318 09:24:38.927609 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r58f4\" (UniqueName: \"kubernetes.io/projected/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d-kube-api-access-r58f4\") pod \"nova-cell1-b9f9-account-create-update-w7q2n\" (UID: \"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d\") " pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n" Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.105478 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-722f-account-create-update-slwd5" Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.105968 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-nl2dg"] Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.117010 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n" Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.212139 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9hlqk"] Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.224927 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-922c-account-create-update-6z2xf"] Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.393368 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9hlqk" event={"ID":"92444732-2d3e-4065-a336-74b37b711530","Type":"ContainerStarted","Data":"3333452c0f9c0aa6c9b681d6899b93de37fd7ca4bbeb37555194646d5e14f4aa"} Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.409529 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-922c-account-create-update-6z2xf" 
event={"ID":"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e","Type":"ContainerStarted","Data":"f6194e96f072cfd53c406f39a7550f29db5d67bdc6367c173d9028ecd67df647"} Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.434520 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bbbb56d-564a-45df-b50c-7bc4ba290812","Type":"ContainerStarted","Data":"2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59"} Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.438345 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-t5x58"] Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.442331 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-nl2dg" event={"ID":"8341ceba-13e0-410f-a7d2-23190a07d914","Type":"ContainerStarted","Data":"d2ab977039aaae5c8f1427bcc52b3dcaad0308d3866ed320458f83ac1d2b6005"} Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.691315 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-nl2dg" podStartSLOduration=1.6912805789999998 podStartE2EDuration="1.691280579s" podCreationTimestamp="2026-03-18 09:24:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:24:39.466791263 +0000 UTC m=+1346.041536113" watchObservedRunningTime="2026-03-18 09:24:39.691280579 +0000 UTC m=+1346.266025419" Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.699143 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b9f9-account-create-update-w7q2n"] Mar 18 09:24:39 crc kubenswrapper[4778]: I0318 09:24:39.797631 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-722f-account-create-update-slwd5"] Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.452731 4778 generic.go:334] "Generic (PLEG): container finished" 
podID="b380dfb3-b55b-4db2-bd8f-a90b4470345d" containerID="c118c28760c4816bb842a36e485ff938333b6ae9902cf9242267aa191e3d70bf" exitCode=0 Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.453758 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t5x58" event={"ID":"b380dfb3-b55b-4db2-bd8f-a90b4470345d","Type":"ContainerDied","Data":"c118c28760c4816bb842a36e485ff938333b6ae9902cf9242267aa191e3d70bf"} Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.453800 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t5x58" event={"ID":"b380dfb3-b55b-4db2-bd8f-a90b4470345d","Type":"ContainerStarted","Data":"2cf69c2fb75c3f078b4c9042bf5f15177ce33c7cf16cfe38dde0358c80b6c73f"} Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.469815 4778 generic.go:334] "Generic (PLEG): container finished" podID="92444732-2d3e-4065-a336-74b37b711530" containerID="7fb36f99fa48f9c60dbdcb8445fed2d769e9cb712ffc10c71b7ff46632229d69" exitCode=0 Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.469895 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9hlqk" event={"ID":"92444732-2d3e-4065-a336-74b37b711530","Type":"ContainerDied","Data":"7fb36f99fa48f9c60dbdcb8445fed2d769e9cb712ffc10c71b7ff46632229d69"} Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.479806 4778 generic.go:334] "Generic (PLEG): container finished" podID="b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e" containerID="3cc34e35f2db07df2220b6c334d24c112405b578f89727d873a592536bc78998" exitCode=0 Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.480155 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-922c-account-create-update-6z2xf" event={"ID":"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e","Type":"ContainerDied","Data":"3cc34e35f2db07df2220b6c334d24c112405b578f89727d873a592536bc78998"} Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.489383 4778 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bbbb56d-564a-45df-b50c-7bc4ba290812","Type":"ContainerStarted","Data":"5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6"} Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.494748 4778 generic.go:334] "Generic (PLEG): container finished" podID="2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d" containerID="47bfce503465075386d4ab81517eb08824a50d2ca76a4ab55639a7aea5948d36" exitCode=0 Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.494845 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n" event={"ID":"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d","Type":"ContainerDied","Data":"47bfce503465075386d4ab81517eb08824a50d2ca76a4ab55639a7aea5948d36"} Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.494874 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n" event={"ID":"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d","Type":"ContainerStarted","Data":"50d29b828e5b4eae76fdf54a3bce80608ad76949a8ff5e9b561728be802b7516"} Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.510473 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-nl2dg" event={"ID":"8341ceba-13e0-410f-a7d2-23190a07d914","Type":"ContainerDied","Data":"9866f0cece8384eb6d69125fd4f2648001a15f8207d97598a6f6b380c668253f"} Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.520691 4778 generic.go:334] "Generic (PLEG): container finished" podID="8341ceba-13e0-410f-a7d2-23190a07d914" containerID="9866f0cece8384eb6d69125fd4f2648001a15f8207d97598a6f6b380c668253f" exitCode=0 Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.531045 4778 generic.go:334] "Generic (PLEG): container finished" podID="2f06b776-36bc-45ba-88d4-69608f9665e6" containerID="b0ad59dfbfbe8f98b2a7024fc11350f06ab712f37850bffb7121c440c9344960" exitCode=0 Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.531104 
4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-722f-account-create-update-slwd5" event={"ID":"2f06b776-36bc-45ba-88d4-69608f9665e6","Type":"ContainerDied","Data":"b0ad59dfbfbe8f98b2a7024fc11350f06ab712f37850bffb7121c440c9344960"} Mar 18 09:24:40 crc kubenswrapper[4778]: I0318 09:24:40.531134 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-722f-account-create-update-slwd5" event={"ID":"2f06b776-36bc-45ba-88d4-69608f9665e6","Type":"ContainerStarted","Data":"7ee8c9776efd482ad7a36b5754dd4923536b7bbd44753fa87fbbf5f1cfa338b4"} Mar 18 09:24:41 crc kubenswrapper[4778]: I0318 09:24:41.130874 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:41 crc kubenswrapper[4778]: I0318 09:24:41.215463 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7588d8786-t6x7l" Mar 18 09:24:41 crc kubenswrapper[4778]: I0318 09:24:41.295954 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7b57877776-ssjzt"] Mar 18 09:24:41 crc kubenswrapper[4778]: I0318 09:24:41.296357 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7b57877776-ssjzt" podUID="9c5e9a8c-649d-4d20-b867-bb0f801d329d" containerName="placement-log" containerID="cri-o://8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7" gracePeriod=30 Mar 18 09:24:41 crc kubenswrapper[4778]: I0318 09:24:41.296455 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7b57877776-ssjzt" podUID="9c5e9a8c-649d-4d20-b867-bb0f801d329d" containerName="placement-api" containerID="cri-o://2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f" gracePeriod=30 Mar 18 09:24:41 crc kubenswrapper[4778]: I0318 09:24:41.543921 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7bbbb56d-564a-45df-b50c-7bc4ba290812","Type":"ContainerStarted","Data":"4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe"} Mar 18 09:24:41 crc kubenswrapper[4778]: I0318 09:24:41.547063 4778 generic.go:334] "Generic (PLEG): container finished" podID="9c5e9a8c-649d-4d20-b867-bb0f801d329d" containerID="8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7" exitCode=143 Mar 18 09:24:41 crc kubenswrapper[4778]: I0318 09:24:41.547329 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b57877776-ssjzt" event={"ID":"9c5e9a8c-649d-4d20-b867-bb0f801d329d","Type":"ContainerDied","Data":"8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7"} Mar 18 09:24:41 crc kubenswrapper[4778]: I0318 09:24:41.966334 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-922c-account-create-update-6z2xf" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.086351 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e" (UID: "b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.089342 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e-operator-scripts\") pod \"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e\" (UID: \"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e\") " Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.089517 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xcpw\" (UniqueName: \"kubernetes.io/projected/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e-kube-api-access-4xcpw\") pod \"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e\" (UID: \"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e\") " Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.091056 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.118213 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e-kube-api-access-4xcpw" (OuterVolumeSpecName: "kube-api-access-4xcpw") pod "b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e" (UID: "b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e"). InnerVolumeSpecName "kube-api-access-4xcpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.193655 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xcpw\" (UniqueName: \"kubernetes.io/projected/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e-kube-api-access-4xcpw\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.229946 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-nl2dg" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.261524 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-t5x58" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.271186 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.285254 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9hlqk" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.294319 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-722f-account-create-update-slwd5" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.397624 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b380dfb3-b55b-4db2-bd8f-a90b4470345d-operator-scripts\") pod \"b380dfb3-b55b-4db2-bd8f-a90b4470345d\" (UID: \"b380dfb3-b55b-4db2-bd8f-a90b4470345d\") " Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.397704 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t86pm\" (UniqueName: \"kubernetes.io/projected/b380dfb3-b55b-4db2-bd8f-a90b4470345d-kube-api-access-t86pm\") pod \"b380dfb3-b55b-4db2-bd8f-a90b4470345d\" (UID: \"b380dfb3-b55b-4db2-bd8f-a90b4470345d\") " Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.397737 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ccbt\" (UniqueName: \"kubernetes.io/projected/92444732-2d3e-4065-a336-74b37b711530-kube-api-access-5ccbt\") pod \"92444732-2d3e-4065-a336-74b37b711530\" (UID: \"92444732-2d3e-4065-a336-74b37b711530\") " Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 
09:24:42.397812 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7plt5\" (UniqueName: \"kubernetes.io/projected/2f06b776-36bc-45ba-88d4-69608f9665e6-kube-api-access-7plt5\") pod \"2f06b776-36bc-45ba-88d4-69608f9665e6\" (UID: \"2f06b776-36bc-45ba-88d4-69608f9665e6\") " Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.397838 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f06b776-36bc-45ba-88d4-69608f9665e6-operator-scripts\") pod \"2f06b776-36bc-45ba-88d4-69608f9665e6\" (UID: \"2f06b776-36bc-45ba-88d4-69608f9665e6\") " Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.397943 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjp78\" (UniqueName: \"kubernetes.io/projected/8341ceba-13e0-410f-a7d2-23190a07d914-kube-api-access-xjp78\") pod \"8341ceba-13e0-410f-a7d2-23190a07d914\" (UID: \"8341ceba-13e0-410f-a7d2-23190a07d914\") " Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.398103 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8341ceba-13e0-410f-a7d2-23190a07d914-operator-scripts\") pod \"8341ceba-13e0-410f-a7d2-23190a07d914\" (UID: \"8341ceba-13e0-410f-a7d2-23190a07d914\") " Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.398131 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r58f4\" (UniqueName: \"kubernetes.io/projected/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d-kube-api-access-r58f4\") pod \"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d\" (UID: \"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d\") " Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.398170 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d-operator-scripts\") pod \"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d\" (UID: \"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d\") " Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.398227 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92444732-2d3e-4065-a336-74b37b711530-operator-scripts\") pod \"92444732-2d3e-4065-a336-74b37b711530\" (UID: \"92444732-2d3e-4065-a336-74b37b711530\") " Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.399098 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8341ceba-13e0-410f-a7d2-23190a07d914-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8341ceba-13e0-410f-a7d2-23190a07d914" (UID: "8341ceba-13e0-410f-a7d2-23190a07d914"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.399417 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f06b776-36bc-45ba-88d4-69608f9665e6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2f06b776-36bc-45ba-88d4-69608f9665e6" (UID: "2f06b776-36bc-45ba-88d4-69608f9665e6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.399455 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92444732-2d3e-4065-a336-74b37b711530-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "92444732-2d3e-4065-a336-74b37b711530" (UID: "92444732-2d3e-4065-a336-74b37b711530"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.399699 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b380dfb3-b55b-4db2-bd8f-a90b4470345d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b380dfb3-b55b-4db2-bd8f-a90b4470345d" (UID: "b380dfb3-b55b-4db2-bd8f-a90b4470345d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.399953 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d" (UID: "2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.404419 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d-kube-api-access-r58f4" (OuterVolumeSpecName: "kube-api-access-r58f4") pod "2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d" (UID: "2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d"). InnerVolumeSpecName "kube-api-access-r58f4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.405375 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92444732-2d3e-4065-a336-74b37b711530-kube-api-access-5ccbt" (OuterVolumeSpecName: "kube-api-access-5ccbt") pod "92444732-2d3e-4065-a336-74b37b711530" (UID: "92444732-2d3e-4065-a336-74b37b711530"). InnerVolumeSpecName "kube-api-access-5ccbt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.408367 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8341ceba-13e0-410f-a7d2-23190a07d914-kube-api-access-xjp78" (OuterVolumeSpecName: "kube-api-access-xjp78") pod "8341ceba-13e0-410f-a7d2-23190a07d914" (UID: "8341ceba-13e0-410f-a7d2-23190a07d914"). InnerVolumeSpecName "kube-api-access-xjp78". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.408453 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b380dfb3-b55b-4db2-bd8f-a90b4470345d-kube-api-access-t86pm" (OuterVolumeSpecName: "kube-api-access-t86pm") pod "b380dfb3-b55b-4db2-bd8f-a90b4470345d" (UID: "b380dfb3-b55b-4db2-bd8f-a90b4470345d"). InnerVolumeSpecName "kube-api-access-t86pm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.414319 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f06b776-36bc-45ba-88d4-69608f9665e6-kube-api-access-7plt5" (OuterVolumeSpecName: "kube-api-access-7plt5") pod "2f06b776-36bc-45ba-88d4-69608f9665e6" (UID: "2f06b776-36bc-45ba-88d4-69608f9665e6"). InnerVolumeSpecName "kube-api-access-7plt5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.499868 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f06b776-36bc-45ba-88d4-69608f9665e6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.499904 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjp78\" (UniqueName: \"kubernetes.io/projected/8341ceba-13e0-410f-a7d2-23190a07d914-kube-api-access-xjp78\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.499916 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8341ceba-13e0-410f-a7d2-23190a07d914-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.499925 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r58f4\" (UniqueName: \"kubernetes.io/projected/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d-kube-api-access-r58f4\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.499934 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.499942 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92444732-2d3e-4065-a336-74b37b711530-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.499952 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b380dfb3-b55b-4db2-bd8f-a90b4470345d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:42 crc 
kubenswrapper[4778]: I0318 09:24:42.499961 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t86pm\" (UniqueName: \"kubernetes.io/projected/b380dfb3-b55b-4db2-bd8f-a90b4470345d-kube-api-access-t86pm\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.499970 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ccbt\" (UniqueName: \"kubernetes.io/projected/92444732-2d3e-4065-a336-74b37b711530-kube-api-access-5ccbt\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.499978 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7plt5\" (UniqueName: \"kubernetes.io/projected/2f06b776-36bc-45ba-88d4-69608f9665e6-kube-api-access-7plt5\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.558256 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-722f-account-create-update-slwd5" event={"ID":"2f06b776-36bc-45ba-88d4-69608f9665e6","Type":"ContainerDied","Data":"7ee8c9776efd482ad7a36b5754dd4923536b7bbd44753fa87fbbf5f1cfa338b4"} Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.558298 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ee8c9776efd482ad7a36b5754dd4923536b7bbd44753fa87fbbf5f1cfa338b4" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.558350 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-722f-account-create-update-slwd5" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.561985 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t5x58" event={"ID":"b380dfb3-b55b-4db2-bd8f-a90b4470345d","Type":"ContainerDied","Data":"2cf69c2fb75c3f078b4c9042bf5f15177ce33c7cf16cfe38dde0358c80b6c73f"} Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.562010 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-t5x58" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.562028 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cf69c2fb75c3f078b4c9042bf5f15177ce33c7cf16cfe38dde0358c80b6c73f" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.564179 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9hlqk" event={"ID":"92444732-2d3e-4065-a336-74b37b711530","Type":"ContainerDied","Data":"3333452c0f9c0aa6c9b681d6899b93de37fd7ca4bbeb37555194646d5e14f4aa"} Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.564192 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-9hlqk" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.564221 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3333452c0f9c0aa6c9b681d6899b93de37fd7ca4bbeb37555194646d5e14f4aa" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.565755 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-922c-account-create-update-6z2xf" event={"ID":"b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e","Type":"ContainerDied","Data":"f6194e96f072cfd53c406f39a7550f29db5d67bdc6367c173d9028ecd67df647"} Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.565804 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6194e96f072cfd53c406f39a7550f29db5d67bdc6367c173d9028ecd67df647" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.565769 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-922c-account-create-update-6z2xf" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.569331 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bbbb56d-564a-45df-b50c-7bc4ba290812","Type":"ContainerStarted","Data":"4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f"} Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.569781 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.573963 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n" event={"ID":"2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d","Type":"ContainerDied","Data":"50d29b828e5b4eae76fdf54a3bce80608ad76949a8ff5e9b561728be802b7516"} Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.573993 4778 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="50d29b828e5b4eae76fdf54a3bce80608ad76949a8ff5e9b561728be802b7516" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.574044 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b9f9-account-create-update-w7q2n" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.580865 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-nl2dg" event={"ID":"8341ceba-13e0-410f-a7d2-23190a07d914","Type":"ContainerDied","Data":"d2ab977039aaae5c8f1427bcc52b3dcaad0308d3866ed320458f83ac1d2b6005"} Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.580936 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2ab977039aaae5c8f1427bcc52b3dcaad0308d3866ed320458f83ac1d2b6005" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.580958 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-nl2dg" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.613846 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.059274284 podStartE2EDuration="7.613815856s" podCreationTimestamp="2026-03-18 09:24:35 +0000 UTC" firstStartedPulling="2026-03-18 09:24:36.505734726 +0000 UTC m=+1343.080479566" lastFinishedPulling="2026-03-18 09:24:42.060276298 +0000 UTC m=+1348.635021138" observedRunningTime="2026-03-18 09:24:42.60184264 +0000 UTC m=+1349.176587490" watchObservedRunningTime="2026-03-18 09:24:42.613815856 +0000 UTC m=+1349.188560696" Mar 18 09:24:42 crc kubenswrapper[4778]: I0318 09:24:42.856924 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 18 09:24:44 crc kubenswrapper[4778]: I0318 09:24:44.873879 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:24:44 crc kubenswrapper[4778]: I0318 09:24:44.874921 4778 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="ceilometer-central-agent" containerID="cri-o://2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59" gracePeriod=30 Mar 18 09:24:44 crc kubenswrapper[4778]: I0318 09:24:44.875052 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="proxy-httpd" containerID="cri-o://4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f" gracePeriod=30 Mar 18 09:24:44 crc kubenswrapper[4778]: I0318 09:24:44.875103 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="sg-core" containerID="cri-o://4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe" gracePeriod=30 Mar 18 09:24:44 crc kubenswrapper[4778]: I0318 09:24:44.875139 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="ceilometer-notification-agent" containerID="cri-o://5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6" gracePeriod=30 Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.247085 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.372081 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-scripts\") pod \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.372151 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-public-tls-certs\") pod \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.372289 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-config-data\") pod \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.372316 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-internal-tls-certs\") pod \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.372344 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg9cl\" (UniqueName: \"kubernetes.io/projected/9c5e9a8c-649d-4d20-b867-bb0f801d329d-kube-api-access-jg9cl\") pod \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.372434 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9c5e9a8c-649d-4d20-b867-bb0f801d329d-logs\") pod \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.372457 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-combined-ca-bundle\") pod \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\" (UID: \"9c5e9a8c-649d-4d20-b867-bb0f801d329d\") " Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.373219 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c5e9a8c-649d-4d20-b867-bb0f801d329d-logs" (OuterVolumeSpecName: "logs") pod "9c5e9a8c-649d-4d20-b867-bb0f801d329d" (UID: "9c5e9a8c-649d-4d20-b867-bb0f801d329d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.378863 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-scripts" (OuterVolumeSpecName: "scripts") pod "9c5e9a8c-649d-4d20-b867-bb0f801d329d" (UID: "9c5e9a8c-649d-4d20-b867-bb0f801d329d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.378920 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c5e9a8c-649d-4d20-b867-bb0f801d329d-kube-api-access-jg9cl" (OuterVolumeSpecName: "kube-api-access-jg9cl") pod "9c5e9a8c-649d-4d20-b867-bb0f801d329d" (UID: "9c5e9a8c-649d-4d20-b867-bb0f801d329d"). InnerVolumeSpecName "kube-api-access-jg9cl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.424521 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c5e9a8c-649d-4d20-b867-bb0f801d329d" (UID: "9c5e9a8c-649d-4d20-b867-bb0f801d329d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.439745 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-config-data" (OuterVolumeSpecName: "config-data") pod "9c5e9a8c-649d-4d20-b867-bb0f801d329d" (UID: "9c5e9a8c-649d-4d20-b867-bb0f801d329d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.470237 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9c5e9a8c-649d-4d20-b867-bb0f801d329d" (UID: "9c5e9a8c-649d-4d20-b867-bb0f801d329d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.472755 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9c5e9a8c-649d-4d20-b867-bb0f801d329d" (UID: "9c5e9a8c-649d-4d20-b867-bb0f801d329d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.474662 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c5e9a8c-649d-4d20-b867-bb0f801d329d-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.474704 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.474730 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.474746 4778 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.474760 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.474778 4778 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e9a8c-649d-4d20-b867-bb0f801d329d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.474792 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg9cl\" (UniqueName: \"kubernetes.io/projected/9c5e9a8c-649d-4d20-b867-bb0f801d329d-kube-api-access-jg9cl\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.611559 4778 generic.go:334] 
"Generic (PLEG): container finished" podID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerID="4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f" exitCode=0 Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.611601 4778 generic.go:334] "Generic (PLEG): container finished" podID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerID="4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe" exitCode=2 Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.611612 4778 generic.go:334] "Generic (PLEG): container finished" podID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerID="5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6" exitCode=0 Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.611626 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bbbb56d-564a-45df-b50c-7bc4ba290812","Type":"ContainerDied","Data":"4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f"} Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.611676 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bbbb56d-564a-45df-b50c-7bc4ba290812","Type":"ContainerDied","Data":"4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe"} Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.611688 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bbbb56d-564a-45df-b50c-7bc4ba290812","Type":"ContainerDied","Data":"5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6"} Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.613691 4778 generic.go:334] "Generic (PLEG): container finished" podID="9c5e9a8c-649d-4d20-b867-bb0f801d329d" containerID="2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f" exitCode=0 Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.613779 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7b57877776-ssjzt" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.613750 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b57877776-ssjzt" event={"ID":"9c5e9a8c-649d-4d20-b867-bb0f801d329d","Type":"ContainerDied","Data":"2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f"} Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.613938 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b57877776-ssjzt" event={"ID":"9c5e9a8c-649d-4d20-b867-bb0f801d329d","Type":"ContainerDied","Data":"dcaa7760f0d0e632c657a22054c5f006fe1f82143f10b41d8d1cb108f3a1621b"} Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.614020 4778 scope.go:117] "RemoveContainer" containerID="2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.649263 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7b57877776-ssjzt"] Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.655774 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7b57877776-ssjzt"] Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.661398 4778 scope.go:117] "RemoveContainer" containerID="8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.687141 4778 scope.go:117] "RemoveContainer" containerID="2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f" Mar 18 09:24:45 crc kubenswrapper[4778]: E0318 09:24:45.688324 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f\": container with ID starting with 2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f not found: ID does not exist" 
containerID="2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.688384 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f"} err="failed to get container status \"2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f\": rpc error: code = NotFound desc = could not find container \"2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f\": container with ID starting with 2a6943842b47e81fdf74302688c9f783b62d29a00e60231411e1416c4c7d8a3f not found: ID does not exist" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.688417 4778 scope.go:117] "RemoveContainer" containerID="8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7" Mar 18 09:24:45 crc kubenswrapper[4778]: E0318 09:24:45.689423 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7\": container with ID starting with 8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7 not found: ID does not exist" containerID="8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7" Mar 18 09:24:45 crc kubenswrapper[4778]: I0318 09:24:45.689470 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7"} err="failed to get container status \"8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7\": rpc error: code = NotFound desc = could not find container \"8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7\": container with ID starting with 8aed6846ec87775c8d4248f2846150eee45a60d361677b89aa55b865ab5ebea7 not found: ID does not exist" Mar 18 09:24:46 crc kubenswrapper[4778]: I0318 09:24:46.206176 4778 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c5e9a8c-649d-4d20-b867-bb0f801d329d" path="/var/lib/kubelet/pods/9c5e9a8c-649d-4d20-b867-bb0f801d329d/volumes" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.480053 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vhxwz"] Mar 18 09:24:48 crc kubenswrapper[4778]: E0318 09:24:48.481123 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d" containerName="mariadb-account-create-update" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481141 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d" containerName="mariadb-account-create-update" Mar 18 09:24:48 crc kubenswrapper[4778]: E0318 09:24:48.481183 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f06b776-36bc-45ba-88d4-69608f9665e6" containerName="mariadb-account-create-update" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481191 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f06b776-36bc-45ba-88d4-69608f9665e6" containerName="mariadb-account-create-update" Mar 18 09:24:48 crc kubenswrapper[4778]: E0318 09:24:48.481285 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b380dfb3-b55b-4db2-bd8f-a90b4470345d" containerName="mariadb-database-create" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481294 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b380dfb3-b55b-4db2-bd8f-a90b4470345d" containerName="mariadb-database-create" Mar 18 09:24:48 crc kubenswrapper[4778]: E0318 09:24:48.481307 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92444732-2d3e-4065-a336-74b37b711530" containerName="mariadb-database-create" Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481315 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="92444732-2d3e-4065-a336-74b37b711530" 
containerName="mariadb-database-create"
Mar 18 09:24:48 crc kubenswrapper[4778]: E0318 09:24:48.481323 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8341ceba-13e0-410f-a7d2-23190a07d914" containerName="mariadb-database-create"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481331 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8341ceba-13e0-410f-a7d2-23190a07d914" containerName="mariadb-database-create"
Mar 18 09:24:48 crc kubenswrapper[4778]: E0318 09:24:48.481340 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c5e9a8c-649d-4d20-b867-bb0f801d329d" containerName="placement-api"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481347 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c5e9a8c-649d-4d20-b867-bb0f801d329d" containerName="placement-api"
Mar 18 09:24:48 crc kubenswrapper[4778]: E0318 09:24:48.481364 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c5e9a8c-649d-4d20-b867-bb0f801d329d" containerName="placement-log"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481372 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c5e9a8c-649d-4d20-b867-bb0f801d329d" containerName="placement-log"
Mar 18 09:24:48 crc kubenswrapper[4778]: E0318 09:24:48.481386 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e" containerName="mariadb-account-create-update"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481394 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e" containerName="mariadb-account-create-update"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481623 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f06b776-36bc-45ba-88d4-69608f9665e6" containerName="mariadb-account-create-update"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481649 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="8341ceba-13e0-410f-a7d2-23190a07d914" containerName="mariadb-database-create"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481659 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e" containerName="mariadb-account-create-update"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481672 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="92444732-2d3e-4065-a336-74b37b711530" containerName="mariadb-database-create"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481681 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b380dfb3-b55b-4db2-bd8f-a90b4470345d" containerName="mariadb-database-create"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481690 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c5e9a8c-649d-4d20-b867-bb0f801d329d" containerName="placement-log"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481701 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c5e9a8c-649d-4d20-b867-bb0f801d329d" containerName="placement-api"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.481710 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d" containerName="mariadb-account-create-update"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.482489 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vhxwz"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.485242 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.486133 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.486850 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-pgv4m"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.490391 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vhxwz"]
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.545315 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-config-data\") pod \"nova-cell0-conductor-db-sync-vhxwz\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " pod="openstack/nova-cell0-conductor-db-sync-vhxwz"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.545398 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvzwc\" (UniqueName: \"kubernetes.io/projected/8348daa3-112d-49f7-93d8-3649ebf10eee-kube-api-access-cvzwc\") pod \"nova-cell0-conductor-db-sync-vhxwz\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " pod="openstack/nova-cell0-conductor-db-sync-vhxwz"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.545427 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vhxwz\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " pod="openstack/nova-cell0-conductor-db-sync-vhxwz"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.545465 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-scripts\") pod \"nova-cell0-conductor-db-sync-vhxwz\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " pod="openstack/nova-cell0-conductor-db-sync-vhxwz"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.647456 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvzwc\" (UniqueName: \"kubernetes.io/projected/8348daa3-112d-49f7-93d8-3649ebf10eee-kube-api-access-cvzwc\") pod \"nova-cell0-conductor-db-sync-vhxwz\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " pod="openstack/nova-cell0-conductor-db-sync-vhxwz"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.647519 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vhxwz\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " pod="openstack/nova-cell0-conductor-db-sync-vhxwz"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.647571 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-scripts\") pod \"nova-cell0-conductor-db-sync-vhxwz\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " pod="openstack/nova-cell0-conductor-db-sync-vhxwz"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.647727 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-config-data\") pod \"nova-cell0-conductor-db-sync-vhxwz\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " pod="openstack/nova-cell0-conductor-db-sync-vhxwz"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.655147 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-config-data\") pod \"nova-cell0-conductor-db-sync-vhxwz\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " pod="openstack/nova-cell0-conductor-db-sync-vhxwz"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.655149 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vhxwz\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " pod="openstack/nova-cell0-conductor-db-sync-vhxwz"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.658739 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-scripts\") pod \"nova-cell0-conductor-db-sync-vhxwz\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " pod="openstack/nova-cell0-conductor-db-sync-vhxwz"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.668699 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvzwc\" (UniqueName: \"kubernetes.io/projected/8348daa3-112d-49f7-93d8-3649ebf10eee-kube-api-access-cvzwc\") pod \"nova-cell0-conductor-db-sync-vhxwz\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " pod="openstack/nova-cell0-conductor-db-sync-vhxwz"
Mar 18 09:24:48 crc kubenswrapper[4778]: I0318 09:24:48.805763 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vhxwz"
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.378334 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vhxwz"]
Mar 18 09:24:49 crc kubenswrapper[4778]: W0318 09:24:49.478083 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8348daa3_112d_49f7_93d8_3649ebf10eee.slice/crio-f5bd2fabe74afe68d7b89edc3d7934e5f231bf730e99188b91dab652b2d5599f WatchSource:0}: Error finding container f5bd2fabe74afe68d7b89edc3d7934e5f231bf730e99188b91dab652b2d5599f: Status 404 returned error can't find the container with id f5bd2fabe74afe68d7b89edc3d7934e5f231bf730e99188b91dab652b2d5599f
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.650947 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.664158 4778 generic.go:334] "Generic (PLEG): container finished" podID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerID="2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59" exitCode=0
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.664309 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bbbb56d-564a-45df-b50c-7bc4ba290812","Type":"ContainerDied","Data":"2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59"}
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.664343 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.664399 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bbbb56d-564a-45df-b50c-7bc4ba290812","Type":"ContainerDied","Data":"4c0e3af08cf9b31b7b8c65fe63263681dd011db44c530c1549fcf2d8b00c430f"}
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.664427 4778 scope.go:117] "RemoveContainer" containerID="4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f"
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.666334 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vhxwz" event={"ID":"8348daa3-112d-49f7-93d8-3649ebf10eee","Type":"ContainerStarted","Data":"f5bd2fabe74afe68d7b89edc3d7934e5f231bf730e99188b91dab652b2d5599f"}
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.670980 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-sg-core-conf-yaml\") pod \"7bbbb56d-564a-45df-b50c-7bc4ba290812\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") "
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.671106 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-config-data\") pod \"7bbbb56d-564a-45df-b50c-7bc4ba290812\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") "
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.671261 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-scripts\") pod \"7bbbb56d-564a-45df-b50c-7bc4ba290812\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") "
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.671481 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbbb56d-564a-45df-b50c-7bc4ba290812-run-httpd\") pod \"7bbbb56d-564a-45df-b50c-7bc4ba290812\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") "
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.671584 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbbb56d-564a-45df-b50c-7bc4ba290812-log-httpd\") pod \"7bbbb56d-564a-45df-b50c-7bc4ba290812\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") "
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.671629 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bstbn\" (UniqueName: \"kubernetes.io/projected/7bbbb56d-564a-45df-b50c-7bc4ba290812-kube-api-access-bstbn\") pod \"7bbbb56d-564a-45df-b50c-7bc4ba290812\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") "
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.672001 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bbbb56d-564a-45df-b50c-7bc4ba290812-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7bbbb56d-564a-45df-b50c-7bc4ba290812" (UID: "7bbbb56d-564a-45df-b50c-7bc4ba290812"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.672171 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bbbb56d-564a-45df-b50c-7bc4ba290812-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7bbbb56d-564a-45df-b50c-7bc4ba290812" (UID: "7bbbb56d-564a-45df-b50c-7bc4ba290812"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.672869 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-combined-ca-bundle\") pod \"7bbbb56d-564a-45df-b50c-7bc4ba290812\" (UID: \"7bbbb56d-564a-45df-b50c-7bc4ba290812\") "
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.673832 4778 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbbb56d-564a-45df-b50c-7bc4ba290812-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.673884 4778 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bbbb56d-564a-45df-b50c-7bc4ba290812-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.702548 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-scripts" (OuterVolumeSpecName: "scripts") pod "7bbbb56d-564a-45df-b50c-7bc4ba290812" (UID: "7bbbb56d-564a-45df-b50c-7bc4ba290812"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.716590 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bbbb56d-564a-45df-b50c-7bc4ba290812-kube-api-access-bstbn" (OuterVolumeSpecName: "kube-api-access-bstbn") pod "7bbbb56d-564a-45df-b50c-7bc4ba290812" (UID: "7bbbb56d-564a-45df-b50c-7bc4ba290812"). InnerVolumeSpecName "kube-api-access-bstbn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.731376 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7bbbb56d-564a-45df-b50c-7bc4ba290812" (UID: "7bbbb56d-564a-45df-b50c-7bc4ba290812"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.772315 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7bbbb56d-564a-45df-b50c-7bc4ba290812" (UID: "7bbbb56d-564a-45df-b50c-7bc4ba290812"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.775075 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.775105 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bstbn\" (UniqueName: \"kubernetes.io/projected/7bbbb56d-564a-45df-b50c-7bc4ba290812-kube-api-access-bstbn\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.775119 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.775128 4778 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.802552 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-config-data" (OuterVolumeSpecName: "config-data") pod "7bbbb56d-564a-45df-b50c-7bc4ba290812" (UID: "7bbbb56d-564a-45df-b50c-7bc4ba290812"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.871286 4778 scope.go:117] "RemoveContainer" containerID="4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe"
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.877894 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bbbb56d-564a-45df-b50c-7bc4ba290812-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.895768 4778 scope.go:117] "RemoveContainer" containerID="5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6"
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.919214 4778 scope.go:117] "RemoveContainer" containerID="2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59"
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.937702 4778 scope.go:117] "RemoveContainer" containerID="4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f"
Mar 18 09:24:49 crc kubenswrapper[4778]: E0318 09:24:49.938284 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f\": container with ID starting with 4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f not found: ID does not exist" containerID="4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f"
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.938333 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f"} err="failed to get container status \"4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f\": rpc error: code = NotFound desc = could not find container \"4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f\": container with ID starting with 4a52cb4ee9e8a31fc9a08875f3a0f4b683cc5d0dd1193791a27dc5614a17e03f not found: ID does not exist"
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.938362 4778 scope.go:117] "RemoveContainer" containerID="4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe"
Mar 18 09:24:49 crc kubenswrapper[4778]: E0318 09:24:49.938866 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe\": container with ID starting with 4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe not found: ID does not exist" containerID="4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe"
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.938917 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe"} err="failed to get container status \"4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe\": rpc error: code = NotFound desc = could not find container \"4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe\": container with ID starting with 4daa696472a41bc9ddeade71be81f4366c19965b854ef2f0299de127674153fe not found: ID does not exist"
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.938945 4778 scope.go:117] "RemoveContainer" containerID="5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6"
Mar 18 09:24:49 crc kubenswrapper[4778]: E0318 09:24:49.939529 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6\": container with ID starting with 5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6 not found: ID does not exist" containerID="5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6"
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.939560 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6"} err="failed to get container status \"5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6\": rpc error: code = NotFound desc = could not find container \"5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6\": container with ID starting with 5aececa630d812fb403298e481953dbbc73c2ce898046bde12bbdaf70695e2a6 not found: ID does not exist"
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.939578 4778 scope.go:117] "RemoveContainer" containerID="2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59"
Mar 18 09:24:49 crc kubenswrapper[4778]: E0318 09:24:49.939898 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59\": container with ID starting with 2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59 not found: ID does not exist" containerID="2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59"
Mar 18 09:24:49 crc kubenswrapper[4778]: I0318 09:24:49.939924 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59"} err="failed to get container status \"2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59\": rpc error: code = NotFound desc = could not find container \"2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59\": container with ID starting with 2feeeb604ade966f0dd3c1b30d674eab24722c2405f09ab02528b2bf7898de59 not found: ID does not exist"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.046715 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.065067 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.077568 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 18 09:24:50 crc kubenswrapper[4778]: E0318 09:24:50.078255 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="ceilometer-central-agent"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.078284 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="ceilometer-central-agent"
Mar 18 09:24:50 crc kubenswrapper[4778]: E0318 09:24:50.078371 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="ceilometer-notification-agent"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.078385 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="ceilometer-notification-agent"
Mar 18 09:24:50 crc kubenswrapper[4778]: E0318 09:24:50.078416 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="sg-core"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.078427 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="sg-core"
Mar 18 09:24:50 crc kubenswrapper[4778]: E0318 09:24:50.078450 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="proxy-httpd"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.078461 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="proxy-httpd"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.078714 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="ceilometer-central-agent"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.078743 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="proxy-httpd"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.078761 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="ceilometer-notification-agent"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.078770 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" containerName="sg-core"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.081491 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.085435 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.085567 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.087382 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.187922 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-scripts\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.188315 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.188349 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92rb8\" (UniqueName: \"kubernetes.io/projected/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-kube-api-access-92rb8\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.188366 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-log-httpd\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.188409 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.188432 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-run-httpd\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.188458 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-config-data\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.197338 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bbbb56d-564a-45df-b50c-7bc4ba290812" path="/var/lib/kubelet/pods/7bbbb56d-564a-45df-b50c-7bc4ba290812/volumes"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.290048 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.290155 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92rb8\" (UniqueName: \"kubernetes.io/projected/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-kube-api-access-92rb8\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.290213 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-log-httpd\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.290301 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.290345 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-run-httpd\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.290398 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-config-data\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.290452 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-scripts\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.292617 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-log-httpd\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.292887 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-run-httpd\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.295744 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.296219 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-config-data\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.297487 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-scripts\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.298314 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.309190 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92rb8\" (UniqueName: \"kubernetes.io/projected/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-kube-api-access-92rb8\") pod \"ceilometer-0\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " pod="openstack/ceilometer-0"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.407098 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 09:24:50 crc kubenswrapper[4778]: I0318 09:24:50.949951 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 09:24:50 crc kubenswrapper[4778]: W0318 09:24:50.954520 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1c91e10_5caf_4f06_89bb_c9dacc92ecef.slice/crio-f02f2bc76544bcc97a0d67474e80bad569ca14d09e55a75f25e0c054c4e8be41 WatchSource:0}: Error finding container f02f2bc76544bcc97a0d67474e80bad569ca14d09e55a75f25e0c054c4e8be41: Status 404 returned error can't find the container with id f02f2bc76544bcc97a0d67474e80bad569ca14d09e55a75f25e0c054c4e8be41
Mar 18 09:24:51 crc kubenswrapper[4778]: I0318 09:24:51.692474 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1c91e10-5caf-4f06-89bb-c9dacc92ecef","Type":"ContainerStarted","Data":"f02f2bc76544bcc97a0d67474e80bad569ca14d09e55a75f25e0c054c4e8be41"}
Mar 18 09:24:52 crc kubenswrapper[4778]: I0318 09:24:52.705761 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1c91e10-5caf-4f06-89bb-c9dacc92ecef","Type":"ContainerStarted","Data":"5bf367bc7ca1760e31682d105fada5e12302e00c9e7370ed60b5dbbd5a0d181b"}
Mar 18 09:24:52 crc kubenswrapper[4778]: I0318 09:24:52.706185 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1c91e10-5caf-4f06-89bb-c9dacc92ecef","Type":"ContainerStarted","Data":"c9b5655db1df35344b5ba27e4f2b48e2b1e3b0ed8aae0f66ceb4cd01dacb6163"}
Mar 18 09:24:56 crc kubenswrapper[4778]: I0318 09:24:56.744164 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vhxwz" event={"ID":"8348daa3-112d-49f7-93d8-3649ebf10eee","Type":"ContainerStarted","Data":"aed5c2d54c93258cf5753b658cc8a1430cb39fb4faca41384d49dcc12f51df37"}
Mar 18 09:24:56 crc kubenswrapper[4778]: I0318 09:24:56.772562 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-vhxwz" podStartSLOduration=1.707374648 podStartE2EDuration="8.772535808s" podCreationTimestamp="2026-03-18 09:24:48 +0000 UTC" firstStartedPulling="2026-03-18 09:24:49.48094113 +0000 UTC m=+1356.055685980" lastFinishedPulling="2026-03-18 09:24:56.5461023 +0000 UTC m=+1363.120847140" observedRunningTime="2026-03-18 09:24:56.763550923 +0000 UTC m=+1363.338295803" watchObservedRunningTime="2026-03-18 09:24:56.772535808 +0000 UTC m=+1363.347280678"
Mar 18 09:24:57 crc kubenswrapper[4778]: I0318 09:24:57.774113 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1c91e10-5caf-4f06-89bb-c9dacc92ecef","Type":"ContainerStarted","Data":"5aea936acc296e6d46f72dc7943fff348d82ba9d5c552a78ef0902e78f275280"}
Mar 18 09:24:58 crc kubenswrapper[4778]: I0318 09:24:58.788887 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1c91e10-5caf-4f06-89bb-c9dacc92ecef","Type":"ContainerStarted","Data":"7e71673254bb31ec9a40bba9a74a5d569998bf03658635f51dde9500e1435648"}
Mar 18 09:24:58 crc kubenswrapper[4778]: I0318 09:24:58.789372 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 18 09:24:58 crc kubenswrapper[4778]: I0318 09:24:58.825798 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.543137004 podStartE2EDuration="8.825773666s" podCreationTimestamp="2026-03-18 09:24:50 +0000 UTC" firstStartedPulling="2026-03-18 09:24:50.958376055 +0000 UTC m=+1357.533120915" lastFinishedPulling="2026-03-18 09:24:58.241012737 +0000 UTC m=+1364.815757577" observedRunningTime="2026-03-18 09:24:58.813185503 +0000 UTC m=+1365.387930383" watchObservedRunningTime="2026-03-18 09:24:58.825773666 +0000 UTC m=+1365.400518546"
Mar 18 09:25:09 crc kubenswrapper[4778]: I0318 09:25:09.072265 4778 trace.go:236] Trace[1554972338]: "Calculate volume metrics of registry-storage for pod openshift-image-registry/image-registry-66df7c8f76-8ch6j" (18-Mar-2026 09:25:07.742) (total time: 1329ms):
Mar 18 09:25:09 crc kubenswrapper[4778]: Trace[1554972338]: [1.329360951s] [1.329360951s] END
Mar 18 09:25:10 crc kubenswrapper[4778]: I0318 09:25:10.082232 4778 generic.go:334] "Generic (PLEG): container finished" podID="8348daa3-112d-49f7-93d8-3649ebf10eee" containerID="aed5c2d54c93258cf5753b658cc8a1430cb39fb4faca41384d49dcc12f51df37" exitCode=0
Mar 18 09:25:10 crc kubenswrapper[4778]: I0318 09:25:10.082288 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vhxwz" event={"ID":"8348daa3-112d-49f7-93d8-3649ebf10eee","Type":"ContainerDied","Data":"aed5c2d54c93258cf5753b658cc8a1430cb39fb4faca41384d49dcc12f51df37"}
Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.471424 4778 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vhxwz" Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.577467 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-combined-ca-bundle\") pod \"8348daa3-112d-49f7-93d8-3649ebf10eee\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.577621 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvzwc\" (UniqueName: \"kubernetes.io/projected/8348daa3-112d-49f7-93d8-3649ebf10eee-kube-api-access-cvzwc\") pod \"8348daa3-112d-49f7-93d8-3649ebf10eee\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.577704 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-config-data\") pod \"8348daa3-112d-49f7-93d8-3649ebf10eee\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.577885 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-scripts\") pod \"8348daa3-112d-49f7-93d8-3649ebf10eee\" (UID: \"8348daa3-112d-49f7-93d8-3649ebf10eee\") " Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.583618 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-scripts" (OuterVolumeSpecName: "scripts") pod "8348daa3-112d-49f7-93d8-3649ebf10eee" (UID: "8348daa3-112d-49f7-93d8-3649ebf10eee"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.585438 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8348daa3-112d-49f7-93d8-3649ebf10eee-kube-api-access-cvzwc" (OuterVolumeSpecName: "kube-api-access-cvzwc") pod "8348daa3-112d-49f7-93d8-3649ebf10eee" (UID: "8348daa3-112d-49f7-93d8-3649ebf10eee"). InnerVolumeSpecName "kube-api-access-cvzwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.611708 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8348daa3-112d-49f7-93d8-3649ebf10eee" (UID: "8348daa3-112d-49f7-93d8-3649ebf10eee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.617507 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-config-data" (OuterVolumeSpecName: "config-data") pod "8348daa3-112d-49f7-93d8-3649ebf10eee" (UID: "8348daa3-112d-49f7-93d8-3649ebf10eee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.681219 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.681306 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.681337 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvzwc\" (UniqueName: \"kubernetes.io/projected/8348daa3-112d-49f7-93d8-3649ebf10eee-kube-api-access-cvzwc\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:11 crc kubenswrapper[4778]: I0318 09:25:11.681358 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8348daa3-112d-49f7-93d8-3649ebf10eee-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.120074 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vhxwz" event={"ID":"8348daa3-112d-49f7-93d8-3649ebf10eee","Type":"ContainerDied","Data":"f5bd2fabe74afe68d7b89edc3d7934e5f231bf730e99188b91dab652b2d5599f"} Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.120892 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5bd2fabe74afe68d7b89edc3d7934e5f231bf730e99188b91dab652b2d5599f" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.121090 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vhxwz" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.274672 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 09:25:12 crc kubenswrapper[4778]: E0318 09:25:12.275300 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8348daa3-112d-49f7-93d8-3649ebf10eee" containerName="nova-cell0-conductor-db-sync" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.275392 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8348daa3-112d-49f7-93d8-3649ebf10eee" containerName="nova-cell0-conductor-db-sync" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.275653 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="8348daa3-112d-49f7-93d8-3649ebf10eee" containerName="nova-cell0-conductor-db-sync" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.276276 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.287227 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.289808 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.289884 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-pgv4m" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.402507 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fc908a0-dc90-4df9-869c-5c0820cac423-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3fc908a0-dc90-4df9-869c-5c0820cac423\") " pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 
09:25:12.402626 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt5w5\" (UniqueName: \"kubernetes.io/projected/3fc908a0-dc90-4df9-869c-5c0820cac423-kube-api-access-gt5w5\") pod \"nova-cell0-conductor-0\" (UID: \"3fc908a0-dc90-4df9-869c-5c0820cac423\") " pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.402935 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc908a0-dc90-4df9-869c-5c0820cac423-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3fc908a0-dc90-4df9-869c-5c0820cac423\") " pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.504355 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fc908a0-dc90-4df9-869c-5c0820cac423-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3fc908a0-dc90-4df9-869c-5c0820cac423\") " pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.504405 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt5w5\" (UniqueName: \"kubernetes.io/projected/3fc908a0-dc90-4df9-869c-5c0820cac423-kube-api-access-gt5w5\") pod \"nova-cell0-conductor-0\" (UID: \"3fc908a0-dc90-4df9-869c-5c0820cac423\") " pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.504510 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc908a0-dc90-4df9-869c-5c0820cac423-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3fc908a0-dc90-4df9-869c-5c0820cac423\") " pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.508744 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc908a0-dc90-4df9-869c-5c0820cac423-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3fc908a0-dc90-4df9-869c-5c0820cac423\") " pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.511133 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fc908a0-dc90-4df9-869c-5c0820cac423-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3fc908a0-dc90-4df9-869c-5c0820cac423\") " pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.530097 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt5w5\" (UniqueName: \"kubernetes.io/projected/3fc908a0-dc90-4df9-869c-5c0820cac423-kube-api-access-gt5w5\") pod \"nova-cell0-conductor-0\" (UID: \"3fc908a0-dc90-4df9-869c-5c0820cac423\") " pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:12 crc kubenswrapper[4778]: I0318 09:25:12.602700 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:13 crc kubenswrapper[4778]: I0318 09:25:13.134230 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 09:25:13 crc kubenswrapper[4778]: W0318 09:25:13.136781 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fc908a0_dc90_4df9_869c_5c0820cac423.slice/crio-eef00d170f457911eaa2258c8ba86e0253ab6e40474d31b674f9d76ea4b4194a WatchSource:0}: Error finding container eef00d170f457911eaa2258c8ba86e0253ab6e40474d31b674f9d76ea4b4194a: Status 404 returned error can't find the container with id eef00d170f457911eaa2258c8ba86e0253ab6e40474d31b674f9d76ea4b4194a Mar 18 09:25:14 crc kubenswrapper[4778]: I0318 09:25:14.147734 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3fc908a0-dc90-4df9-869c-5c0820cac423","Type":"ContainerStarted","Data":"d54334f64ea5e42752a6984d065a46f7c7ff4fcec00760b8bfa17fb3f3750ce7"} Mar 18 09:25:14 crc kubenswrapper[4778]: I0318 09:25:14.148360 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3fc908a0-dc90-4df9-869c-5c0820cac423","Type":"ContainerStarted","Data":"eef00d170f457911eaa2258c8ba86e0253ab6e40474d31b674f9d76ea4b4194a"} Mar 18 09:25:14 crc kubenswrapper[4778]: I0318 09:25:14.148712 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:14 crc kubenswrapper[4778]: I0318 09:25:14.198067 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.198010973 podStartE2EDuration="2.198010973s" podCreationTimestamp="2026-03-18 09:25:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 
09:25:14.177904046 +0000 UTC m=+1380.752648966" watchObservedRunningTime="2026-03-18 09:25:14.198010973 +0000 UTC m=+1380.772755853" Mar 18 09:25:20 crc kubenswrapper[4778]: I0318 09:25:20.413442 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 09:25:22 crc kubenswrapper[4778]: I0318 09:25:22.630268 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.224745 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-mjs29"] Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.226657 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.229808 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.229937 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.243688 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mjs29"] Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.327678 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-config-data\") pod \"nova-cell0-cell-mapping-mjs29\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.327730 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-scripts\") pod 
\"nova-cell0-cell-mapping-mjs29\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.327759 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mjs29\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.327827 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrl8g\" (UniqueName: \"kubernetes.io/projected/b89439e3-a138-4aa8-98a4-2e23ce3819e0-kube-api-access-wrl8g\") pod \"nova-cell0-cell-mapping-mjs29\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.413249 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.415878 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.430799 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mjs29\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.430893 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrl8g\" (UniqueName: \"kubernetes.io/projected/b89439e3-a138-4aa8-98a4-2e23ce3819e0-kube-api-access-wrl8g\") pod \"nova-cell0-cell-mapping-mjs29\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.430982 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-config-data\") pod \"nova-cell0-cell-mapping-mjs29\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.431014 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-scripts\") pod \"nova-cell0-cell-mapping-mjs29\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.433662 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.437883 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-scripts\") pod 
\"nova-cell0-cell-mapping-mjs29\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.441838 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-config-data\") pod \"nova-cell0-cell-mapping-mjs29\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.462469 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mjs29\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.464010 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.474776 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrl8g\" (UniqueName: \"kubernetes.io/projected/b89439e3-a138-4aa8-98a4-2e23ce3819e0-kube-api-access-wrl8g\") pod \"nova-cell0-cell-mapping-mjs29\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.537537 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/827e3d5b-c1fe-4634-b819-4d816911b71e-logs\") pod \"nova-api-0\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.537635 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/827e3d5b-c1fe-4634-b819-4d816911b71e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.537678 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rlzt\" (UniqueName: \"kubernetes.io/projected/827e3d5b-c1fe-4634-b819-4d816911b71e-kube-api-access-7rlzt\") pod \"nova-api-0\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.537727 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/827e3d5b-c1fe-4634-b819-4d816911b71e-config-data\") pod \"nova-api-0\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.553878 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.599406 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.601938 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.605755 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.614097 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.655526 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/827e3d5b-c1fe-4634-b819-4d816911b71e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.655584 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rlzt\" (UniqueName: \"kubernetes.io/projected/827e3d5b-c1fe-4634-b819-4d816911b71e-kube-api-access-7rlzt\") pod \"nova-api-0\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.655626 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/827e3d5b-c1fe-4634-b819-4d816911b71e-config-data\") pod \"nova-api-0\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.655691 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/827e3d5b-c1fe-4634-b819-4d816911b71e-logs\") pod \"nova-api-0\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.656095 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/827e3d5b-c1fe-4634-b819-4d816911b71e-logs\") pod \"nova-api-0\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.663286 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.664752 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.669279 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.676083 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-f44bz"] Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.677480 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.678602 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.709176 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/827e3d5b-c1fe-4634-b819-4d816911b71e-config-data\") pod \"nova-api-0\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.718849 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/827e3d5b-c1fe-4634-b819-4d816911b71e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.760990 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-config\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.761036 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-dns-svc\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.761068 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/326b0319-1314-4e0c-9e38-7f0358087107-config-data\") pod \"nova-scheduler-0\" (UID: \"326b0319-1314-4e0c-9e38-7f0358087107\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.761106 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9vgw\" (UniqueName: \"kubernetes.io/projected/ecb86d82-de0e-474c-9942-a8dff1f8739b-kube-api-access-s9vgw\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.761146 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dfe71cf-9dde-4056-a41b-a36c1773ace5-config-data\") pod \"nova-metadata-0\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.761179 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8dfe71cf-9dde-4056-a41b-a36c1773ace5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.761231 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dfe71cf-9dde-4056-a41b-a36c1773ace5-logs\") pod \"nova-metadata-0\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.761251 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vww5\" (UniqueName: \"kubernetes.io/projected/8dfe71cf-9dde-4056-a41b-a36c1773ace5-kube-api-access-8vww5\") pod \"nova-metadata-0\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.761289 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-ovsdbserver-nb\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.761316 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sc9l\" (UniqueName: \"kubernetes.io/projected/326b0319-1314-4e0c-9e38-7f0358087107-kube-api-access-7sc9l\") pod \"nova-scheduler-0\" (UID: \"326b0319-1314-4e0c-9e38-7f0358087107\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.761334 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-ovsdbserver-sb\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.761368 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/326b0319-1314-4e0c-9e38-7f0358087107-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"326b0319-1314-4e0c-9e38-7f0358087107\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.796466 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rlzt\" (UniqueName: \"kubernetes.io/projected/827e3d5b-c1fe-4634-b819-4d816911b71e-kube-api-access-7rlzt\") pod \"nova-api-0\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.812034 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-f44bz"] Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.843274 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.854153 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.864184 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9vgw\" (UniqueName: \"kubernetes.io/projected/ecb86d82-de0e-474c-9942-a8dff1f8739b-kube-api-access-s9vgw\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.864247 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dfe71cf-9dde-4056-a41b-a36c1773ace5-config-data\") pod \"nova-metadata-0\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.864288 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dfe71cf-9dde-4056-a41b-a36c1773ace5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.864325 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dfe71cf-9dde-4056-a41b-a36c1773ace5-logs\") pod \"nova-metadata-0\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.864343 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vww5\" (UniqueName: \"kubernetes.io/projected/8dfe71cf-9dde-4056-a41b-a36c1773ace5-kube-api-access-8vww5\") pod \"nova-metadata-0\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.864367 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-ovsdbserver-nb\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.864393 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sc9l\" (UniqueName: \"kubernetes.io/projected/326b0319-1314-4e0c-9e38-7f0358087107-kube-api-access-7sc9l\") pod \"nova-scheduler-0\" (UID: \"326b0319-1314-4e0c-9e38-7f0358087107\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.864418 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-ovsdbserver-sb\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.864434 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/326b0319-1314-4e0c-9e38-7f0358087107-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"326b0319-1314-4e0c-9e38-7f0358087107\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.864486 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-config\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.864511 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-dns-svc\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.864527 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/326b0319-1314-4e0c-9e38-7f0358087107-config-data\") pod \"nova-scheduler-0\" (UID: \"326b0319-1314-4e0c-9e38-7f0358087107\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.866293 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-ovsdbserver-nb\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.866540 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dfe71cf-9dde-4056-a41b-a36c1773ace5-logs\") pod \"nova-metadata-0\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.877527 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.877789 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-ovsdbserver-sb\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.879051 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-config\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.880907 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dfe71cf-9dde-4056-a41b-a36c1773ace5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.884807 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/326b0319-1314-4e0c-9e38-7f0358087107-config-data\") pod \"nova-scheduler-0\" (UID: \"326b0319-1314-4e0c-9e38-7f0358087107\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.890421 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.891349 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-dns-svc\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.896015 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/326b0319-1314-4e0c-9e38-7f0358087107-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"326b0319-1314-4e0c-9e38-7f0358087107\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.900186 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dfe71cf-9dde-4056-a41b-a36c1773ace5-config-data\") pod \"nova-metadata-0\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.910880 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9vgw\" (UniqueName: \"kubernetes.io/projected/ecb86d82-de0e-474c-9942-a8dff1f8739b-kube-api-access-s9vgw\") pod \"dnsmasq-dns-8b8cf6657-f44bz\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.925864 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sc9l\" (UniqueName: \"kubernetes.io/projected/326b0319-1314-4e0c-9e38-7f0358087107-kube-api-access-7sc9l\") pod \"nova-scheduler-0\" (UID: \"326b0319-1314-4e0c-9e38-7f0358087107\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.926053 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8vww5\" (UniqueName: \"kubernetes.io/projected/8dfe71cf-9dde-4056-a41b-a36c1773ace5-kube-api-access-8vww5\") pod \"nova-metadata-0\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " pod="openstack/nova-metadata-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.967400 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bdst\" (UniqueName: \"kubernetes.io/projected/81cc74f1-64bc-448f-9654-352927efbb4c-kube-api-access-8bdst\") pod \"nova-cell1-novncproxy-0\" (UID: \"81cc74f1-64bc-448f-9654-352927efbb4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.967546 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81cc74f1-64bc-448f-9654-352927efbb4c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"81cc74f1-64bc-448f-9654-352927efbb4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.967651 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81cc74f1-64bc-448f-9654-352927efbb4c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"81cc74f1-64bc-448f-9654-352927efbb4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:23 crc kubenswrapper[4778]: I0318 09:25:23.984889 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.069579 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81cc74f1-64bc-448f-9654-352927efbb4c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"81cc74f1-64bc-448f-9654-352927efbb4c\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.069710 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81cc74f1-64bc-448f-9654-352927efbb4c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"81cc74f1-64bc-448f-9654-352927efbb4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.069776 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bdst\" (UniqueName: \"kubernetes.io/projected/81cc74f1-64bc-448f-9654-352927efbb4c-kube-api-access-8bdst\") pod \"nova-cell1-novncproxy-0\" (UID: \"81cc74f1-64bc-448f-9654-352927efbb4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.082996 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81cc74f1-64bc-448f-9654-352927efbb4c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"81cc74f1-64bc-448f-9654-352927efbb4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.090501 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81cc74f1-64bc-448f-9654-352927efbb4c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"81cc74f1-64bc-448f-9654-352927efbb4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.091567 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bdst\" (UniqueName: \"kubernetes.io/projected/81cc74f1-64bc-448f-9654-352927efbb4c-kube-api-access-8bdst\") pod \"nova-cell1-novncproxy-0\" (UID: \"81cc74f1-64bc-448f-9654-352927efbb4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.138513 4778 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.151287 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.188682 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.188964 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.439425 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mjs29"] Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.606481 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.619422 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jbjb9"] Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.621003 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.623572 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.626618 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.646467 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jbjb9"] Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.705089 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.784075 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.788419 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jbjb9\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.788469 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-config-data\") pod \"nova-cell1-conductor-db-sync-jbjb9\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.788508 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldpq2\" (UniqueName: 
\"kubernetes.io/projected/e85d64a6-99af-4b66-9a60-cd6a046af840-kube-api-access-ldpq2\") pod \"nova-cell1-conductor-db-sync-jbjb9\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.788809 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-scripts\") pod \"nova-cell1-conductor-db-sync-jbjb9\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.791524 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-f44bz"] Mar 18 09:25:24 crc kubenswrapper[4778]: W0318 09:25:24.796324 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecb86d82_de0e_474c_9942_a8dff1f8739b.slice/crio-a9b184c2a1d038cecffbf8235c70616929fdbf84785e09acda62d2b9837c969e WatchSource:0}: Error finding container a9b184c2a1d038cecffbf8235c70616929fdbf84785e09acda62d2b9837c969e: Status 404 returned error can't find the container with id a9b184c2a1d038cecffbf8235c70616929fdbf84785e09acda62d2b9837c969e Mar 18 09:25:24 crc kubenswrapper[4778]: W0318 09:25:24.796645 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod326b0319_1314_4e0c_9e38_7f0358087107.slice/crio-c61d899fafa112db54c6c9a253fee0265e8eb8a1263e04ecef015a4a97256ebe WatchSource:0}: Error finding container c61d899fafa112db54c6c9a253fee0265e8eb8a1263e04ecef015a4a97256ebe: Status 404 returned error can't find the container with id c61d899fafa112db54c6c9a253fee0265e8eb8a1263e04ecef015a4a97256ebe Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.863457 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 09:25:24 crc kubenswrapper[4778]: W0318 09:25:24.863961 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81cc74f1_64bc_448f_9654_352927efbb4c.slice/crio-faecd33981e31af7bf2e1b8e7eefd8b08f1e463cade44ffd3aabfca3aa9b5d91 WatchSource:0}: Error finding container faecd33981e31af7bf2e1b8e7eefd8b08f1e463cade44ffd3aabfca3aa9b5d91: Status 404 returned error can't find the container with id faecd33981e31af7bf2e1b8e7eefd8b08f1e463cade44ffd3aabfca3aa9b5d91 Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.890010 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jbjb9\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.890058 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-config-data\") pod \"nova-cell1-conductor-db-sync-jbjb9\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.890095 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldpq2\" (UniqueName: \"kubernetes.io/projected/e85d64a6-99af-4b66-9a60-cd6a046af840-kube-api-access-ldpq2\") pod \"nova-cell1-conductor-db-sync-jbjb9\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.890170 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-scripts\") pod \"nova-cell1-conductor-db-sync-jbjb9\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.894779 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-config-data\") pod \"nova-cell1-conductor-db-sync-jbjb9\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.894809 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jbjb9\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.895656 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-scripts\") pod \"nova-cell1-conductor-db-sync-jbjb9\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:24 crc kubenswrapper[4778]: I0318 09:25:24.907433 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldpq2\" (UniqueName: \"kubernetes.io/projected/e85d64a6-99af-4b66-9a60-cd6a046af840-kube-api-access-ldpq2\") pod \"nova-cell1-conductor-db-sync-jbjb9\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:25 crc kubenswrapper[4778]: I0318 09:25:25.001686 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:25 crc kubenswrapper[4778]: E0318 09:25:25.123769 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecb86d82_de0e_474c_9942_a8dff1f8739b.slice/crio-conmon-fa41e2a5d696686a307ab26d27ef48f09b474d38859031c88ee8c3c430a37be6.scope\": RecentStats: unable to find data in memory cache]" Mar 18 09:25:25 crc kubenswrapper[4778]: I0318 09:25:25.259283 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8dfe71cf-9dde-4056-a41b-a36c1773ace5","Type":"ContainerStarted","Data":"b1ef1193696aa80f8e8dae0aeb903c8f66ca8c8c56434acbb7fe2f05ed91d084"} Mar 18 09:25:25 crc kubenswrapper[4778]: I0318 09:25:25.264499 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mjs29" event={"ID":"b89439e3-a138-4aa8-98a4-2e23ce3819e0","Type":"ContainerStarted","Data":"462bde6149a0c02e0a81e9d8cf7097470bbd3546789cdc6d2d61c3177437187e"} Mar 18 09:25:25 crc kubenswrapper[4778]: I0318 09:25:25.264553 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mjs29" event={"ID":"b89439e3-a138-4aa8-98a4-2e23ce3819e0","Type":"ContainerStarted","Data":"829ccaa44c975124734c6e174a8e09a9966fa6e5717d8a5911b2a376e783ecf2"} Mar 18 09:25:25 crc kubenswrapper[4778]: I0318 09:25:25.267475 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"827e3d5b-c1fe-4634-b819-4d816911b71e","Type":"ContainerStarted","Data":"af02077ff8be3a9cb17ac9dfdbd7efd69c18c248340c749dae5c8e266151e304"} Mar 18 09:25:25 crc kubenswrapper[4778]: I0318 09:25:25.271511 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"81cc74f1-64bc-448f-9654-352927efbb4c","Type":"ContainerStarted","Data":"faecd33981e31af7bf2e1b8e7eefd8b08f1e463cade44ffd3aabfca3aa9b5d91"} Mar 18 09:25:25 crc kubenswrapper[4778]: I0318 09:25:25.273426 4778 generic.go:334] "Generic (PLEG): container finished" podID="ecb86d82-de0e-474c-9942-a8dff1f8739b" containerID="fa41e2a5d696686a307ab26d27ef48f09b474d38859031c88ee8c3c430a37be6" exitCode=0 Mar 18 09:25:25 crc kubenswrapper[4778]: I0318 09:25:25.273570 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" event={"ID":"ecb86d82-de0e-474c-9942-a8dff1f8739b","Type":"ContainerDied","Data":"fa41e2a5d696686a307ab26d27ef48f09b474d38859031c88ee8c3c430a37be6"} Mar 18 09:25:25 crc kubenswrapper[4778]: I0318 09:25:25.273609 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" event={"ID":"ecb86d82-de0e-474c-9942-a8dff1f8739b","Type":"ContainerStarted","Data":"a9b184c2a1d038cecffbf8235c70616929fdbf84785e09acda62d2b9837c969e"} Mar 18 09:25:25 crc kubenswrapper[4778]: I0318 09:25:25.274692 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"326b0319-1314-4e0c-9e38-7f0358087107","Type":"ContainerStarted","Data":"c61d899fafa112db54c6c9a253fee0265e8eb8a1263e04ecef015a4a97256ebe"} Mar 18 09:25:25 crc kubenswrapper[4778]: I0318 09:25:25.291698 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-mjs29" podStartSLOduration=2.291673675 podStartE2EDuration="2.291673675s" podCreationTimestamp="2026-03-18 09:25:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:25:25.280708856 +0000 UTC m=+1391.855453686" watchObservedRunningTime="2026-03-18 09:25:25.291673675 +0000 UTC m=+1391.866418505" Mar 18 09:25:25 crc kubenswrapper[4778]: I0318 09:25:25.514396 4778 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jbjb9"] Mar 18 09:25:25 crc kubenswrapper[4778]: W0318 09:25:25.518050 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode85d64a6_99af_4b66_9a60_cd6a046af840.slice/crio-9118c3ac4661b3103a217848a93eccdeeeb28befd44dfdb72aa05904c8bd2a92 WatchSource:0}: Error finding container 9118c3ac4661b3103a217848a93eccdeeeb28befd44dfdb72aa05904c8bd2a92: Status 404 returned error can't find the container with id 9118c3ac4661b3103a217848a93eccdeeeb28befd44dfdb72aa05904c8bd2a92 Mar 18 09:25:26 crc kubenswrapper[4778]: I0318 09:25:26.288786 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jbjb9" event={"ID":"e85d64a6-99af-4b66-9a60-cd6a046af840","Type":"ContainerStarted","Data":"973e9f8f665d67a226625f9e044e9c18b31cbfecdc6ee8dcf02562081d63ced4"} Mar 18 09:25:26 crc kubenswrapper[4778]: I0318 09:25:26.289290 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jbjb9" event={"ID":"e85d64a6-99af-4b66-9a60-cd6a046af840","Type":"ContainerStarted","Data":"9118c3ac4661b3103a217848a93eccdeeeb28befd44dfdb72aa05904c8bd2a92"} Mar 18 09:25:26 crc kubenswrapper[4778]: I0318 09:25:26.293304 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" event={"ID":"ecb86d82-de0e-474c-9942-a8dff1f8739b","Type":"ContainerStarted","Data":"e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e"} Mar 18 09:25:26 crc kubenswrapper[4778]: I0318 09:25:26.316609 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-jbjb9" podStartSLOduration=2.316583503 podStartE2EDuration="2.316583503s" podCreationTimestamp="2026-03-18 09:25:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-18 09:25:26.302585782 +0000 UTC m=+1392.877330622" watchObservedRunningTime="2026-03-18 09:25:26.316583503 +0000 UTC m=+1392.891328343" Mar 18 09:25:26 crc kubenswrapper[4778]: I0318 09:25:26.330348 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" podStartSLOduration=3.330329868 podStartE2EDuration="3.330329868s" podCreationTimestamp="2026-03-18 09:25:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:25:26.326203766 +0000 UTC m=+1392.900948626" watchObservedRunningTime="2026-03-18 09:25:26.330329868 +0000 UTC m=+1392.905074708" Mar 18 09:25:27 crc kubenswrapper[4778]: I0318 09:25:27.301711 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:28 crc kubenswrapper[4778]: I0318 09:25:28.203630 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:28 crc kubenswrapper[4778]: I0318 09:25:28.212370 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 09:25:28 crc kubenswrapper[4778]: I0318 09:25:28.314271 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"827e3d5b-c1fe-4634-b819-4d816911b71e","Type":"ContainerStarted","Data":"3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1"} Mar 18 09:25:28 crc kubenswrapper[4778]: I0318 09:25:28.316139 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"326b0319-1314-4e0c-9e38-7f0358087107","Type":"ContainerStarted","Data":"4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc"} Mar 18 09:25:28 crc kubenswrapper[4778]: I0318 09:25:28.318288 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"8dfe71cf-9dde-4056-a41b-a36c1773ace5","Type":"ContainerStarted","Data":"83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8"} Mar 18 09:25:28 crc kubenswrapper[4778]: I0318 09:25:28.343754 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.100716173 podStartE2EDuration="5.343730991s" podCreationTimestamp="2026-03-18 09:25:23 +0000 UTC" firstStartedPulling="2026-03-18 09:25:24.802799739 +0000 UTC m=+1391.377544579" lastFinishedPulling="2026-03-18 09:25:27.045814527 +0000 UTC m=+1393.620559397" observedRunningTime="2026-03-18 09:25:28.338235451 +0000 UTC m=+1394.912980321" watchObservedRunningTime="2026-03-18 09:25:28.343730991 +0000 UTC m=+1394.918475841" Mar 18 09:25:28 crc kubenswrapper[4778]: I0318 09:25:28.414963 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 09:25:28 crc kubenswrapper[4778]: I0318 09:25:28.415235 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="45babbce-b5d2-4ad5-8bc2-a5047e777e8d" containerName="kube-state-metrics" containerID="cri-o://3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed" gracePeriod=30 Mar 18 09:25:28 crc kubenswrapper[4778]: I0318 09:25:28.928260 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 09:25:28 crc kubenswrapper[4778]: I0318 09:25:28.991915 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xklgw\" (UniqueName: \"kubernetes.io/projected/45babbce-b5d2-4ad5-8bc2-a5047e777e8d-kube-api-access-xklgw\") pod \"45babbce-b5d2-4ad5-8bc2-a5047e777e8d\" (UID: \"45babbce-b5d2-4ad5-8bc2-a5047e777e8d\") " Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.000279 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45babbce-b5d2-4ad5-8bc2-a5047e777e8d-kube-api-access-xklgw" (OuterVolumeSpecName: "kube-api-access-xklgw") pod "45babbce-b5d2-4ad5-8bc2-a5047e777e8d" (UID: "45babbce-b5d2-4ad5-8bc2-a5047e777e8d"). InnerVolumeSpecName "kube-api-access-xklgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.095251 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xklgw\" (UniqueName: \"kubernetes.io/projected/45babbce-b5d2-4ad5-8bc2-a5047e777e8d-kube-api-access-xklgw\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.152265 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.330387 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8dfe71cf-9dde-4056-a41b-a36c1773ace5","Type":"ContainerStarted","Data":"fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d"} Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.330393 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8dfe71cf-9dde-4056-a41b-a36c1773ace5" containerName="nova-metadata-log" containerID="cri-o://83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8" gracePeriod=30 Mar 
18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.330628 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8dfe71cf-9dde-4056-a41b-a36c1773ace5" containerName="nova-metadata-metadata" containerID="cri-o://fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d" gracePeriod=30 Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.339462 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"827e3d5b-c1fe-4634-b819-4d816911b71e","Type":"ContainerStarted","Data":"db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d"} Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.342422 4778 generic.go:334] "Generic (PLEG): container finished" podID="45babbce-b5d2-4ad5-8bc2-a5047e777e8d" containerID="3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed" exitCode=2 Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.342499 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.342526 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"45babbce-b5d2-4ad5-8bc2-a5047e777e8d","Type":"ContainerDied","Data":"3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed"} Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.342591 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"45babbce-b5d2-4ad5-8bc2-a5047e777e8d","Type":"ContainerDied","Data":"aea881ea048223a1a79ed7ce2d76ae73c7092387d05c9eaeaf63d09ce8b8e125"} Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.342615 4778 scope.go:117] "RemoveContainer" containerID="3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.345453 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"81cc74f1-64bc-448f-9654-352927efbb4c","Type":"ContainerStarted","Data":"e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5"} Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.345763 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="81cc74f1-64bc-448f-9654-352927efbb4c" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5" gracePeriod=30 Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.387458 4778 scope.go:117] "RemoveContainer" containerID="3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed" Mar 18 09:25:29 crc kubenswrapper[4778]: E0318 09:25:29.390042 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed\": container with ID starting 
with 3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed not found: ID does not exist" containerID="3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.390077 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed"} err="failed to get container status \"3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed\": rpc error: code = NotFound desc = could not find container \"3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed\": container with ID starting with 3b0f869f293da64d2d6fbb7066eaa662d2ec1b6c2357b120a19c06427080c1ed not found: ID does not exist" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.392946 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.276761239 podStartE2EDuration="6.39293462s" podCreationTimestamp="2026-03-18 09:25:23 +0000 UTC" firstStartedPulling="2026-03-18 09:25:24.867134112 +0000 UTC m=+1391.441878952" lastFinishedPulling="2026-03-18 09:25:27.983307493 +0000 UTC m=+1394.558052333" observedRunningTime="2026-03-18 09:25:29.387155883 +0000 UTC m=+1395.961900723" watchObservedRunningTime="2026-03-18 09:25:29.39293462 +0000 UTC m=+1395.967679460" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.395296 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.05511777 podStartE2EDuration="6.395287464s" podCreationTimestamp="2026-03-18 09:25:23 +0000 UTC" firstStartedPulling="2026-03-18 09:25:24.707508293 +0000 UTC m=+1391.282253133" lastFinishedPulling="2026-03-18 09:25:27.047677957 +0000 UTC m=+1393.622422827" observedRunningTime="2026-03-18 09:25:29.363333654 +0000 UTC m=+1395.938078514" watchObservedRunningTime="2026-03-18 09:25:29.395287464 +0000 UTC 
m=+1395.970032304" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.419091 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.987817238 podStartE2EDuration="6.419068823s" podCreationTimestamp="2026-03-18 09:25:23 +0000 UTC" firstStartedPulling="2026-03-18 09:25:24.626461016 +0000 UTC m=+1391.201205856" lastFinishedPulling="2026-03-18 09:25:27.057712601 +0000 UTC m=+1393.632457441" observedRunningTime="2026-03-18 09:25:29.412980717 +0000 UTC m=+1395.987725557" watchObservedRunningTime="2026-03-18 09:25:29.419068823 +0000 UTC m=+1395.993813663" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.444297 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.448926 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.480886 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 09:25:29 crc kubenswrapper[4778]: E0318 09:25:29.481713 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45babbce-b5d2-4ad5-8bc2-a5047e777e8d" containerName="kube-state-metrics" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.481798 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="45babbce-b5d2-4ad5-8bc2-a5047e777e8d" containerName="kube-state-metrics" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.482155 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="45babbce-b5d2-4ad5-8bc2-a5047e777e8d" containerName="kube-state-metrics" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.483136 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.490869 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.491339 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.494686 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.505120 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6w8p\" (UniqueName: \"kubernetes.io/projected/1663e1b0-f9b0-4168-9386-abf2c1b56b43-kube-api-access-r6w8p\") pod \"kube-state-metrics-0\" (UID: \"1663e1b0-f9b0-4168-9386-abf2c1b56b43\") " pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.505228 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1663e1b0-f9b0-4168-9386-abf2c1b56b43-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1663e1b0-f9b0-4168-9386-abf2c1b56b43\") " pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.505272 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1663e1b0-f9b0-4168-9386-abf2c1b56b43-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1663e1b0-f9b0-4168-9386-abf2c1b56b43\") " pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.505309 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/1663e1b0-f9b0-4168-9386-abf2c1b56b43-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1663e1b0-f9b0-4168-9386-abf2c1b56b43\") " pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.607361 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1663e1b0-f9b0-4168-9386-abf2c1b56b43-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1663e1b0-f9b0-4168-9386-abf2c1b56b43\") " pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.607478 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6w8p\" (UniqueName: \"kubernetes.io/projected/1663e1b0-f9b0-4168-9386-abf2c1b56b43-kube-api-access-r6w8p\") pod \"kube-state-metrics-0\" (UID: \"1663e1b0-f9b0-4168-9386-abf2c1b56b43\") " pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.607524 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1663e1b0-f9b0-4168-9386-abf2c1b56b43-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1663e1b0-f9b0-4168-9386-abf2c1b56b43\") " pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.607557 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1663e1b0-f9b0-4168-9386-abf2c1b56b43-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1663e1b0-f9b0-4168-9386-abf2c1b56b43\") " pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.620515 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/1663e1b0-f9b0-4168-9386-abf2c1b56b43-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1663e1b0-f9b0-4168-9386-abf2c1b56b43\") " pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.622674 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1663e1b0-f9b0-4168-9386-abf2c1b56b43-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1663e1b0-f9b0-4168-9386-abf2c1b56b43\") " pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.630460 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1663e1b0-f9b0-4168-9386-abf2c1b56b43-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1663e1b0-f9b0-4168-9386-abf2c1b56b43\") " pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.633417 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6w8p\" (UniqueName: \"kubernetes.io/projected/1663e1b0-f9b0-4168-9386-abf2c1b56b43-kube-api-access-r6w8p\") pod \"kube-state-metrics-0\" (UID: \"1663e1b0-f9b0-4168-9386-abf2c1b56b43\") " pod="openstack/kube-state-metrics-0" Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.811305 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.811836 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="ceilometer-central-agent" containerID="cri-o://c9b5655db1df35344b5ba27e4f2b48e2b1e3b0ed8aae0f66ceb4cd01dacb6163" gracePeriod=30 Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.812662 4778 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="proxy-httpd" containerID="cri-o://7e71673254bb31ec9a40bba9a74a5d569998bf03658635f51dde9500e1435648" gracePeriod=30 Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.812751 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="sg-core" containerID="cri-o://5aea936acc296e6d46f72dc7943fff348d82ba9d5c552a78ef0902e78f275280" gracePeriod=30 Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.812813 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="ceilometer-notification-agent" containerID="cri-o://5bf367bc7ca1760e31682d105fada5e12302e00c9e7370ed60b5dbbd5a0d181b" gracePeriod=30 Mar 18 09:25:29 crc kubenswrapper[4778]: I0318 09:25:29.823723 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.045310 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.134666 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dfe71cf-9dde-4056-a41b-a36c1773ace5-config-data\") pod \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.134716 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vww5\" (UniqueName: \"kubernetes.io/projected/8dfe71cf-9dde-4056-a41b-a36c1773ace5-kube-api-access-8vww5\") pod \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.134751 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dfe71cf-9dde-4056-a41b-a36c1773ace5-combined-ca-bundle\") pod \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.134832 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dfe71cf-9dde-4056-a41b-a36c1773ace5-logs\") pod \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\" (UID: \"8dfe71cf-9dde-4056-a41b-a36c1773ace5\") " Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.135658 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dfe71cf-9dde-4056-a41b-a36c1773ace5-logs" (OuterVolumeSpecName: "logs") pod "8dfe71cf-9dde-4056-a41b-a36c1773ace5" (UID: "8dfe71cf-9dde-4056-a41b-a36c1773ace5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.151626 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dfe71cf-9dde-4056-a41b-a36c1773ace5-kube-api-access-8vww5" (OuterVolumeSpecName: "kube-api-access-8vww5") pod "8dfe71cf-9dde-4056-a41b-a36c1773ace5" (UID: "8dfe71cf-9dde-4056-a41b-a36c1773ace5"). InnerVolumeSpecName "kube-api-access-8vww5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.153705 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.153739 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.175403 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dfe71cf-9dde-4056-a41b-a36c1773ace5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8dfe71cf-9dde-4056-a41b-a36c1773ace5" (UID: "8dfe71cf-9dde-4056-a41b-a36c1773ace5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.198765 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dfe71cf-9dde-4056-a41b-a36c1773ace5-config-data" (OuterVolumeSpecName: "config-data") pod "8dfe71cf-9dde-4056-a41b-a36c1773ace5" (UID: "8dfe71cf-9dde-4056-a41b-a36c1773ace5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.203562 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45babbce-b5d2-4ad5-8bc2-a5047e777e8d" path="/var/lib/kubelet/pods/45babbce-b5d2-4ad5-8bc2-a5047e777e8d/volumes" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.237741 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dfe71cf-9dde-4056-a41b-a36c1773ace5-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.237779 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vww5\" (UniqueName: \"kubernetes.io/projected/8dfe71cf-9dde-4056-a41b-a36c1773ace5-kube-api-access-8vww5\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.237791 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dfe71cf-9dde-4056-a41b-a36c1773ace5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.237800 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dfe71cf-9dde-4056-a41b-a36c1773ace5-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.357409 4778 generic.go:334] "Generic (PLEG): container finished" podID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" 
containerID="7e71673254bb31ec9a40bba9a74a5d569998bf03658635f51dde9500e1435648" exitCode=0 Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.357439 4778 generic.go:334] "Generic (PLEG): container finished" podID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerID="5aea936acc296e6d46f72dc7943fff348d82ba9d5c552a78ef0902e78f275280" exitCode=2 Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.357500 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1c91e10-5caf-4f06-89bb-c9dacc92ecef","Type":"ContainerDied","Data":"7e71673254bb31ec9a40bba9a74a5d569998bf03658635f51dde9500e1435648"} Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.357569 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1c91e10-5caf-4f06-89bb-c9dacc92ecef","Type":"ContainerDied","Data":"5aea936acc296e6d46f72dc7943fff348d82ba9d5c552a78ef0902e78f275280"} Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.359584 4778 generic.go:334] "Generic (PLEG): container finished" podID="8dfe71cf-9dde-4056-a41b-a36c1773ace5" containerID="fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d" exitCode=0 Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.359617 4778 generic.go:334] "Generic (PLEG): container finished" podID="8dfe71cf-9dde-4056-a41b-a36c1773ace5" containerID="83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8" exitCode=143 Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.359724 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8dfe71cf-9dde-4056-a41b-a36c1773ace5","Type":"ContainerDied","Data":"fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d"} Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.359800 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.359803 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8dfe71cf-9dde-4056-a41b-a36c1773ace5","Type":"ContainerDied","Data":"83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8"} Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.359885 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8dfe71cf-9dde-4056-a41b-a36c1773ace5","Type":"ContainerDied","Data":"b1ef1193696aa80f8e8dae0aeb903c8f66ca8c8c56434acbb7fe2f05ed91d084"} Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.359824 4778 scope.go:117] "RemoveContainer" containerID="fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.392868 4778 scope.go:117] "RemoveContainer" containerID="83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.396719 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.412500 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.429650 4778 scope.go:117] "RemoveContainer" containerID="fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d" Mar 18 09:25:30 crc kubenswrapper[4778]: E0318 09:25:30.432644 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d\": container with ID starting with fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d not found: ID does not exist" containerID="fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d" Mar 18 09:25:30 crc kubenswrapper[4778]: 
I0318 09:25:30.432695 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d"} err="failed to get container status \"fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d\": rpc error: code = NotFound desc = could not find container \"fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d\": container with ID starting with fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d not found: ID does not exist" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.432726 4778 scope.go:117] "RemoveContainer" containerID="83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8" Mar 18 09:25:30 crc kubenswrapper[4778]: E0318 09:25:30.435757 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8\": container with ID starting with 83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8 not found: ID does not exist" containerID="83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.435798 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8"} err="failed to get container status \"83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8\": rpc error: code = NotFound desc = could not find container \"83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8\": container with ID starting with 83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8 not found: ID does not exist" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.435822 4778 scope.go:117] "RemoveContainer" containerID="fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d" Mar 18 09:25:30 crc 
kubenswrapper[4778]: I0318 09:25:30.437435 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d"} err="failed to get container status \"fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d\": rpc error: code = NotFound desc = could not find container \"fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d\": container with ID starting with fc6e51d95f42cccca19c487fe48e59d91910dd575166cd509b97ba71c55a017d not found: ID does not exist" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.437470 4778 scope.go:117] "RemoveContainer" containerID="83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.437702 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8"} err="failed to get container status \"83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8\": rpc error: code = NotFound desc = could not find container \"83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8\": container with ID starting with 83e235c981a58dac78c8033433674bd58cb5446de93f504fe947e4c73ac199e8 not found: ID does not exist" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.438828 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:30 crc kubenswrapper[4778]: E0318 09:25:30.439280 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dfe71cf-9dde-4056-a41b-a36c1773ace5" containerName="nova-metadata-log" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.439293 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dfe71cf-9dde-4056-a41b-a36c1773ace5" containerName="nova-metadata-log" Mar 18 09:25:30 crc kubenswrapper[4778]: E0318 09:25:30.439313 4778 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8dfe71cf-9dde-4056-a41b-a36c1773ace5" containerName="nova-metadata-metadata" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.439319 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dfe71cf-9dde-4056-a41b-a36c1773ace5" containerName="nova-metadata-metadata" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.439491 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dfe71cf-9dde-4056-a41b-a36c1773ace5" containerName="nova-metadata-metadata" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.439501 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dfe71cf-9dde-4056-a41b-a36c1773ace5" containerName="nova-metadata-log" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.440519 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.453734 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.453957 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.455965 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:30 crc kubenswrapper[4778]: W0318 09:25:30.482027 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1663e1b0_f9b0_4168_9386_abf2c1b56b43.slice/crio-4161803091eb245c5d7ade5e9e9721aecc3f9745f54b9772f1e7645c5b3b9e05 WatchSource:0}: Error finding container 4161803091eb245c5d7ade5e9e9721aecc3f9745f54b9772f1e7645c5b3b9e05: Status 404 returned error can't find the container with id 4161803091eb245c5d7ade5e9e9721aecc3f9745f54b9772f1e7645c5b3b9e05 Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 
09:25:30.492372 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.544008 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.544094 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.544222 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/426016e7-8d14-4511-b963-528b9f54a8d1-logs\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.544313 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-config-data\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.544340 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54xfn\" (UniqueName: \"kubernetes.io/projected/426016e7-8d14-4511-b963-528b9f54a8d1-kube-api-access-54xfn\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " 
pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.645665 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/426016e7-8d14-4511-b963-528b9f54a8d1-logs\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.645769 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-config-data\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.645800 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54xfn\" (UniqueName: \"kubernetes.io/projected/426016e7-8d14-4511-b963-528b9f54a8d1-kube-api-access-54xfn\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.645841 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.645869 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.646386 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/426016e7-8d14-4511-b963-528b9f54a8d1-logs\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.652953 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-config-data\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.653877 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.654636 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.673401 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54xfn\" (UniqueName: \"kubernetes.io/projected/426016e7-8d14-4511-b963-528b9f54a8d1-kube-api-access-54xfn\") pod \"nova-metadata-0\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " pod="openstack/nova-metadata-0" Mar 18 09:25:30 crc kubenswrapper[4778]: I0318 09:25:30.818315 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:25:31 crc kubenswrapper[4778]: I0318 09:25:31.367705 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:31 crc kubenswrapper[4778]: I0318 09:25:31.380293 4778 generic.go:334] "Generic (PLEG): container finished" podID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerID="c9b5655db1df35344b5ba27e4f2b48e2b1e3b0ed8aae0f66ceb4cd01dacb6163" exitCode=0 Mar 18 09:25:31 crc kubenswrapper[4778]: I0318 09:25:31.380389 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1c91e10-5caf-4f06-89bb-c9dacc92ecef","Type":"ContainerDied","Data":"c9b5655db1df35344b5ba27e4f2b48e2b1e3b0ed8aae0f66ceb4cd01dacb6163"} Mar 18 09:25:31 crc kubenswrapper[4778]: I0318 09:25:31.381556 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"426016e7-8d14-4511-b963-528b9f54a8d1","Type":"ContainerStarted","Data":"d558c548a049774135a4b2809d2aa33c1a8e339abe7d548e3631a7e27ee2c8c4"} Mar 18 09:25:31 crc kubenswrapper[4778]: I0318 09:25:31.383016 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1663e1b0-f9b0-4168-9386-abf2c1b56b43","Type":"ContainerStarted","Data":"083b98fb08e327ebd6216ea2b2ce175fbeb359e00c495b5782bf6e305325cab7"} Mar 18 09:25:31 crc kubenswrapper[4778]: I0318 09:25:31.383063 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1663e1b0-f9b0-4168-9386-abf2c1b56b43","Type":"ContainerStarted","Data":"4161803091eb245c5d7ade5e9e9721aecc3f9745f54b9772f1e7645c5b3b9e05"} Mar 18 09:25:31 crc kubenswrapper[4778]: I0318 09:25:31.383345 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 09:25:31 crc kubenswrapper[4778]: I0318 09:25:31.417133 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/kube-state-metrics-0" podStartSLOduration=2.0401437590000002 podStartE2EDuration="2.417105327s" podCreationTimestamp="2026-03-18 09:25:29 +0000 UTC" firstStartedPulling="2026-03-18 09:25:30.487327361 +0000 UTC m=+1397.062072201" lastFinishedPulling="2026-03-18 09:25:30.864288909 +0000 UTC m=+1397.439033769" observedRunningTime="2026-03-18 09:25:31.403124796 +0000 UTC m=+1397.977869636" watchObservedRunningTime="2026-03-18 09:25:31.417105327 +0000 UTC m=+1397.991850167" Mar 18 09:25:32 crc kubenswrapper[4778]: I0318 09:25:32.200915 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dfe71cf-9dde-4056-a41b-a36c1773ace5" path="/var/lib/kubelet/pods/8dfe71cf-9dde-4056-a41b-a36c1773ace5/volumes" Mar 18 09:25:32 crc kubenswrapper[4778]: I0318 09:25:32.394165 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"426016e7-8d14-4511-b963-528b9f54a8d1","Type":"ContainerStarted","Data":"29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090"} Mar 18 09:25:32 crc kubenswrapper[4778]: I0318 09:25:32.394250 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"426016e7-8d14-4511-b963-528b9f54a8d1","Type":"ContainerStarted","Data":"7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea"} Mar 18 09:25:32 crc kubenswrapper[4778]: I0318 09:25:32.397637 4778 generic.go:334] "Generic (PLEG): container finished" podID="b89439e3-a138-4aa8-98a4-2e23ce3819e0" containerID="462bde6149a0c02e0a81e9d8cf7097470bbd3546789cdc6d2d61c3177437187e" exitCode=0 Mar 18 09:25:32 crc kubenswrapper[4778]: I0318 09:25:32.398121 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mjs29" event={"ID":"b89439e3-a138-4aa8-98a4-2e23ce3819e0","Type":"ContainerDied","Data":"462bde6149a0c02e0a81e9d8cf7097470bbd3546789cdc6d2d61c3177437187e"} Mar 18 09:25:32 crc kubenswrapper[4778]: I0318 09:25:32.455716 4778 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.455681008 podStartE2EDuration="2.455681008s" podCreationTimestamp="2026-03-18 09:25:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:25:32.440073642 +0000 UTC m=+1399.014818512" watchObservedRunningTime="2026-03-18 09:25:32.455681008 +0000 UTC m=+1399.030425858" Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.416558 4778 generic.go:334] "Generic (PLEG): container finished" podID="e85d64a6-99af-4b66-9a60-cd6a046af840" containerID="973e9f8f665d67a226625f9e044e9c18b31cbfecdc6ee8dcf02562081d63ced4" exitCode=0 Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.416677 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jbjb9" event={"ID":"e85d64a6-99af-4b66-9a60-cd6a046af840","Type":"ContainerDied","Data":"973e9f8f665d67a226625f9e044e9c18b31cbfecdc6ee8dcf02562081d63ced4"} Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.803747 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.820895 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrl8g\" (UniqueName: \"kubernetes.io/projected/b89439e3-a138-4aa8-98a4-2e23ce3819e0-kube-api-access-wrl8g\") pod \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.820953 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-combined-ca-bundle\") pod \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.821051 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-config-data\") pod \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.821083 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-scripts\") pod \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\" (UID: \"b89439e3-a138-4aa8-98a4-2e23ce3819e0\") " Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.833057 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-scripts" (OuterVolumeSpecName: "scripts") pod "b89439e3-a138-4aa8-98a4-2e23ce3819e0" (UID: "b89439e3-a138-4aa8-98a4-2e23ce3819e0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.833496 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b89439e3-a138-4aa8-98a4-2e23ce3819e0-kube-api-access-wrl8g" (OuterVolumeSpecName: "kube-api-access-wrl8g") pod "b89439e3-a138-4aa8-98a4-2e23ce3819e0" (UID: "b89439e3-a138-4aa8-98a4-2e23ce3819e0"). InnerVolumeSpecName "kube-api-access-wrl8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.879010 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b89439e3-a138-4aa8-98a4-2e23ce3819e0" (UID: "b89439e3-a138-4aa8-98a4-2e23ce3819e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.879652 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-config-data" (OuterVolumeSpecName: "config-data") pod "b89439e3-a138-4aa8-98a4-2e23ce3819e0" (UID: "b89439e3-a138-4aa8-98a4-2e23ce3819e0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.891892 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.892007 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.925920 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.925995 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.926030 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrl8g\" (UniqueName: \"kubernetes.io/projected/b89439e3-a138-4aa8-98a4-2e23ce3819e0-kube-api-access-wrl8g\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:33 crc kubenswrapper[4778]: I0318 09:25:33.926060 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89439e3-a138-4aa8-98a4-2e23ce3819e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.152760 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.201091 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.201144 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:34 crc 
kubenswrapper[4778]: I0318 09:25:34.205758 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.343097 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-9xln9"] Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.343403 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58db5546cc-9xln9" podUID="06a6d934-2f47-4628-a328-6ba9cefb8090" containerName="dnsmasq-dns" containerID="cri-o://623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42" gracePeriod=10 Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.430533 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mjs29" event={"ID":"b89439e3-a138-4aa8-98a4-2e23ce3819e0","Type":"ContainerDied","Data":"829ccaa44c975124734c6e174a8e09a9966fa6e5717d8a5911b2a376e783ecf2"} Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.430633 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mjs29" Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.431003 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="829ccaa44c975124734c6e174a8e09a9966fa6e5717d8a5911b2a376e783ecf2" Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.520141 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.652854 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.677554 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.677830 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="426016e7-8d14-4511-b963-528b9f54a8d1" containerName="nova-metadata-log" containerID="cri-o://7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea" gracePeriod=30 Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.678351 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="426016e7-8d14-4511-b963-528b9f54a8d1" containerName="nova-metadata-metadata" containerID="cri-o://29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090" gracePeriod=30 Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.974493 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="827e3d5b-c1fe-4634-b819-4d816911b71e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.183:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 09:25:34 crc kubenswrapper[4778]: I0318 09:25:34.975022 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="827e3d5b-c1fe-4634-b819-4d816911b71e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.183:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.129614 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.154260 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.176946 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.360025 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-config-data\") pod \"e85d64a6-99af-4b66-9a60-cd6a046af840\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.360109 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-combined-ca-bundle\") pod \"e85d64a6-99af-4b66-9a60-cd6a046af840\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.360255 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-dns-svc\") pod \"06a6d934-2f47-4628-a328-6ba9cefb8090\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.360294 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-ovsdbserver-sb\") pod \"06a6d934-2f47-4628-a328-6ba9cefb8090\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.360316 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-ovsdbserver-nb\") pod \"06a6d934-2f47-4628-a328-6ba9cefb8090\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.360468 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-scripts\") pod \"e85d64a6-99af-4b66-9a60-cd6a046af840\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.360622 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-config\") pod \"06a6d934-2f47-4628-a328-6ba9cefb8090\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.360743 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hck6z\" (UniqueName: \"kubernetes.io/projected/06a6d934-2f47-4628-a328-6ba9cefb8090-kube-api-access-hck6z\") pod \"06a6d934-2f47-4628-a328-6ba9cefb8090\" (UID: \"06a6d934-2f47-4628-a328-6ba9cefb8090\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.360775 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldpq2\" (UniqueName: \"kubernetes.io/projected/e85d64a6-99af-4b66-9a60-cd6a046af840-kube-api-access-ldpq2\") pod \"e85d64a6-99af-4b66-9a60-cd6a046af840\" (UID: \"e85d64a6-99af-4b66-9a60-cd6a046af840\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 
09:25:35.396499 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-scripts" (OuterVolumeSpecName: "scripts") pod "e85d64a6-99af-4b66-9a60-cd6a046af840" (UID: "e85d64a6-99af-4b66-9a60-cd6a046af840"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.396534 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06a6d934-2f47-4628-a328-6ba9cefb8090-kube-api-access-hck6z" (OuterVolumeSpecName: "kube-api-access-hck6z") pod "06a6d934-2f47-4628-a328-6ba9cefb8090" (UID: "06a6d934-2f47-4628-a328-6ba9cefb8090"). InnerVolumeSpecName "kube-api-access-hck6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.396607 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e85d64a6-99af-4b66-9a60-cd6a046af840-kube-api-access-ldpq2" (OuterVolumeSpecName: "kube-api-access-ldpq2") pod "e85d64a6-99af-4b66-9a60-cd6a046af840" (UID: "e85d64a6-99af-4b66-9a60-cd6a046af840"). InnerVolumeSpecName "kube-api-access-ldpq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.416068 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-config-data" (OuterVolumeSpecName: "config-data") pod "e85d64a6-99af-4b66-9a60-cd6a046af840" (UID: "e85d64a6-99af-4b66-9a60-cd6a046af840"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.419559 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.432345 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e85d64a6-99af-4b66-9a60-cd6a046af840" (UID: "e85d64a6-99af-4b66-9a60-cd6a046af840"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.466183 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "06a6d934-2f47-4628-a328-6ba9cefb8090" (UID: "06a6d934-2f47-4628-a328-6ba9cefb8090"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.467459 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.467493 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hck6z\" (UniqueName: \"kubernetes.io/projected/06a6d934-2f47-4628-a328-6ba9cefb8090-kube-api-access-hck6z\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.467506 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldpq2\" (UniqueName: \"kubernetes.io/projected/e85d64a6-99af-4b66-9a60-cd6a046af840-kube-api-access-ldpq2\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.467516 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-config-data\") on node \"crc\" DevicePath \"\"" 
Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.467525 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85d64a6-99af-4b66-9a60-cd6a046af840-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.467533 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.472894 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "06a6d934-2f47-4628-a328-6ba9cefb8090" (UID: "06a6d934-2f47-4628-a328-6ba9cefb8090"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.474489 4778 generic.go:334] "Generic (PLEG): container finished" podID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerID="5bf367bc7ca1760e31682d105fada5e12302e00c9e7370ed60b5dbbd5a0d181b" exitCode=0 Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.474570 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1c91e10-5caf-4f06-89bb-c9dacc92ecef","Type":"ContainerDied","Data":"5bf367bc7ca1760e31682d105fada5e12302e00c9e7370ed60b5dbbd5a0d181b"} Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.477399 4778 generic.go:334] "Generic (PLEG): container finished" podID="426016e7-8d14-4511-b963-528b9f54a8d1" containerID="29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090" exitCode=0 Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.477422 4778 generic.go:334] "Generic (PLEG): container finished" podID="426016e7-8d14-4511-b963-528b9f54a8d1" 
containerID="7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea" exitCode=143 Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.477464 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"426016e7-8d14-4511-b963-528b9f54a8d1","Type":"ContainerDied","Data":"29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090"} Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.477482 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"426016e7-8d14-4511-b963-528b9f54a8d1","Type":"ContainerDied","Data":"7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea"} Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.477495 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"426016e7-8d14-4511-b963-528b9f54a8d1","Type":"ContainerDied","Data":"d558c548a049774135a4b2809d2aa33c1a8e339abe7d548e3631a7e27ee2c8c4"} Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.477516 4778 scope.go:117] "RemoveContainer" containerID="29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.477670 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.479707 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-config" (OuterVolumeSpecName: "config") pod "06a6d934-2f47-4628-a328-6ba9cefb8090" (UID: "06a6d934-2f47-4628-a328-6ba9cefb8090"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.483639 4778 generic.go:334] "Generic (PLEG): container finished" podID="06a6d934-2f47-4628-a328-6ba9cefb8090" containerID="623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42" exitCode=0 Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.483715 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-9xln9" event={"ID":"06a6d934-2f47-4628-a328-6ba9cefb8090","Type":"ContainerDied","Data":"623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42"} Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.483811 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-9xln9" event={"ID":"06a6d934-2f47-4628-a328-6ba9cefb8090","Type":"ContainerDied","Data":"ff4d7228da7d0afd7376861dde18209e6f8e60848f76f48755825c7cbc2d2227"} Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.483920 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-9xln9" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.486352 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="827e3d5b-c1fe-4634-b819-4d816911b71e" containerName="nova-api-log" containerID="cri-o://3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1" gracePeriod=30 Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.486718 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jbjb9" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.495119 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jbjb9" event={"ID":"e85d64a6-99af-4b66-9a60-cd6a046af840","Type":"ContainerDied","Data":"9118c3ac4661b3103a217848a93eccdeeeb28befd44dfdb72aa05904c8bd2a92"} Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.495144 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9118c3ac4661b3103a217848a93eccdeeeb28befd44dfdb72aa05904c8bd2a92" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.495397 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="827e3d5b-c1fe-4634-b819-4d816911b71e" containerName="nova-api-api" containerID="cri-o://db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d" gracePeriod=30 Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.507626 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "06a6d934-2f47-4628-a328-6ba9cefb8090" (UID: "06a6d934-2f47-4628-a328-6ba9cefb8090"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.533219 4778 scope.go:117] "RemoveContainer" containerID="7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.564336 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 09:25:35 crc kubenswrapper[4778]: E0318 09:25:35.564764 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e85d64a6-99af-4b66-9a60-cd6a046af840" containerName="nova-cell1-conductor-db-sync" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.564778 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e85d64a6-99af-4b66-9a60-cd6a046af840" containerName="nova-cell1-conductor-db-sync" Mar 18 09:25:35 crc kubenswrapper[4778]: E0318 09:25:35.564799 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="426016e7-8d14-4511-b963-528b9f54a8d1" containerName="nova-metadata-metadata" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.564806 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="426016e7-8d14-4511-b963-528b9f54a8d1" containerName="nova-metadata-metadata" Mar 18 09:25:35 crc kubenswrapper[4778]: E0318 09:25:35.564822 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89439e3-a138-4aa8-98a4-2e23ce3819e0" containerName="nova-manage" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.564828 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89439e3-a138-4aa8-98a4-2e23ce3819e0" containerName="nova-manage" Mar 18 09:25:35 crc kubenswrapper[4778]: E0318 09:25:35.564840 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06a6d934-2f47-4628-a328-6ba9cefb8090" containerName="init" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.564845 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="06a6d934-2f47-4628-a328-6ba9cefb8090" containerName="init" Mar 18 
09:25:35 crc kubenswrapper[4778]: E0318 09:25:35.564859 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06a6d934-2f47-4628-a328-6ba9cefb8090" containerName="dnsmasq-dns" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.564865 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="06a6d934-2f47-4628-a328-6ba9cefb8090" containerName="dnsmasq-dns" Mar 18 09:25:35 crc kubenswrapper[4778]: E0318 09:25:35.564873 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="426016e7-8d14-4511-b963-528b9f54a8d1" containerName="nova-metadata-log" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.564879 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="426016e7-8d14-4511-b963-528b9f54a8d1" containerName="nova-metadata-log" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.565061 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="426016e7-8d14-4511-b963-528b9f54a8d1" containerName="nova-metadata-log" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.565072 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="426016e7-8d14-4511-b963-528b9f54a8d1" containerName="nova-metadata-metadata" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.565079 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e85d64a6-99af-4b66-9a60-cd6a046af840" containerName="nova-cell1-conductor-db-sync" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.565093 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="06a6d934-2f47-4628-a328-6ba9cefb8090" containerName="dnsmasq-dns" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.565104 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b89439e3-a138-4aa8-98a4-2e23ce3819e0" containerName="nova-manage" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.565700 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.565812 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.568919 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-config-data\") pod \"426016e7-8d14-4511-b963-528b9f54a8d1\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.569012 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/426016e7-8d14-4511-b963-528b9f54a8d1-logs\") pod \"426016e7-8d14-4511-b963-528b9f54a8d1\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.569111 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-combined-ca-bundle\") pod \"426016e7-8d14-4511-b963-528b9f54a8d1\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.569138 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.569188 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-nova-metadata-tls-certs\") pod \"426016e7-8d14-4511-b963-528b9f54a8d1\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.569298 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54xfn\" (UniqueName: 
\"kubernetes.io/projected/426016e7-8d14-4511-b963-528b9f54a8d1-kube-api-access-54xfn\") pod \"426016e7-8d14-4511-b963-528b9f54a8d1\" (UID: \"426016e7-8d14-4511-b963-528b9f54a8d1\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.569720 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.569738 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.569748 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06a6d934-2f47-4628-a328-6ba9cefb8090-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.569714 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/426016e7-8d14-4511-b963-528b9f54a8d1-logs" (OuterVolumeSpecName: "logs") pod "426016e7-8d14-4511-b963-528b9f54a8d1" (UID: "426016e7-8d14-4511-b963-528b9f54a8d1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.582088 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/426016e7-8d14-4511-b963-528b9f54a8d1-kube-api-access-54xfn" (OuterVolumeSpecName: "kube-api-access-54xfn") pod "426016e7-8d14-4511-b963-528b9f54a8d1" (UID: "426016e7-8d14-4511-b963-528b9f54a8d1"). InnerVolumeSpecName "kube-api-access-54xfn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.606389 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-config-data" (OuterVolumeSpecName: "config-data") pod "426016e7-8d14-4511-b963-528b9f54a8d1" (UID: "426016e7-8d14-4511-b963-528b9f54a8d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.626598 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "426016e7-8d14-4511-b963-528b9f54a8d1" (UID: "426016e7-8d14-4511-b963-528b9f54a8d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.637966 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "426016e7-8d14-4511-b963-528b9f54a8d1" (UID: "426016e7-8d14-4511-b963-528b9f54a8d1"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.670719 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdgh6\" (UniqueName: \"kubernetes.io/projected/9ba2a389-4009-4dab-bc75-45a574e50bbc-kube-api-access-bdgh6\") pod \"nova-cell1-conductor-0\" (UID: \"9ba2a389-4009-4dab-bc75-45a574e50bbc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.670780 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba2a389-4009-4dab-bc75-45a574e50bbc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9ba2a389-4009-4dab-bc75-45a574e50bbc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.670887 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba2a389-4009-4dab-bc75-45a574e50bbc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9ba2a389-4009-4dab-bc75-45a574e50bbc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.670949 4778 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.670967 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54xfn\" (UniqueName: \"kubernetes.io/projected/426016e7-8d14-4511-b963-528b9f54a8d1-kube-api-access-54xfn\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.670977 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.670986 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/426016e7-8d14-4511-b963-528b9f54a8d1-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.670995 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/426016e7-8d14-4511-b963-528b9f54a8d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.671740 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.678904 4778 scope.go:117] "RemoveContainer" containerID="29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090" Mar 18 09:25:35 crc kubenswrapper[4778]: E0318 09:25:35.679496 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090\": container with ID starting with 29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090 not found: ID does not exist" containerID="29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.679568 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090"} err="failed to get container status \"29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090\": rpc error: code = NotFound desc = could not find container \"29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090\": container with ID starting with 
29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090 not found: ID does not exist" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.679608 4778 scope.go:117] "RemoveContainer" containerID="7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea" Mar 18 09:25:35 crc kubenswrapper[4778]: E0318 09:25:35.679994 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea\": container with ID starting with 7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea not found: ID does not exist" containerID="7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.680043 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea"} err="failed to get container status \"7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea\": rpc error: code = NotFound desc = could not find container \"7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea\": container with ID starting with 7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea not found: ID does not exist" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.680074 4778 scope.go:117] "RemoveContainer" containerID="29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.681891 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090"} err="failed to get container status \"29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090\": rpc error: code = NotFound desc = could not find container \"29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090\": container with ID 
starting with 29e22e5ba1d613da524821cf410534247fa35aff576f8a34993d08ca5ee1c090 not found: ID does not exist" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.681918 4778 scope.go:117] "RemoveContainer" containerID="7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.682228 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea"} err="failed to get container status \"7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea\": rpc error: code = NotFound desc = could not find container \"7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea\": container with ID starting with 7fa4a4f46aefa7370012570f258fddebeff14954fd31e8d2f414ddfc3b85a9ea not found: ID does not exist" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.682257 4778 scope.go:117] "RemoveContainer" containerID="623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.733075 4778 scope.go:117] "RemoveContainer" containerID="9fbf810ce724087a2a375ddcba5620405d186ece5213d114d3c1ef2c5e5fc4e6" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.773663 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-combined-ca-bundle\") pod \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.773761 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-sg-core-conf-yaml\") pod \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " Mar 18 09:25:35 crc kubenswrapper[4778]: 
I0318 09:25:35.773784 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-log-httpd\") pod \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.773854 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-run-httpd\") pod \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.773879 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92rb8\" (UniqueName: \"kubernetes.io/projected/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-kube-api-access-92rb8\") pod \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.773968 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-scripts\") pod \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.774011 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-config-data\") pod \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\" (UID: \"d1c91e10-5caf-4f06-89bb-c9dacc92ecef\") " Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.774337 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba2a389-4009-4dab-bc75-45a574e50bbc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: 
\"9ba2a389-4009-4dab-bc75-45a574e50bbc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.774448 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdgh6\" (UniqueName: \"kubernetes.io/projected/9ba2a389-4009-4dab-bc75-45a574e50bbc-kube-api-access-bdgh6\") pod \"nova-cell1-conductor-0\" (UID: \"9ba2a389-4009-4dab-bc75-45a574e50bbc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.774494 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba2a389-4009-4dab-bc75-45a574e50bbc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9ba2a389-4009-4dab-bc75-45a574e50bbc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.775529 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d1c91e10-5caf-4f06-89bb-c9dacc92ecef" (UID: "d1c91e10-5caf-4f06-89bb-c9dacc92ecef"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.775592 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d1c91e10-5caf-4f06-89bb-c9dacc92ecef" (UID: "d1c91e10-5caf-4f06-89bb-c9dacc92ecef"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.779832 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-kube-api-access-92rb8" (OuterVolumeSpecName: "kube-api-access-92rb8") pod "d1c91e10-5caf-4f06-89bb-c9dacc92ecef" (UID: "d1c91e10-5caf-4f06-89bb-c9dacc92ecef"). InnerVolumeSpecName "kube-api-access-92rb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.779976 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-scripts" (OuterVolumeSpecName: "scripts") pod "d1c91e10-5caf-4f06-89bb-c9dacc92ecef" (UID: "d1c91e10-5caf-4f06-89bb-c9dacc92ecef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.782584 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba2a389-4009-4dab-bc75-45a574e50bbc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9ba2a389-4009-4dab-bc75-45a574e50bbc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.790969 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba2a389-4009-4dab-bc75-45a574e50bbc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9ba2a389-4009-4dab-bc75-45a574e50bbc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.800244 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdgh6\" (UniqueName: \"kubernetes.io/projected/9ba2a389-4009-4dab-bc75-45a574e50bbc-kube-api-access-bdgh6\") pod \"nova-cell1-conductor-0\" (UID: 
\"9ba2a389-4009-4dab-bc75-45a574e50bbc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.808715 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d1c91e10-5caf-4f06-89bb-c9dacc92ecef" (UID: "d1c91e10-5caf-4f06-89bb-c9dacc92ecef"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.870409 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1c91e10-5caf-4f06-89bb-c9dacc92ecef" (UID: "d1c91e10-5caf-4f06-89bb-c9dacc92ecef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.877723 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.877747 4778 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.877783 4778 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.877793 4778 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.877802 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92rb8\" (UniqueName: \"kubernetes.io/projected/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-kube-api-access-92rb8\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.877812 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.924292 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-config-data" (OuterVolumeSpecName: "config-data") pod "d1c91e10-5caf-4f06-89bb-c9dacc92ecef" (UID: "d1c91e10-5caf-4f06-89bb-c9dacc92ecef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.979888 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1c91e10-5caf-4f06-89bb-c9dacc92ecef-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.984997 4778 scope.go:117] "RemoveContainer" containerID="623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42" Mar 18 09:25:35 crc kubenswrapper[4778]: E0318 09:25:35.985423 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42\": container with ID starting with 623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42 not found: ID does not exist" containerID="623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.985491 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42"} err="failed to get container status \"623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42\": rpc error: code = NotFound desc = could not find container \"623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42\": container with ID starting with 623370dad38cfbb91304725a1cc29388ba04c1275ccbefcfa45f5ebd61ab7f42 not found: ID does not exist" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.985549 4778 scope.go:117] "RemoveContainer" containerID="9fbf810ce724087a2a375ddcba5620405d186ece5213d114d3c1ef2c5e5fc4e6" Mar 18 09:25:35 crc kubenswrapper[4778]: E0318 09:25:35.985876 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fbf810ce724087a2a375ddcba5620405d186ece5213d114d3c1ef2c5e5fc4e6\": container 
with ID starting with 9fbf810ce724087a2a375ddcba5620405d186ece5213d114d3c1ef2c5e5fc4e6 not found: ID does not exist" containerID="9fbf810ce724087a2a375ddcba5620405d186ece5213d114d3c1ef2c5e5fc4e6" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.985905 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fbf810ce724087a2a375ddcba5620405d186ece5213d114d3c1ef2c5e5fc4e6"} err="failed to get container status \"9fbf810ce724087a2a375ddcba5620405d186ece5213d114d3c1ef2c5e5fc4e6\": rpc error: code = NotFound desc = could not find container \"9fbf810ce724087a2a375ddcba5620405d186ece5213d114d3c1ef2c5e5fc4e6\": container with ID starting with 9fbf810ce724087a2a375ddcba5620405d186ece5213d114d3c1ef2c5e5fc4e6 not found: ID does not exist" Mar 18 09:25:35 crc kubenswrapper[4778]: I0318 09:25:35.990824 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.014912 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.055349 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.079834 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:36 crc kubenswrapper[4778]: E0318 09:25:36.080390 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="proxy-httpd" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.080411 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="proxy-httpd" Mar 18 09:25:36 crc kubenswrapper[4778]: E0318 09:25:36.080432 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" 
containerName="ceilometer-notification-agent" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.080441 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="ceilometer-notification-agent" Mar 18 09:25:36 crc kubenswrapper[4778]: E0318 09:25:36.080465 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="ceilometer-central-agent" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.080476 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="ceilometer-central-agent" Mar 18 09:25:36 crc kubenswrapper[4778]: E0318 09:25:36.080488 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="sg-core" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.080494 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="sg-core" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.080690 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="proxy-httpd" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.080709 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="ceilometer-notification-agent" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.080715 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="ceilometer-central-agent" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.080731 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" containerName="sg-core" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.082426 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.087926 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.089166 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.094267 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-9xln9"] Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.113674 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-9xln9"] Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.123876 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.187328 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ljtb\" (UniqueName: \"kubernetes.io/projected/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-kube-api-access-7ljtb\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.187366 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-config-data\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.187425 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.187450 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-logs\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.187494 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.202967 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06a6d934-2f47-4628-a328-6ba9cefb8090" path="/var/lib/kubelet/pods/06a6d934-2f47-4628-a328-6ba9cefb8090/volumes" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.203720 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="426016e7-8d14-4511-b963-528b9f54a8d1" path="/var/lib/kubelet/pods/426016e7-8d14-4511-b963-528b9f54a8d1/volumes" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.289510 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-config-data\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.289704 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.289761 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-logs\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.289870 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.290012 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ljtb\" (UniqueName: \"kubernetes.io/projected/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-kube-api-access-7ljtb\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.293664 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-logs\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.303423 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.304491 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-config-data\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.306616 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.314346 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ljtb\" (UniqueName: \"kubernetes.io/projected/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-kube-api-access-7ljtb\") pod \"nova-metadata-0\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") " pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.403746 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.504014 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.504509 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1c91e10-5caf-4f06-89bb-c9dacc92ecef","Type":"ContainerDied","Data":"f02f2bc76544bcc97a0d67474e80bad569ca14d09e55a75f25e0c054c4e8be41"} Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.504553 4778 scope.go:117] "RemoveContainer" containerID="7e71673254bb31ec9a40bba9a74a5d569998bf03658635f51dde9500e1435648" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.541657 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.566601 4778 generic.go:334] "Generic (PLEG): container finished" podID="827e3d5b-c1fe-4634-b819-4d816911b71e" containerID="3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1" exitCode=143 Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.566910 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="326b0319-1314-4e0c-9e38-7f0358087107" containerName="nova-scheduler-scheduler" containerID="cri-o://4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc" gracePeriod=30 Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.567377 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"827e3d5b-c1fe-4634-b819-4d816911b71e","Type":"ContainerDied","Data":"3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1"} Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.594421 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.596951 4778 scope.go:117] "RemoveContainer" containerID="5aea936acc296e6d46f72dc7943fff348d82ba9d5c552a78ef0902e78f275280" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.649281 4778 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.657760 4778 scope.go:117] "RemoveContainer" containerID="5bf367bc7ca1760e31682d105fada5e12302e00c9e7370ed60b5dbbd5a0d181b" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.663884 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.667275 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.669883 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.670967 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.671572 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.680317 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.702893 4778 scope.go:117] "RemoveContainer" containerID="c9b5655db1df35344b5ba27e4f2b48e2b1e3b0ed8aae0f66ceb4cd01dacb6163" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.747973 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.748030 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml984\" (UniqueName: 
\"kubernetes.io/projected/f997c05f-82b3-4d82-859d-b02f458e355d-kube-api-access-ml984\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.748055 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.748137 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-config-data\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.748156 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.748243 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f997c05f-82b3-4d82-859d-b02f458e355d-log-httpd\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.748312 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-scripts\") pod \"ceilometer-0\" (UID: 
\"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.748333 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f997c05f-82b3-4d82-859d-b02f458e355d-run-httpd\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.849540 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f997c05f-82b3-4d82-859d-b02f458e355d-log-httpd\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.850253 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-scripts\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.850282 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f997c05f-82b3-4d82-859d-b02f458e355d-run-httpd\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.850321 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.850358 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ml984\" (UniqueName: \"kubernetes.io/projected/f997c05f-82b3-4d82-859d-b02f458e355d-kube-api-access-ml984\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.850383 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.850427 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-config-data\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.850853 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.852735 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f997c05f-82b3-4d82-859d-b02f458e355d-run-httpd\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.852826 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f997c05f-82b3-4d82-859d-b02f458e355d-log-httpd\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 
crc kubenswrapper[4778]: I0318 09:25:36.858657 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.858669 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-scripts\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.861943 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.866523 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-config-data\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.875955 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml984\" (UniqueName: \"kubernetes.io/projected/f997c05f-82b3-4d82-859d-b02f458e355d-kube-api-access-ml984\") pod \"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.877900 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") " pod="openstack/ceilometer-0" Mar 18 09:25:36 crc kubenswrapper[4778]: I0318 09:25:36.918748 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:25:37 crc kubenswrapper[4778]: I0318 09:25:37.002307 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:25:37 crc kubenswrapper[4778]: I0318 09:25:37.530832 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:25:37 crc kubenswrapper[4778]: W0318 09:25:37.531870 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf997c05f_82b3_4d82_859d_b02f458e355d.slice/crio-24881499c24b431bd6cb35e91dd94903665892ef9cb2374a0f2fab97787e21e9 WatchSource:0}: Error finding container 24881499c24b431bd6cb35e91dd94903665892ef9cb2374a0f2fab97787e21e9: Status 404 returned error can't find the container with id 24881499c24b431bd6cb35e91dd94903665892ef9cb2374a0f2fab97787e21e9 Mar 18 09:25:37 crc kubenswrapper[4778]: I0318 09:25:37.578283 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f997c05f-82b3-4d82-859d-b02f458e355d","Type":"ContainerStarted","Data":"24881499c24b431bd6cb35e91dd94903665892ef9cb2374a0f2fab97787e21e9"} Mar 18 09:25:37 crc kubenswrapper[4778]: I0318 09:25:37.579566 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9ba2a389-4009-4dab-bc75-45a574e50bbc","Type":"ContainerStarted","Data":"5cf0eb007f476e62593e4a0a2cc9a2ff0d13c9d31495cd46cfc6ebd1df79f55e"} Mar 18 09:25:37 crc kubenswrapper[4778]: I0318 09:25:37.579606 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"9ba2a389-4009-4dab-bc75-45a574e50bbc","Type":"ContainerStarted","Data":"da1dc5fd19dc3a3a37b90396161aa1a4b3b7f5930176884f084e9573161a7b03"} Mar 18 09:25:37 crc kubenswrapper[4778]: I0318 09:25:37.580778 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:37 crc kubenswrapper[4778]: I0318 09:25:37.582395 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e","Type":"ContainerStarted","Data":"b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1"} Mar 18 09:25:37 crc kubenswrapper[4778]: I0318 09:25:37.582437 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e","Type":"ContainerStarted","Data":"1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2"} Mar 18 09:25:37 crc kubenswrapper[4778]: I0318 09:25:37.582451 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e","Type":"ContainerStarted","Data":"290b792dbc94b49540da6dec52821c0018cd2340a491c925285465d74334b24e"} Mar 18 09:25:37 crc kubenswrapper[4778]: I0318 09:25:37.630466 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.630445654 podStartE2EDuration="2.630445654s" podCreationTimestamp="2026-03-18 09:25:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:25:37.605789932 +0000 UTC m=+1404.180534782" watchObservedRunningTime="2026-03-18 09:25:37.630445654 +0000 UTC m=+1404.205190494" Mar 18 09:25:37 crc kubenswrapper[4778]: I0318 09:25:37.636057 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.636040036 
podStartE2EDuration="2.636040036s" podCreationTimestamp="2026-03-18 09:25:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:25:37.63106885 +0000 UTC m=+1404.205813710" watchObservedRunningTime="2026-03-18 09:25:37.636040036 +0000 UTC m=+1404.210784866" Mar 18 09:25:38 crc kubenswrapper[4778]: I0318 09:25:38.205785 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1c91e10-5caf-4f06-89bb-c9dacc92ecef" path="/var/lib/kubelet/pods/d1c91e10-5caf-4f06-89bb-c9dacc92ecef/volumes" Mar 18 09:25:38 crc kubenswrapper[4778]: I0318 09:25:38.611906 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f997c05f-82b3-4d82-859d-b02f458e355d","Type":"ContainerStarted","Data":"247ae67d75fa48d0cb6104fd6017725a72fc4e82ff55213dd5e0b293d531d756"} Mar 18 09:25:39 crc kubenswrapper[4778]: E0318 09:25:39.154555 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 09:25:39 crc kubenswrapper[4778]: E0318 09:25:39.159572 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 09:25:39 crc kubenswrapper[4778]: E0318 09:25:39.161306 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 09:25:39 crc kubenswrapper[4778]: E0318 09:25:39.161365 4778 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="326b0319-1314-4e0c-9e38-7f0358087107" containerName="nova-scheduler-scheduler" Mar 18 09:25:39 crc kubenswrapper[4778]: I0318 09:25:39.635767 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f997c05f-82b3-4d82-859d-b02f458e355d","Type":"ContainerStarted","Data":"dcd444b95c81fbfdea64f1fbc2bd6aead66b1e18e8638323c659ba0124ee650c"} Mar 18 09:25:39 crc kubenswrapper[4778]: I0318 09:25:39.852747 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 09:25:39 crc kubenswrapper[4778]: I0318 09:25:39.999601 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.026232 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/326b0319-1314-4e0c-9e38-7f0358087107-config-data\") pod \"326b0319-1314-4e0c-9e38-7f0358087107\" (UID: \"326b0319-1314-4e0c-9e38-7f0358087107\") " Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.027012 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sc9l\" (UniqueName: \"kubernetes.io/projected/326b0319-1314-4e0c-9e38-7f0358087107-kube-api-access-7sc9l\") pod \"326b0319-1314-4e0c-9e38-7f0358087107\" (UID: \"326b0319-1314-4e0c-9e38-7f0358087107\") " Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.027082 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/326b0319-1314-4e0c-9e38-7f0358087107-combined-ca-bundle\") pod \"326b0319-1314-4e0c-9e38-7f0358087107\" (UID: \"326b0319-1314-4e0c-9e38-7f0358087107\") " Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.033675 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/326b0319-1314-4e0c-9e38-7f0358087107-kube-api-access-7sc9l" (OuterVolumeSpecName: "kube-api-access-7sc9l") pod "326b0319-1314-4e0c-9e38-7f0358087107" (UID: "326b0319-1314-4e0c-9e38-7f0358087107"). InnerVolumeSpecName "kube-api-access-7sc9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.055362 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/326b0319-1314-4e0c-9e38-7f0358087107-config-data" (OuterVolumeSpecName: "config-data") pod "326b0319-1314-4e0c-9e38-7f0358087107" (UID: "326b0319-1314-4e0c-9e38-7f0358087107"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.074050 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/326b0319-1314-4e0c-9e38-7f0358087107-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "326b0319-1314-4e0c-9e38-7f0358087107" (UID: "326b0319-1314-4e0c-9e38-7f0358087107"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.128874 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/326b0319-1314-4e0c-9e38-7f0358087107-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.128906 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/326b0319-1314-4e0c-9e38-7f0358087107-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.128932 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sc9l\" (UniqueName: \"kubernetes.io/projected/326b0319-1314-4e0c-9e38-7f0358087107-kube-api-access-7sc9l\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.644864 4778 generic.go:334] "Generic (PLEG): container finished" podID="326b0319-1314-4e0c-9e38-7f0358087107" containerID="4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc" exitCode=0 Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.644895 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.644901 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"326b0319-1314-4e0c-9e38-7f0358087107","Type":"ContainerDied","Data":"4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc"} Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.644937 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"326b0319-1314-4e0c-9e38-7f0358087107","Type":"ContainerDied","Data":"c61d899fafa112db54c6c9a253fee0265e8eb8a1263e04ecef015a4a97256ebe"} Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.644956 4778 scope.go:117] "RemoveContainer" containerID="4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.650191 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f997c05f-82b3-4d82-859d-b02f458e355d","Type":"ContainerStarted","Data":"a8ed834ba7a0f4ec475460749067a718c91fbf1875be425bac23470f710feed4"} Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.668481 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.670101 4778 scope.go:117] "RemoveContainer" containerID="4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc" Mar 18 09:25:40 crc kubenswrapper[4778]: E0318 09:25:40.670699 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc\": container with ID starting with 4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc not found: ID does not exist" containerID="4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.670764 4778 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc"} err="failed to get container status \"4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc\": rpc error: code = NotFound desc = could not find container \"4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc\": container with ID starting with 4ae779b77a6ac538e1a4085e8e70ecb3e17f2cdb19c34d3b9fbd7f027d3dc2fc not found: ID does not exist" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.693686 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.697649 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:25:40 crc kubenswrapper[4778]: E0318 09:25:40.698000 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="326b0319-1314-4e0c-9e38-7f0358087107" containerName="nova-scheduler-scheduler" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.698017 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="326b0319-1314-4e0c-9e38-7f0358087107" containerName="nova-scheduler-scheduler" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.698215 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="326b0319-1314-4e0c-9e38-7f0358087107" containerName="nova-scheduler-scheduler" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.698719 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.701115 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.706430 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.737121 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2218f5-0310-4e4c-8edc-d13c25707ea5-config-data\") pod \"nova-scheduler-0\" (UID: \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.737669 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2218f5-0310-4e4c-8edc-d13c25707ea5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.737702 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl9dz\" (UniqueName: \"kubernetes.io/projected/dd2218f5-0310-4e4c-8edc-d13c25707ea5-kube-api-access-vl9dz\") pod \"nova-scheduler-0\" (UID: \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.839811 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2218f5-0310-4e4c-8edc-d13c25707ea5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.839877 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl9dz\" (UniqueName: \"kubernetes.io/projected/dd2218f5-0310-4e4c-8edc-d13c25707ea5-kube-api-access-vl9dz\") pod \"nova-scheduler-0\" (UID: \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.839950 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2218f5-0310-4e4c-8edc-d13c25707ea5-config-data\") pod \"nova-scheduler-0\" (UID: \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.850704 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2218f5-0310-4e4c-8edc-d13c25707ea5-config-data\") pod \"nova-scheduler-0\" (UID: \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.850992 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2218f5-0310-4e4c-8edc-d13c25707ea5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:40 crc kubenswrapper[4778]: I0318 09:25:40.858613 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl9dz\" (UniqueName: \"kubernetes.io/projected/dd2218f5-0310-4e4c-8edc-d13c25707ea5-kube-api-access-vl9dz\") pod \"nova-scheduler-0\" (UID: \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\") " pod="openstack/nova-scheduler-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.035434 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.413271 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.448673 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/827e3d5b-c1fe-4634-b819-4d816911b71e-logs\") pod \"827e3d5b-c1fe-4634-b819-4d816911b71e\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.448811 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/827e3d5b-c1fe-4634-b819-4d816911b71e-combined-ca-bundle\") pod \"827e3d5b-c1fe-4634-b819-4d816911b71e\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.448868 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rlzt\" (UniqueName: \"kubernetes.io/projected/827e3d5b-c1fe-4634-b819-4d816911b71e-kube-api-access-7rlzt\") pod \"827e3d5b-c1fe-4634-b819-4d816911b71e\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.448985 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/827e3d5b-c1fe-4634-b819-4d816911b71e-config-data\") pod \"827e3d5b-c1fe-4634-b819-4d816911b71e\" (UID: \"827e3d5b-c1fe-4634-b819-4d816911b71e\") " Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.451865 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/827e3d5b-c1fe-4634-b819-4d816911b71e-logs" (OuterVolumeSpecName: "logs") pod "827e3d5b-c1fe-4634-b819-4d816911b71e" (UID: "827e3d5b-c1fe-4634-b819-4d816911b71e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.455550 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/827e3d5b-c1fe-4634-b819-4d816911b71e-kube-api-access-7rlzt" (OuterVolumeSpecName: "kube-api-access-7rlzt") pod "827e3d5b-c1fe-4634-b819-4d816911b71e" (UID: "827e3d5b-c1fe-4634-b819-4d816911b71e"). InnerVolumeSpecName "kube-api-access-7rlzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.484855 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/827e3d5b-c1fe-4634-b819-4d816911b71e-config-data" (OuterVolumeSpecName: "config-data") pod "827e3d5b-c1fe-4634-b819-4d816911b71e" (UID: "827e3d5b-c1fe-4634-b819-4d816911b71e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.486420 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/827e3d5b-c1fe-4634-b819-4d816911b71e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "827e3d5b-c1fe-4634-b819-4d816911b71e" (UID: "827e3d5b-c1fe-4634-b819-4d816911b71e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.551040 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/827e3d5b-c1fe-4634-b819-4d816911b71e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.551557 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/827e3d5b-c1fe-4634-b819-4d816911b71e-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.551571 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/827e3d5b-c1fe-4634-b819-4d816911b71e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.551585 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rlzt\" (UniqueName: \"kubernetes.io/projected/827e3d5b-c1fe-4634-b819-4d816911b71e-kube-api-access-7rlzt\") on node \"crc\" DevicePath \"\"" Mar 18 09:25:41 crc kubenswrapper[4778]: W0318 09:25:41.567409 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd2218f5_0310_4e4c_8edc_d13c25707ea5.slice/crio-b346fc3122696affb646e267887f25b4c0332b9b1cc46fa78a99f5366632ff21 WatchSource:0}: Error finding container b346fc3122696affb646e267887f25b4c0332b9b1cc46fa78a99f5366632ff21: Status 404 returned error can't find the container with id b346fc3122696affb646e267887f25b4c0332b9b1cc46fa78a99f5366632ff21 Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.568144 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.660752 4778 generic.go:334] "Generic (PLEG): container finished" podID="827e3d5b-c1fe-4634-b819-4d816911b71e" 
containerID="db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d" exitCode=0 Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.660824 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"827e3d5b-c1fe-4634-b819-4d816911b71e","Type":"ContainerDied","Data":"db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d"} Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.660858 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"827e3d5b-c1fe-4634-b819-4d816911b71e","Type":"ContainerDied","Data":"af02077ff8be3a9cb17ac9dfdbd7efd69c18c248340c749dae5c8e266151e304"} Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.660880 4778 scope.go:117] "RemoveContainer" containerID="db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.660984 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.670012 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dd2218f5-0310-4e4c-8edc-d13c25707ea5","Type":"ContainerStarted","Data":"b346fc3122696affb646e267887f25b4c0332b9b1cc46fa78a99f5366632ff21"} Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.712062 4778 scope.go:117] "RemoveContainer" containerID="3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.720718 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.737695 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.751223 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 09:25:41 crc kubenswrapper[4778]: E0318 
09:25:41.751632 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="827e3d5b-c1fe-4634-b819-4d816911b71e" containerName="nova-api-log" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.751646 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="827e3d5b-c1fe-4634-b819-4d816911b71e" containerName="nova-api-log" Mar 18 09:25:41 crc kubenswrapper[4778]: E0318 09:25:41.751666 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="827e3d5b-c1fe-4634-b819-4d816911b71e" containerName="nova-api-api" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.751672 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="827e3d5b-c1fe-4634-b819-4d816911b71e" containerName="nova-api-api" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.751865 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="827e3d5b-c1fe-4634-b819-4d816911b71e" containerName="nova-api-api" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.751888 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="827e3d5b-c1fe-4634-b819-4d816911b71e" containerName="nova-api-log" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.752848 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.761544 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.762756 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.781593 4778 scope.go:117] "RemoveContainer" containerID="db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d" Mar 18 09:25:41 crc kubenswrapper[4778]: E0318 09:25:41.784636 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d\": container with ID starting with db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d not found: ID does not exist" containerID="db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.784684 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d"} err="failed to get container status \"db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d\": rpc error: code = NotFound desc = could not find container \"db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d\": container with ID starting with db2c8994e26b533f46e509eb4133a0e90fbe3ef8ff474b01f7109100807bbc2d not found: ID does not exist" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.784717 4778 scope.go:117] "RemoveContainer" containerID="3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1" Mar 18 09:25:41 crc kubenswrapper[4778]: E0318 09:25:41.786135 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1\": container with ID starting with 3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1 not found: ID does not exist" containerID="3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.786222 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1"} err="failed to get container status \"3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1\": rpc error: code = NotFound desc = could not find container \"3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1\": container with ID starting with 3b9a26f2a44f61625cd1e4f6b80e581f99ebb7a2b7f3ade3ec5f0985dff599c1 not found: ID does not exist" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.861145 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-logs\") pod \"nova-api-0\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.861217 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-config-data\") pod \"nova-api-0\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.861348 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " pod="openstack/nova-api-0" Mar 18 09:25:41 crc 
kubenswrapper[4778]: I0318 09:25:41.861404 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsn6r\" (UniqueName: \"kubernetes.io/projected/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-kube-api-access-wsn6r\") pod \"nova-api-0\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.963346 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-logs\") pod \"nova-api-0\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.963640 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-config-data\") pod \"nova-api-0\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.963911 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.964020 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsn6r\" (UniqueName: \"kubernetes.io/projected/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-kube-api-access-wsn6r\") pod \"nova-api-0\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.963984 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-logs\") 
pod \"nova-api-0\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.967453 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-config-data\") pod \"nova-api-0\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.968419 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " pod="openstack/nova-api-0" Mar 18 09:25:41 crc kubenswrapper[4778]: I0318 09:25:41.985421 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsn6r\" (UniqueName: \"kubernetes.io/projected/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-kube-api-access-wsn6r\") pod \"nova-api-0\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") " pod="openstack/nova-api-0" Mar 18 09:25:42 crc kubenswrapper[4778]: I0318 09:25:42.076584 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 09:25:42 crc kubenswrapper[4778]: I0318 09:25:42.208615 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="326b0319-1314-4e0c-9e38-7f0358087107" path="/var/lib/kubelet/pods/326b0319-1314-4e0c-9e38-7f0358087107/volumes" Mar 18 09:25:42 crc kubenswrapper[4778]: I0318 09:25:42.209472 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="827e3d5b-c1fe-4634-b819-4d816911b71e" path="/var/lib/kubelet/pods/827e3d5b-c1fe-4634-b819-4d816911b71e/volumes" Mar 18 09:25:42 crc kubenswrapper[4778]: I0318 09:25:42.558386 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:25:42 crc kubenswrapper[4778]: I0318 09:25:42.685370 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f997c05f-82b3-4d82-859d-b02f458e355d","Type":"ContainerStarted","Data":"53356b73b893522d6b8ce2a575d851c14fe0eb313a03d20cf218a2982a049325"} Mar 18 09:25:42 crc kubenswrapper[4778]: I0318 09:25:42.685529 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 09:25:42 crc kubenswrapper[4778]: I0318 09:25:42.689216 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68","Type":"ContainerStarted","Data":"d464bbdf554e5888aa55da417c9822d9fe3e925deb0939352bfe349e89da555d"} Mar 18 09:25:42 crc kubenswrapper[4778]: I0318 09:25:42.692228 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dd2218f5-0310-4e4c-8edc-d13c25707ea5","Type":"ContainerStarted","Data":"cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413"} Mar 18 09:25:42 crc kubenswrapper[4778]: I0318 09:25:42.719106 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.31534946 podStartE2EDuration="6.719081964s" 
podCreationTimestamp="2026-03-18 09:25:36 +0000 UTC" firstStartedPulling="2026-03-18 09:25:37.535803475 +0000 UTC m=+1404.110548315" lastFinishedPulling="2026-03-18 09:25:41.939535979 +0000 UTC m=+1408.514280819" observedRunningTime="2026-03-18 09:25:42.704855146 +0000 UTC m=+1409.279600006" watchObservedRunningTime="2026-03-18 09:25:42.719081964 +0000 UTC m=+1409.293826804" Mar 18 09:25:42 crc kubenswrapper[4778]: I0318 09:25:42.738469 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.7384498710000003 podStartE2EDuration="2.738449871s" podCreationTimestamp="2026-03-18 09:25:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:25:42.73067249 +0000 UTC m=+1409.305417350" watchObservedRunningTime="2026-03-18 09:25:42.738449871 +0000 UTC m=+1409.313194711" Mar 18 09:25:43 crc kubenswrapper[4778]: I0318 09:25:43.710590 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68","Type":"ContainerStarted","Data":"416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f"} Mar 18 09:25:43 crc kubenswrapper[4778]: I0318 09:25:43.711540 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68","Type":"ContainerStarted","Data":"552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38"} Mar 18 09:25:43 crc kubenswrapper[4778]: I0318 09:25:43.752610 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.752581135 podStartE2EDuration="2.752581135s" podCreationTimestamp="2026-03-18 09:25:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:25:43.737106264 +0000 UTC 
m=+1410.311851134" watchObservedRunningTime="2026-03-18 09:25:43.752581135 +0000 UTC m=+1410.327326025" Mar 18 09:25:46 crc kubenswrapper[4778]: I0318 09:25:46.036346 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 09:25:46 crc kubenswrapper[4778]: I0318 09:25:46.043871 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 18 09:25:46 crc kubenswrapper[4778]: I0318 09:25:46.404404 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 09:25:46 crc kubenswrapper[4778]: I0318 09:25:46.404530 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 09:25:47 crc kubenswrapper[4778]: I0318 09:25:47.425346 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 09:25:47 crc kubenswrapper[4778]: I0318 09:25:47.425404 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 09:25:51 crc kubenswrapper[4778]: I0318 09:25:51.036437 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 09:25:51 crc kubenswrapper[4778]: I0318 09:25:51.076625 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 09:25:51 crc kubenswrapper[4778]: I0318 09:25:51.850774 4778 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 09:25:52 crc kubenswrapper[4778]: I0318 09:25:52.077035 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 09:25:52 crc kubenswrapper[4778]: I0318 09:25:52.077090 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 09:25:53 crc kubenswrapper[4778]: I0318 09:25:53.158413 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 09:25:53 crc kubenswrapper[4778]: I0318 09:25:53.158845 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 09:25:54 crc kubenswrapper[4778]: I0318 09:25:54.404300 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 09:25:54 crc kubenswrapper[4778]: I0318 09:25:54.405133 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 09:25:56 crc kubenswrapper[4778]: I0318 09:25:56.415349 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 09:25:56 crc kubenswrapper[4778]: I0318 09:25:56.417681 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 09:25:56 crc kubenswrapper[4778]: I0318 09:25:56.426632 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 09:25:56 crc kubenswrapper[4778]: 
I0318 09:25:56.427895 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 09:25:59 crc kubenswrapper[4778]: I0318 09:25:59.850618 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:59 crc kubenswrapper[4778]: I0318 09:25:59.901837 4778 generic.go:334] "Generic (PLEG): container finished" podID="81cc74f1-64bc-448f-9654-352927efbb4c" containerID="e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5" exitCode=137 Mar 18 09:25:59 crc kubenswrapper[4778]: I0318 09:25:59.901875 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:25:59 crc kubenswrapper[4778]: I0318 09:25:59.901877 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"81cc74f1-64bc-448f-9654-352927efbb4c","Type":"ContainerDied","Data":"e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5"} Mar 18 09:25:59 crc kubenswrapper[4778]: I0318 09:25:59.901986 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"81cc74f1-64bc-448f-9654-352927efbb4c","Type":"ContainerDied","Data":"faecd33981e31af7bf2e1b8e7eefd8b08f1e463cade44ffd3aabfca3aa9b5d91"} Mar 18 09:25:59 crc kubenswrapper[4778]: I0318 09:25:59.902006 4778 scope.go:117] "RemoveContainer" containerID="e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5" Mar 18 09:25:59 crc kubenswrapper[4778]: I0318 09:25:59.935850 4778 scope.go:117] "RemoveContainer" containerID="e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5" Mar 18 09:25:59 crc kubenswrapper[4778]: E0318 09:25:59.936413 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5\": container 
with ID starting with e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5 not found: ID does not exist" containerID="e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5" Mar 18 09:25:59 crc kubenswrapper[4778]: I0318 09:25:59.936457 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5"} err="failed to get container status \"e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5\": rpc error: code = NotFound desc = could not find container \"e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5\": container with ID starting with e607ebc683687e7dda2e94312a13a309e58e1ecafd648661f1c9fd8ec1e662e5 not found: ID does not exist" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.044919 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bdst\" (UniqueName: \"kubernetes.io/projected/81cc74f1-64bc-448f-9654-352927efbb4c-kube-api-access-8bdst\") pod \"81cc74f1-64bc-448f-9654-352927efbb4c\" (UID: \"81cc74f1-64bc-448f-9654-352927efbb4c\") " Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.044966 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81cc74f1-64bc-448f-9654-352927efbb4c-config-data\") pod \"81cc74f1-64bc-448f-9654-352927efbb4c\" (UID: \"81cc74f1-64bc-448f-9654-352927efbb4c\") " Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.045022 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81cc74f1-64bc-448f-9654-352927efbb4c-combined-ca-bundle\") pod \"81cc74f1-64bc-448f-9654-352927efbb4c\" (UID: \"81cc74f1-64bc-448f-9654-352927efbb4c\") " Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.062556 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/81cc74f1-64bc-448f-9654-352927efbb4c-kube-api-access-8bdst" (OuterVolumeSpecName: "kube-api-access-8bdst") pod "81cc74f1-64bc-448f-9654-352927efbb4c" (UID: "81cc74f1-64bc-448f-9654-352927efbb4c"). InnerVolumeSpecName "kube-api-access-8bdst". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.077089 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.077161 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.083137 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81cc74f1-64bc-448f-9654-352927efbb4c-config-data" (OuterVolumeSpecName: "config-data") pod "81cc74f1-64bc-448f-9654-352927efbb4c" (UID: "81cc74f1-64bc-448f-9654-352927efbb4c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.085490 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81cc74f1-64bc-448f-9654-352927efbb4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81cc74f1-64bc-448f-9654-352927efbb4c" (UID: "81cc74f1-64bc-448f-9654-352927efbb4c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.147782 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.147831 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.148296 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bdst\" (UniqueName: \"kubernetes.io/projected/81cc74f1-64bc-448f-9654-352927efbb4c-kube-api-access-8bdst\") on node \"crc\" DevicePath \"\""
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.148328 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81cc74f1-64bc-448f-9654-352927efbb4c-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.148338 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81cc74f1-64bc-448f-9654-352927efbb4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.148982 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563766-lqhxm"]
Mar 18 09:26:00 crc kubenswrapper[4778]: E0318 09:26:00.149498 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81cc74f1-64bc-448f-9654-352927efbb4c" containerName="nova-cell1-novncproxy-novncproxy"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.149521 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="81cc74f1-64bc-448f-9654-352927efbb4c" containerName="nova-cell1-novncproxy-novncproxy"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.149791 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="81cc74f1-64bc-448f-9654-352927efbb4c" containerName="nova-cell1-novncproxy-novncproxy"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.150576 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563766-lqhxm"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.153079 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.153182 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.153590 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.161849 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563766-lqhxm"]
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.247561 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.253368 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6djlr\" (UniqueName: \"kubernetes.io/projected/ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa-kube-api-access-6djlr\") pod \"auto-csr-approver-29563766-lqhxm\" (UID: \"ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa\") " pod="openshift-infra/auto-csr-approver-29563766-lqhxm"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.264366 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.302072 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.305218 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.308949 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.309145 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.309323 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.315901 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.355785 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6djlr\" (UniqueName: \"kubernetes.io/projected/ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa-kube-api-access-6djlr\") pod \"auto-csr-approver-29563766-lqhxm\" (UID: \"ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa\") " pod="openshift-infra/auto-csr-approver-29563766-lqhxm"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.378332 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6djlr\" (UniqueName: \"kubernetes.io/projected/ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa-kube-api-access-6djlr\") pod \"auto-csr-approver-29563766-lqhxm\" (UID: \"ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa\") " pod="openshift-infra/auto-csr-approver-29563766-lqhxm"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.457620 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9549b39b-0fc5-4e89-b64a-de83c80735ed-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.457935 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4k4t\" (UniqueName: \"kubernetes.io/projected/9549b39b-0fc5-4e89-b64a-de83c80735ed-kube-api-access-s4k4t\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.458022 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9549b39b-0fc5-4e89-b64a-de83c80735ed-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.458567 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9549b39b-0fc5-4e89-b64a-de83c80735ed-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.458616 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9549b39b-0fc5-4e89-b64a-de83c80735ed-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.471088 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563766-lqhxm"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.560220 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9549b39b-0fc5-4e89-b64a-de83c80735ed-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.560282 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9549b39b-0fc5-4e89-b64a-de83c80735ed-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.560369 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9549b39b-0fc5-4e89-b64a-de83c80735ed-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.560394 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4k4t\" (UniqueName: \"kubernetes.io/projected/9549b39b-0fc5-4e89-b64a-de83c80735ed-kube-api-access-s4k4t\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.560467 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9549b39b-0fc5-4e89-b64a-de83c80735ed-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.567872 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9549b39b-0fc5-4e89-b64a-de83c80735ed-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.568937 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9549b39b-0fc5-4e89-b64a-de83c80735ed-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.595372 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9549b39b-0fc5-4e89-b64a-de83c80735ed-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.595401 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9549b39b-0fc5-4e89-b64a-de83c80735ed-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.605458 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4k4t\" (UniqueName: \"kubernetes.io/projected/9549b39b-0fc5-4e89-b64a-de83c80735ed-kube-api-access-s4k4t\") pod \"nova-cell1-novncproxy-0\" (UID: \"9549b39b-0fc5-4e89-b64a-de83c80735ed\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.629712 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.783966 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563766-lqhxm"]
Mar 18 09:26:00 crc kubenswrapper[4778]: I0318 09:26:00.911034 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563766-lqhxm" event={"ID":"ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa","Type":"ContainerStarted","Data":"6321a216ea221ca591bab933c4080da4dbbc3abbc34ba412c96f71c792e428d0"}
Mar 18 09:26:01 crc kubenswrapper[4778]: I0318 09:26:01.126993 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 09:26:01 crc kubenswrapper[4778]: W0318 09:26:01.132444 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9549b39b_0fc5_4e89_b64a_de83c80735ed.slice/crio-f9b4d34396e0c739602ca88c4a374662e0ebdb4e08eb1919eb24219a92e01561 WatchSource:0}: Error finding container f9b4d34396e0c739602ca88c4a374662e0ebdb4e08eb1919eb24219a92e01561: Status 404 returned error can't find the container with id f9b4d34396e0c739602ca88c4a374662e0ebdb4e08eb1919eb24219a92e01561
Mar 18 09:26:01 crc kubenswrapper[4778]: I0318 09:26:01.931790 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9549b39b-0fc5-4e89-b64a-de83c80735ed","Type":"ContainerStarted","Data":"39b380408f9c28d831352c9df712478d58f8921a5d7729466c4061c2824af3fb"}
Mar 18 09:26:01 crc kubenswrapper[4778]: I0318 09:26:01.932341 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9549b39b-0fc5-4e89-b64a-de83c80735ed","Type":"ContainerStarted","Data":"f9b4d34396e0c739602ca88c4a374662e0ebdb4e08eb1919eb24219a92e01561"}
Mar 18 09:26:01 crc kubenswrapper[4778]: I0318 09:26:01.964790 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.964767641 podStartE2EDuration="1.964767641s" podCreationTimestamp="2026-03-18 09:26:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:26:01.95661504 +0000 UTC m=+1428.531359890" watchObservedRunningTime="2026-03-18 09:26:01.964767641 +0000 UTC m=+1428.539512481"
Mar 18 09:26:02 crc kubenswrapper[4778]: I0318 09:26:02.083110 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 18 09:26:02 crc kubenswrapper[4778]: I0318 09:26:02.085808 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 18 09:26:02 crc kubenswrapper[4778]: I0318 09:26:02.090951 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 18 09:26:02 crc kubenswrapper[4778]: I0318 09:26:02.200693 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81cc74f1-64bc-448f-9654-352927efbb4c" path="/var/lib/kubelet/pods/81cc74f1-64bc-448f-9654-352927efbb4c/volumes"
Mar 18 09:26:02 crc kubenswrapper[4778]: I0318 09:26:02.943299 4778 generic.go:334] "Generic (PLEG): container finished" podID="ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa" containerID="c24773f49ad71f17b93d5ac7609065bb82dac185ae78530f1dcf0ecca87ade20" exitCode=0
Mar 18 09:26:02 crc kubenswrapper[4778]: I0318 09:26:02.943385 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563766-lqhxm" event={"ID":"ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa","Type":"ContainerDied","Data":"c24773f49ad71f17b93d5ac7609065bb82dac185ae78530f1dcf0ecca87ade20"}
Mar 18 09:26:02 crc kubenswrapper[4778]: I0318 09:26:02.947948 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.152441 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-hzxbf"]
Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.154313 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf"
Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.164780 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-hzxbf"]
Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.322207 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-ovsdbserver-nb\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf"
Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.322266 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-ovsdbserver-sb\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf"
Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.322318 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-config\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf"
Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.322356 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-dns-svc\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf"
Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.322686 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts5rq\" (UniqueName: \"kubernetes.io/projected/31b962b8-c7e1-495c-a52d-f4fb63e884ca-kube-api-access-ts5rq\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf"
Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.424574 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-ovsdbserver-nb\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf"
Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.424621 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-ovsdbserver-sb\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf"
Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.424649 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-config\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf"
Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.424679 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-dns-svc\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf"
Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.424769 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts5rq\" (UniqueName: \"kubernetes.io/projected/31b962b8-c7e1-495c-a52d-f4fb63e884ca-kube-api-access-ts5rq\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf"
Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.425660 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-ovsdbserver-sb\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf"
Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.425660 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-config\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf"
Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.425768 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-dns-svc\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf"
Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.425877 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-ovsdbserver-nb\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf"
Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.450989 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts5rq\" (UniqueName: \"kubernetes.io/projected/31b962b8-c7e1-495c-a52d-f4fb63e884ca-kube-api-access-ts5rq\") pod \"dnsmasq-dns-68d4b6d797-hzxbf\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf"
Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.483520 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf"
Mar 18 09:26:03 crc kubenswrapper[4778]: I0318 09:26:03.942678 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-hzxbf"]
Mar 18 09:26:04 crc kubenswrapper[4778]: I0318 09:26:04.339999 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563766-lqhxm"
Mar 18 09:26:04 crc kubenswrapper[4778]: I0318 09:26:04.445183 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6djlr\" (UniqueName: \"kubernetes.io/projected/ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa-kube-api-access-6djlr\") pod \"ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa\" (UID: \"ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa\") "
Mar 18 09:26:04 crc kubenswrapper[4778]: I0318 09:26:04.450580 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa-kube-api-access-6djlr" (OuterVolumeSpecName: "kube-api-access-6djlr") pod "ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa" (UID: "ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa"). InnerVolumeSpecName "kube-api-access-6djlr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:26:04 crc kubenswrapper[4778]: I0318 09:26:04.547689 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6djlr\" (UniqueName: \"kubernetes.io/projected/ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa-kube-api-access-6djlr\") on node \"crc\" DevicePath \"\""
Mar 18 09:26:04 crc kubenswrapper[4778]: I0318 09:26:04.963128 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563766-lqhxm" event={"ID":"ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa","Type":"ContainerDied","Data":"6321a216ea221ca591bab933c4080da4dbbc3abbc34ba412c96f71c792e428d0"}
Mar 18 09:26:04 crc kubenswrapper[4778]: I0318 09:26:04.963450 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6321a216ea221ca591bab933c4080da4dbbc3abbc34ba412c96f71c792e428d0"
Mar 18 09:26:04 crc kubenswrapper[4778]: I0318 09:26:04.963173 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563766-lqhxm"
Mar 18 09:26:04 crc kubenswrapper[4778]: I0318 09:26:04.965041 4778 generic.go:334] "Generic (PLEG): container finished" podID="31b962b8-c7e1-495c-a52d-f4fb63e884ca" containerID="bda48f5cd3722d61bee57d8aacf12b0b775a391dd2ad3e1c8999cc5bb624e15c" exitCode=0
Mar 18 09:26:04 crc kubenswrapper[4778]: I0318 09:26:04.965259 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" event={"ID":"31b962b8-c7e1-495c-a52d-f4fb63e884ca","Type":"ContainerDied","Data":"bda48f5cd3722d61bee57d8aacf12b0b775a391dd2ad3e1c8999cc5bb624e15c"}
Mar 18 09:26:04 crc kubenswrapper[4778]: I0318 09:26:04.965321 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" event={"ID":"31b962b8-c7e1-495c-a52d-f4fb63e884ca","Type":"ContainerStarted","Data":"f1b064a501bbcb63fd4dc86edbad3f121ced055deab0d7f76a0a08acbb649b7a"}
Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.346858 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.347350 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="ceilometer-central-agent" containerID="cri-o://247ae67d75fa48d0cb6104fd6017725a72fc4e82ff55213dd5e0b293d531d756" gracePeriod=30
Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.347515 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="proxy-httpd" containerID="cri-o://53356b73b893522d6b8ce2a575d851c14fe0eb313a03d20cf218a2982a049325" gracePeriod=30
Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.347587 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="sg-core" containerID="cri-o://a8ed834ba7a0f4ec475460749067a718c91fbf1875be425bac23470f710feed4" gracePeriod=30
Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.347646 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="ceilometer-notification-agent" containerID="cri-o://dcd444b95c81fbfdea64f1fbc2bd6aead66b1e18e8638323c659ba0124ee650c" gracePeriod=30
Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.357816 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.426255 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563760-nvkp2"]
Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.433027 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563760-nvkp2"]
Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.534334 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.630975 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.978574 4778 generic.go:334] "Generic (PLEG): container finished" podID="f997c05f-82b3-4d82-859d-b02f458e355d" containerID="53356b73b893522d6b8ce2a575d851c14fe0eb313a03d20cf218a2982a049325" exitCode=0
Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.978627 4778 generic.go:334] "Generic (PLEG): container finished" podID="f997c05f-82b3-4d82-859d-b02f458e355d" containerID="a8ed834ba7a0f4ec475460749067a718c91fbf1875be425bac23470f710feed4" exitCode=2
Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.978637 4778 generic.go:334] "Generic (PLEG): container finished" podID="f997c05f-82b3-4d82-859d-b02f458e355d" containerID="dcd444b95c81fbfdea64f1fbc2bd6aead66b1e18e8638323c659ba0124ee650c" exitCode=0
Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.978647 4778 generic.go:334] "Generic (PLEG): container finished" podID="f997c05f-82b3-4d82-859d-b02f458e355d" containerID="247ae67d75fa48d0cb6104fd6017725a72fc4e82ff55213dd5e0b293d531d756" exitCode=0
Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.978680 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f997c05f-82b3-4d82-859d-b02f458e355d","Type":"ContainerDied","Data":"53356b73b893522d6b8ce2a575d851c14fe0eb313a03d20cf218a2982a049325"}
Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.978743 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f997c05f-82b3-4d82-859d-b02f458e355d","Type":"ContainerDied","Data":"a8ed834ba7a0f4ec475460749067a718c91fbf1875be425bac23470f710feed4"}
Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.978758 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f997c05f-82b3-4d82-859d-b02f458e355d","Type":"ContainerDied","Data":"dcd444b95c81fbfdea64f1fbc2bd6aead66b1e18e8638323c659ba0124ee650c"}
Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.978774 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f997c05f-82b3-4d82-859d-b02f458e355d","Type":"ContainerDied","Data":"247ae67d75fa48d0cb6104fd6017725a72fc4e82ff55213dd5e0b293d531d756"}
Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.983012 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" containerName="nova-api-log" containerID="cri-o://552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38" gracePeriod=30
Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.983632 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" event={"ID":"31b962b8-c7e1-495c-a52d-f4fb63e884ca","Type":"ContainerStarted","Data":"76b7c61aa2a7be4fd8a7e4d8e9e84698c3d85d0b2b4f2c826eef7fac6ce91214"}
Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.983680 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" containerName="nova-api-api" containerID="cri-o://416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f" gracePeriod=30
Mar 18 09:26:05 crc kubenswrapper[4778]: I0318 09:26:05.983782 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf"
Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.027998 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" podStartSLOduration=3.027965389 podStartE2EDuration="3.027965389s" podCreationTimestamp="2026-03-18 09:26:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:26:06.00966012 +0000 UTC m=+1432.584404970" watchObservedRunningTime="2026-03-18 09:26:06.027965389 +0000 UTC m=+1432.602710229"
Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.165883 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.199706 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc3bf93e-1b00-4852-b69b-0c8d701f56e3" path="/var/lib/kubelet/pods/bc3bf93e-1b00-4852-b69b-0c8d701f56e3/volumes"
Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.287715 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-combined-ca-bundle\") pod \"f997c05f-82b3-4d82-859d-b02f458e355d\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") "
Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.287762 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-scripts\") pod \"f997c05f-82b3-4d82-859d-b02f458e355d\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") "
Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.287873 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml984\" (UniqueName: \"kubernetes.io/projected/f997c05f-82b3-4d82-859d-b02f458e355d-kube-api-access-ml984\") pod \"f997c05f-82b3-4d82-859d-b02f458e355d\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") "
Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.287897 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f997c05f-82b3-4d82-859d-b02f458e355d-log-httpd\") pod \"f997c05f-82b3-4d82-859d-b02f458e355d\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") "
Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.287940 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-ceilometer-tls-certs\") pod \"f997c05f-82b3-4d82-859d-b02f458e355d\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") "
Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.287997 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-config-data\") pod \"f997c05f-82b3-4d82-859d-b02f458e355d\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") "
Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.288096 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-sg-core-conf-yaml\") pod \"f997c05f-82b3-4d82-859d-b02f458e355d\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") "
Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.288124 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f997c05f-82b3-4d82-859d-b02f458e355d-run-httpd\") pod \"f997c05f-82b3-4d82-859d-b02f458e355d\" (UID: \"f997c05f-82b3-4d82-859d-b02f458e355d\") "
Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.288580 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f997c05f-82b3-4d82-859d-b02f458e355d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f997c05f-82b3-4d82-859d-b02f458e355d" (UID: "f997c05f-82b3-4d82-859d-b02f458e355d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.290306 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f997c05f-82b3-4d82-859d-b02f458e355d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f997c05f-82b3-4d82-859d-b02f458e355d" (UID: "f997c05f-82b3-4d82-859d-b02f458e355d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.290672 4778 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f997c05f-82b3-4d82-859d-b02f458e355d-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.290741 4778 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f997c05f-82b3-4d82-859d-b02f458e355d-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.302486 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-scripts" (OuterVolumeSpecName: "scripts") pod "f997c05f-82b3-4d82-859d-b02f458e355d" (UID: "f997c05f-82b3-4d82-859d-b02f458e355d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.326538 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f997c05f-82b3-4d82-859d-b02f458e355d-kube-api-access-ml984" (OuterVolumeSpecName: "kube-api-access-ml984") pod "f997c05f-82b3-4d82-859d-b02f458e355d" (UID: "f997c05f-82b3-4d82-859d-b02f458e355d"). InnerVolumeSpecName "kube-api-access-ml984". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.372631 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f997c05f-82b3-4d82-859d-b02f458e355d" (UID: "f997c05f-82b3-4d82-859d-b02f458e355d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.393212 4778 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.393779 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.393813 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml984\" (UniqueName: \"kubernetes.io/projected/f997c05f-82b3-4d82-859d-b02f458e355d-kube-api-access-ml984\") on node \"crc\" DevicePath \"\""
Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.421855 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f997c05f-82b3-4d82-859d-b02f458e355d" (UID: "f997c05f-82b3-4d82-859d-b02f458e355d"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.432787 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f997c05f-82b3-4d82-859d-b02f458e355d" (UID: "f997c05f-82b3-4d82-859d-b02f458e355d"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.458228 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-config-data" (OuterVolumeSpecName: "config-data") pod "f997c05f-82b3-4d82-859d-b02f458e355d" (UID: "f997c05f-82b3-4d82-859d-b02f458e355d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.532735 4778 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.532784 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:06 crc kubenswrapper[4778]: I0318 09:26:06.532810 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f997c05f-82b3-4d82-859d-b02f458e355d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.000235 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f997c05f-82b3-4d82-859d-b02f458e355d","Type":"ContainerDied","Data":"24881499c24b431bd6cb35e91dd94903665892ef9cb2374a0f2fab97787e21e9"} Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.000332 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.000340 4778 scope.go:117] "RemoveContainer" containerID="53356b73b893522d6b8ce2a575d851c14fe0eb313a03d20cf218a2982a049325" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.002500 4778 generic.go:334] "Generic (PLEG): container finished" podID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" containerID="552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38" exitCode=143 Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.003528 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68","Type":"ContainerDied","Data":"552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38"} Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.053956 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.069698 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.079843 4778 scope.go:117] "RemoveContainer" containerID="a8ed834ba7a0f4ec475460749067a718c91fbf1875be425bac23470f710feed4" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.086383 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:26:07 crc kubenswrapper[4778]: E0318 09:26:07.087184 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="ceilometer-central-agent" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.087229 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="ceilometer-central-agent" Mar 18 09:26:07 crc kubenswrapper[4778]: E0318 09:26:07.087241 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa" 
containerName="oc" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.087307 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa" containerName="oc" Mar 18 09:26:07 crc kubenswrapper[4778]: E0318 09:26:07.087321 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="proxy-httpd" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.087331 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="proxy-httpd" Mar 18 09:26:07 crc kubenswrapper[4778]: E0318 09:26:07.087341 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="ceilometer-notification-agent" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.087347 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="ceilometer-notification-agent" Mar 18 09:26:07 crc kubenswrapper[4778]: E0318 09:26:07.087426 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="sg-core" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.087580 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="sg-core" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.087807 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="ceilometer-notification-agent" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.087826 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa" containerName="oc" Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.087836 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="proxy-httpd" 
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.087855 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="ceilometer-central-agent"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.087861 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" containerName="sg-core"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.090782 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.093033 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.093316 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.094160 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.104268 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.134090 4778 scope.go:117] "RemoveContainer" containerID="dcd444b95c81fbfdea64f1fbc2bd6aead66b1e18e8638323c659ba0124ee650c"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.173047 4778 scope.go:117] "RemoveContainer" containerID="247ae67d75fa48d0cb6104fd6017725a72fc4e82ff55213dd5e0b293d531d756"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.245598 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns6zx\" (UniqueName: \"kubernetes.io/projected/52df8ed1-aa17-446a-a3b4-e641f38a409d-kube-api-access-ns6zx\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.246112 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.246182 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.246261 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.246283 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-scripts\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.246335 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-config-data\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.246382 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52df8ed1-aa17-446a-a3b4-e641f38a409d-log-httpd\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.246402 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52df8ed1-aa17-446a-a3b4-e641f38a409d-run-httpd\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.348082 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.348163 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.348184 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-scripts\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.348278 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-config-data\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.348338 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52df8ed1-aa17-446a-a3b4-e641f38a409d-log-httpd\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.348358 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52df8ed1-aa17-446a-a3b4-e641f38a409d-run-httpd\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.348399 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns6zx\" (UniqueName: \"kubernetes.io/projected/52df8ed1-aa17-446a-a3b4-e641f38a409d-kube-api-access-ns6zx\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.348416 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.350345 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52df8ed1-aa17-446a-a3b4-e641f38a409d-run-httpd\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.351649 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52df8ed1-aa17-446a-a3b4-e641f38a409d-log-httpd\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.356261 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.356836 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-scripts\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.363330 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.364333 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-config-data\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.365095 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.371998 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns6zx\" (UniqueName: \"kubernetes.io/projected/52df8ed1-aa17-446a-a3b4-e641f38a409d-kube-api-access-ns6zx\") pod \"ceilometer-0\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.426587 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 09:26:07 crc kubenswrapper[4778]: I0318 09:26:07.734572 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 09:26:08 crc kubenswrapper[4778]: I0318 09:26:08.134975 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 09:26:08 crc kubenswrapper[4778]: I0318 09:26:08.202120 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f997c05f-82b3-4d82-859d-b02f458e355d" path="/var/lib/kubelet/pods/f997c05f-82b3-4d82-859d-b02f458e355d/volumes"
Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.029493 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52df8ed1-aa17-446a-a3b4-e641f38a409d","Type":"ContainerStarted","Data":"d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5"}
Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.030490 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52df8ed1-aa17-446a-a3b4-e641f38a409d","Type":"ContainerStarted","Data":"ba77b580ef06a1e2bd00469a95b5561856fad5eff21c63e51dda25d025a61698"}
Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.535319 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.632071 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsn6r\" (UniqueName: \"kubernetes.io/projected/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-kube-api-access-wsn6r\") pod \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") "
Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.632116 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-config-data\") pod \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") "
Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.632212 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-logs\") pod \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") "
Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.632238 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-combined-ca-bundle\") pod \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\" (UID: \"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68\") "
Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.632911 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-logs" (OuterVolumeSpecName: "logs") pod "fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" (UID: "fa8ef26d-e7ff-4de3-9812-42adb4cfeb68"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.640109 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-kube-api-access-wsn6r" (OuterVolumeSpecName: "kube-api-access-wsn6r") pod "fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" (UID: "fa8ef26d-e7ff-4de3-9812-42adb4cfeb68"). InnerVolumeSpecName "kube-api-access-wsn6r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.666441 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-config-data" (OuterVolumeSpecName: "config-data") pod "fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" (UID: "fa8ef26d-e7ff-4de3-9812-42adb4cfeb68"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.672144 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" (UID: "fa8ef26d-e7ff-4de3-9812-42adb4cfeb68"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.734417 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsn6r\" (UniqueName: \"kubernetes.io/projected/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-kube-api-access-wsn6r\") on node \"crc\" DevicePath \"\""
Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.734978 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.734996 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-logs\") on node \"crc\" DevicePath \"\""
Mar 18 09:26:09 crc kubenswrapper[4778]: I0318 09:26:09.735010 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.040361 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52df8ed1-aa17-446a-a3b4-e641f38a409d","Type":"ContainerStarted","Data":"642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19"}
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.042717 4778 generic.go:334] "Generic (PLEG): container finished" podID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" containerID="416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f" exitCode=0
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.042783 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68","Type":"ContainerDied","Data":"416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f"}
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.042814 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.042833 4778 scope.go:117] "RemoveContainer" containerID="416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.042819 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fa8ef26d-e7ff-4de3-9812-42adb4cfeb68","Type":"ContainerDied","Data":"d464bbdf554e5888aa55da417c9822d9fe3e925deb0939352bfe349e89da555d"}
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.074730 4778 scope.go:117] "RemoveContainer" containerID="552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.128910 4778 scope.go:117] "RemoveContainer" containerID="416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f"
Mar 18 09:26:10 crc kubenswrapper[4778]: E0318 09:26:10.129456 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f\": container with ID starting with 416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f not found: ID does not exist" containerID="416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.129494 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f"} err="failed to get container status \"416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f\": rpc error: code = NotFound desc = could not find container \"416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f\": container with ID starting with 416b1d567a8b036077bc3e6b8ea2da26c08946c339ab53e74248d96be489ea1f not found: ID does not exist"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.129518 4778 scope.go:117] "RemoveContainer" containerID="552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38"
Mar 18 09:26:10 crc kubenswrapper[4778]: E0318 09:26:10.130035 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38\": container with ID starting with 552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38 not found: ID does not exist" containerID="552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.130057 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38"} err="failed to get container status \"552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38\": rpc error: code = NotFound desc = could not find container \"552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38\": container with ID starting with 552bd7d5a7e13bee41e71d0bfa192f23170d0af5e9416342728572a1daff3e38 not found: ID does not exist"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.134301 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.150061 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.158275 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 18 09:26:10 crc kubenswrapper[4778]: E0318 09:26:10.158749 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" containerName="nova-api-log"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.158769 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" containerName="nova-api-log"
Mar 18 09:26:10 crc kubenswrapper[4778]: E0318 09:26:10.158787 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" containerName="nova-api-api"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.158794 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" containerName="nova-api-api"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.158962 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" containerName="nova-api-api"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.158984 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" containerName="nova-api-log"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.159922 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.163621 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.163862 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.164890 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.165931 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.204674 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa8ef26d-e7ff-4de3-9812-42adb4cfeb68" path="/var/lib/kubelet/pods/fa8ef26d-e7ff-4de3-9812-42adb4cfeb68/volumes"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.348650 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-public-tls-certs\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.348730 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7208979b-1773-4741-8dab-00c621897016-logs\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.348768 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-config-data\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.349441 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.349600 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.349640 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72wv5\" (UniqueName: \"kubernetes.io/projected/7208979b-1773-4741-8dab-00c621897016-kube-api-access-72wv5\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.452455 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.453028 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.453054 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72wv5\" (UniqueName: \"kubernetes.io/projected/7208979b-1773-4741-8dab-00c621897016-kube-api-access-72wv5\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.453220 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-public-tls-certs\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.453255 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7208979b-1773-4741-8dab-00c621897016-logs\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.453272 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-config-data\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.455123 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7208979b-1773-4741-8dab-00c621897016-logs\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0"
Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.460566 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") "
pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.461187 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.465712 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-public-tls-certs\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.468085 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-config-data\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.475087 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72wv5\" (UniqueName: \"kubernetes.io/projected/7208979b-1773-4741-8dab-00c621897016-kube-api-access-72wv5\") pod \"nova-api-0\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") " pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.476989 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.631014 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.661093 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:26:10 crc kubenswrapper[4778]: I0318 09:26:10.938927 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:26:10 crc kubenswrapper[4778]: W0318 09:26:10.948665 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7208979b_1773_4741_8dab_00c621897016.slice/crio-5094a9653f4ef25b7c722e95f4034590d8e493b9399c47f1c2173d365ce1a395 WatchSource:0}: Error finding container 5094a9653f4ef25b7c722e95f4034590d8e493b9399c47f1c2173d365ce1a395: Status 404 returned error can't find the container with id 5094a9653f4ef25b7c722e95f4034590d8e493b9399c47f1c2173d365ce1a395 Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.066421 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52df8ed1-aa17-446a-a3b4-e641f38a409d","Type":"ContainerStarted","Data":"778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a"} Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.069605 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7208979b-1773-4741-8dab-00c621897016","Type":"ContainerStarted","Data":"5094a9653f4ef25b7c722e95f4034590d8e493b9399c47f1c2173d365ce1a395"} Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.093134 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.414157 4778 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-cell-mapping-2nml6"] Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.416674 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.420033 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.420385 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.427904 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-2nml6"] Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.486300 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-scripts\") pod \"nova-cell1-cell-mapping-2nml6\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.486504 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2nml6\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.486608 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kf8g\" (UniqueName: \"kubernetes.io/projected/32eb800e-69e8-4e39-ae5b-74a5eec87b00-kube-api-access-7kf8g\") pod \"nova-cell1-cell-mapping-2nml6\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc 
kubenswrapper[4778]: I0318 09:26:11.486785 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-config-data\") pod \"nova-cell1-cell-mapping-2nml6\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.589240 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2nml6\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.589335 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kf8g\" (UniqueName: \"kubernetes.io/projected/32eb800e-69e8-4e39-ae5b-74a5eec87b00-kube-api-access-7kf8g\") pod \"nova-cell1-cell-mapping-2nml6\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.589413 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-config-data\") pod \"nova-cell1-cell-mapping-2nml6\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.589470 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-scripts\") pod \"nova-cell1-cell-mapping-2nml6\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 
09:26:11.595168 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2nml6\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.596016 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-config-data\") pod \"nova-cell1-cell-mapping-2nml6\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.597521 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-scripts\") pod \"nova-cell1-cell-mapping-2nml6\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.608553 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kf8g\" (UniqueName: \"kubernetes.io/projected/32eb800e-69e8-4e39-ae5b-74a5eec87b00-kube-api-access-7kf8g\") pod \"nova-cell1-cell-mapping-2nml6\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") " pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:11 crc kubenswrapper[4778]: I0318 09:26:11.736495 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2nml6" Mar 18 09:26:12 crc kubenswrapper[4778]: I0318 09:26:12.100337 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7208979b-1773-4741-8dab-00c621897016","Type":"ContainerStarted","Data":"6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f"} Mar 18 09:26:12 crc kubenswrapper[4778]: I0318 09:26:12.102816 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7208979b-1773-4741-8dab-00c621897016","Type":"ContainerStarted","Data":"91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465"} Mar 18 09:26:12 crc kubenswrapper[4778]: I0318 09:26:12.135795 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.135772091 podStartE2EDuration="2.135772091s" podCreationTimestamp="2026-03-18 09:26:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:26:12.119913879 +0000 UTC m=+1438.694658719" watchObservedRunningTime="2026-03-18 09:26:12.135772091 +0000 UTC m=+1438.710516931" Mar 18 09:26:12 crc kubenswrapper[4778]: I0318 09:26:12.251932 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-2nml6"] Mar 18 09:26:12 crc kubenswrapper[4778]: W0318 09:26:12.253436 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32eb800e_69e8_4e39_ae5b_74a5eec87b00.slice/crio-2ebc375f292c2ec65bb43218ef25687c949a64e0554485c66d56f633736ab6dd WatchSource:0}: Error finding container 2ebc375f292c2ec65bb43218ef25687c949a64e0554485c66d56f633736ab6dd: Status 404 returned error can't find the container with id 2ebc375f292c2ec65bb43218ef25687c949a64e0554485c66d56f633736ab6dd Mar 18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.128885 4778 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52df8ed1-aa17-446a-a3b4-e641f38a409d","Type":"ContainerStarted","Data":"2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323"} Mar 18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.131826 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.129591 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="proxy-httpd" containerID="cri-o://2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323" gracePeriod=30 Mar 18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.129687 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="ceilometer-notification-agent" containerID="cri-o://642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19" gracePeriod=30 Mar 18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.129707 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="sg-core" containerID="cri-o://778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a" gracePeriod=30 Mar 18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.129004 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="ceilometer-central-agent" containerID="cri-o://d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5" gracePeriod=30 Mar 18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.133983 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2nml6" 
event={"ID":"32eb800e-69e8-4e39-ae5b-74a5eec87b00","Type":"ContainerStarted","Data":"3858148ddf213daa44ce8f206664d3360023f6d6f91e24bcfce11a24c0f0213c"} Mar 18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.134053 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2nml6" event={"ID":"32eb800e-69e8-4e39-ae5b-74a5eec87b00","Type":"ContainerStarted","Data":"2ebc375f292c2ec65bb43218ef25687c949a64e0554485c66d56f633736ab6dd"} Mar 18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.173733 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.456762556 podStartE2EDuration="6.173715654s" podCreationTimestamp="2026-03-18 09:26:07 +0000 UTC" firstStartedPulling="2026-03-18 09:26:08.131997361 +0000 UTC m=+1434.706742221" lastFinishedPulling="2026-03-18 09:26:11.848950469 +0000 UTC m=+1438.423695319" observedRunningTime="2026-03-18 09:26:13.173463578 +0000 UTC m=+1439.748208448" watchObservedRunningTime="2026-03-18 09:26:13.173715654 +0000 UTC m=+1439.748460504" Mar 18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.201746 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-2nml6" podStartSLOduration=2.201720697 podStartE2EDuration="2.201720697s" podCreationTimestamp="2026-03-18 09:26:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:26:13.198708795 +0000 UTC m=+1439.773453635" watchObservedRunningTime="2026-03-18 09:26:13.201720697 +0000 UTC m=+1439.776465547" Mar 18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.486471 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" Mar 18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.557656 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-f44bz"] Mar 
18 09:26:13 crc kubenswrapper[4778]: I0318 09:26:13.557955 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" podUID="ecb86d82-de0e-474c-9942-a8dff1f8739b" containerName="dnsmasq-dns" containerID="cri-o://e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e" gracePeriod=10 Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.078317 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.160763 4778 generic.go:334] "Generic (PLEG): container finished" podID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerID="2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323" exitCode=0 Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.160809 4778 generic.go:334] "Generic (PLEG): container finished" podID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerID="778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a" exitCode=2 Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.160820 4778 generic.go:334] "Generic (PLEG): container finished" podID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerID="642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19" exitCode=0 Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.160874 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52df8ed1-aa17-446a-a3b4-e641f38a409d","Type":"ContainerDied","Data":"2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323"} Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.160915 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52df8ed1-aa17-446a-a3b4-e641f38a409d","Type":"ContainerDied","Data":"778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a"} Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.160927 4778 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52df8ed1-aa17-446a-a3b4-e641f38a409d","Type":"ContainerDied","Data":"642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19"} Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.163650 4778 generic.go:334] "Generic (PLEG): container finished" podID="ecb86d82-de0e-474c-9942-a8dff1f8739b" containerID="e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e" exitCode=0 Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.165046 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.168077 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" event={"ID":"ecb86d82-de0e-474c-9942-a8dff1f8739b","Type":"ContainerDied","Data":"e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e"} Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.168139 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-f44bz" event={"ID":"ecb86d82-de0e-474c-9942-a8dff1f8739b","Type":"ContainerDied","Data":"a9b184c2a1d038cecffbf8235c70616929fdbf84785e09acda62d2b9837c969e"} Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.168168 4778 scope.go:117] "RemoveContainer" containerID="e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.196340 4778 scope.go:117] "RemoveContainer" containerID="fa41e2a5d696686a307ab26d27ef48f09b474d38859031c88ee8c3c430a37be6" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.226678 4778 scope.go:117] "RemoveContainer" containerID="e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e" Mar 18 09:26:14 crc kubenswrapper[4778]: E0318 09:26:14.227183 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e\": container with ID starting with e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e not found: ID does not exist" containerID="e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.227285 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e"} err="failed to get container status \"e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e\": rpc error: code = NotFound desc = could not find container \"e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e\": container with ID starting with e327fc8583f7e5d02c80e676ea08de283f74931c1c3ee0bc49ee3a5a14aa646e not found: ID does not exist" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.227330 4778 scope.go:117] "RemoveContainer" containerID="fa41e2a5d696686a307ab26d27ef48f09b474d38859031c88ee8c3c430a37be6" Mar 18 09:26:14 crc kubenswrapper[4778]: E0318 09:26:14.228035 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa41e2a5d696686a307ab26d27ef48f09b474d38859031c88ee8c3c430a37be6\": container with ID starting with fa41e2a5d696686a307ab26d27ef48f09b474d38859031c88ee8c3c430a37be6 not found: ID does not exist" containerID="fa41e2a5d696686a307ab26d27ef48f09b474d38859031c88ee8c3c430a37be6" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.228109 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa41e2a5d696686a307ab26d27ef48f09b474d38859031c88ee8c3c430a37be6"} err="failed to get container status \"fa41e2a5d696686a307ab26d27ef48f09b474d38859031c88ee8c3c430a37be6\": rpc error: code = NotFound desc = could not find container \"fa41e2a5d696686a307ab26d27ef48f09b474d38859031c88ee8c3c430a37be6\": container with ID 
starting with fa41e2a5d696686a307ab26d27ef48f09b474d38859031c88ee8c3c430a37be6 not found: ID does not exist" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.248176 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-dns-svc\") pod \"ecb86d82-de0e-474c-9942-a8dff1f8739b\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.248238 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9vgw\" (UniqueName: \"kubernetes.io/projected/ecb86d82-de0e-474c-9942-a8dff1f8739b-kube-api-access-s9vgw\") pod \"ecb86d82-de0e-474c-9942-a8dff1f8739b\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.248330 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-ovsdbserver-nb\") pod \"ecb86d82-de0e-474c-9942-a8dff1f8739b\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.248390 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-ovsdbserver-sb\") pod \"ecb86d82-de0e-474c-9942-a8dff1f8739b\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.248431 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-config\") pod \"ecb86d82-de0e-474c-9942-a8dff1f8739b\" (UID: \"ecb86d82-de0e-474c-9942-a8dff1f8739b\") " Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.256729 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/ecb86d82-de0e-474c-9942-a8dff1f8739b-kube-api-access-s9vgw" (OuterVolumeSpecName: "kube-api-access-s9vgw") pod "ecb86d82-de0e-474c-9942-a8dff1f8739b" (UID: "ecb86d82-de0e-474c-9942-a8dff1f8739b"). InnerVolumeSpecName "kube-api-access-s9vgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.301118 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ecb86d82-de0e-474c-9942-a8dff1f8739b" (UID: "ecb86d82-de0e-474c-9942-a8dff1f8739b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.301858 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ecb86d82-de0e-474c-9942-a8dff1f8739b" (UID: "ecb86d82-de0e-474c-9942-a8dff1f8739b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.313925 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ecb86d82-de0e-474c-9942-a8dff1f8739b" (UID: "ecb86d82-de0e-474c-9942-a8dff1f8739b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.316084 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-config" (OuterVolumeSpecName: "config") pod "ecb86d82-de0e-474c-9942-a8dff1f8739b" (UID: "ecb86d82-de0e-474c-9942-a8dff1f8739b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.350811 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.350845 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.350854 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.350864 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ecb86d82-de0e-474c-9942-a8dff1f8739b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.350877 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9vgw\" (UniqueName: \"kubernetes.io/projected/ecb86d82-de0e-474c-9942-a8dff1f8739b-kube-api-access-s9vgw\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.592916 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-f44bz"] Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.604170 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-f44bz"] Mar 18 09:26:14 crc kubenswrapper[4778]: I0318 09:26:14.875850 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.062687 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-config-data\") pod \"52df8ed1-aa17-446a-a3b4-e641f38a409d\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.062764 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52df8ed1-aa17-446a-a3b4-e641f38a409d-run-httpd\") pod \"52df8ed1-aa17-446a-a3b4-e641f38a409d\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.062860 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns6zx\" (UniqueName: \"kubernetes.io/projected/52df8ed1-aa17-446a-a3b4-e641f38a409d-kube-api-access-ns6zx\") pod \"52df8ed1-aa17-446a-a3b4-e641f38a409d\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.062883 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-combined-ca-bundle\") pod \"52df8ed1-aa17-446a-a3b4-e641f38a409d\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.062927 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52df8ed1-aa17-446a-a3b4-e641f38a409d-log-httpd\") pod \"52df8ed1-aa17-446a-a3b4-e641f38a409d\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.062963 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-scripts\") pod \"52df8ed1-aa17-446a-a3b4-e641f38a409d\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.062993 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-sg-core-conf-yaml\") pod \"52df8ed1-aa17-446a-a3b4-e641f38a409d\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.063017 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-ceilometer-tls-certs\") pod \"52df8ed1-aa17-446a-a3b4-e641f38a409d\" (UID: \"52df8ed1-aa17-446a-a3b4-e641f38a409d\") " Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.063452 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52df8ed1-aa17-446a-a3b4-e641f38a409d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "52df8ed1-aa17-446a-a3b4-e641f38a409d" (UID: "52df8ed1-aa17-446a-a3b4-e641f38a409d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.064082 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52df8ed1-aa17-446a-a3b4-e641f38a409d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "52df8ed1-aa17-446a-a3b4-e641f38a409d" (UID: "52df8ed1-aa17-446a-a3b4-e641f38a409d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.064242 4778 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52df8ed1-aa17-446a-a3b4-e641f38a409d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.072561 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-scripts" (OuterVolumeSpecName: "scripts") pod "52df8ed1-aa17-446a-a3b4-e641f38a409d" (UID: "52df8ed1-aa17-446a-a3b4-e641f38a409d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.073382 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52df8ed1-aa17-446a-a3b4-e641f38a409d-kube-api-access-ns6zx" (OuterVolumeSpecName: "kube-api-access-ns6zx") pod "52df8ed1-aa17-446a-a3b4-e641f38a409d" (UID: "52df8ed1-aa17-446a-a3b4-e641f38a409d"). InnerVolumeSpecName "kube-api-access-ns6zx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.105436 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "52df8ed1-aa17-446a-a3b4-e641f38a409d" (UID: "52df8ed1-aa17-446a-a3b4-e641f38a409d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.140087 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "52df8ed1-aa17-446a-a3b4-e641f38a409d" (UID: "52df8ed1-aa17-446a-a3b4-e641f38a409d"). 
InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.165734 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns6zx\" (UniqueName: \"kubernetes.io/projected/52df8ed1-aa17-446a-a3b4-e641f38a409d-kube-api-access-ns6zx\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.165768 4778 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52df8ed1-aa17-446a-a3b4-e641f38a409d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.165781 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.165793 4778 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.165802 4778 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.183534 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52df8ed1-aa17-446a-a3b4-e641f38a409d" (UID: "52df8ed1-aa17-446a-a3b4-e641f38a409d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.183892 4778 generic.go:334] "Generic (PLEG): container finished" podID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerID="d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5" exitCode=0 Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.183973 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.183971 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52df8ed1-aa17-446a-a3b4-e641f38a409d","Type":"ContainerDied","Data":"d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5"} Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.184080 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52df8ed1-aa17-446a-a3b4-e641f38a409d","Type":"ContainerDied","Data":"ba77b580ef06a1e2bd00469a95b5561856fad5eff21c63e51dda25d025a61698"} Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.184103 4778 scope.go:117] "RemoveContainer" containerID="2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.203442 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-config-data" (OuterVolumeSpecName: "config-data") pod "52df8ed1-aa17-446a-a3b4-e641f38a409d" (UID: "52df8ed1-aa17-446a-a3b4-e641f38a409d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.261873 4778 scope.go:117] "RemoveContainer" containerID="778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.266881 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.266907 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52df8ed1-aa17-446a-a3b4-e641f38a409d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.300130 4778 scope.go:117] "RemoveContainer" containerID="642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.321877 4778 scope.go:117] "RemoveContainer" containerID="d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.357119 4778 scope.go:117] "RemoveContainer" containerID="2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323" Mar 18 09:26:15 crc kubenswrapper[4778]: E0318 09:26:15.357650 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323\": container with ID starting with 2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323 not found: ID does not exist" containerID="2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.357695 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323"} 
err="failed to get container status \"2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323\": rpc error: code = NotFound desc = could not find container \"2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323\": container with ID starting with 2d5739d6136b871e67f41a8e3625767fc50df0ba50a832acad58a42297906323 not found: ID does not exist" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.357721 4778 scope.go:117] "RemoveContainer" containerID="778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a" Mar 18 09:26:15 crc kubenswrapper[4778]: E0318 09:26:15.358759 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a\": container with ID starting with 778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a not found: ID does not exist" containerID="778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.358799 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a"} err="failed to get container status \"778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a\": rpc error: code = NotFound desc = could not find container \"778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a\": container with ID starting with 778ac28594ea0e0e844e24aa460d823a4f3e6192c87811af33d7b3196623b81a not found: ID does not exist" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.358857 4778 scope.go:117] "RemoveContainer" containerID="642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19" Mar 18 09:26:15 crc kubenswrapper[4778]: E0318 09:26:15.359494 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19\": container with ID starting with 642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19 not found: ID does not exist" containerID="642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.359541 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19"} err="failed to get container status \"642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19\": rpc error: code = NotFound desc = could not find container \"642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19\": container with ID starting with 642a24ad5daaace90981711dec3e1ced3af6f6187b91148b030cfbf7a5270c19 not found: ID does not exist" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.359591 4778 scope.go:117] "RemoveContainer" containerID="d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5" Mar 18 09:26:15 crc kubenswrapper[4778]: E0318 09:26:15.361074 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5\": container with ID starting with d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5 not found: ID does not exist" containerID="d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.361103 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5"} err="failed to get container status \"d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5\": rpc error: code = NotFound desc = could not find container \"d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5\": container with ID 
starting with d372433453b79164239a08fcdfdd47280b9a803ee4d62f2d4a73f469b67a17e5 not found: ID does not exist" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.534789 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.546306 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.564491 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:26:15 crc kubenswrapper[4778]: E0318 09:26:15.564994 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecb86d82-de0e-474c-9942-a8dff1f8739b" containerName="init" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.565014 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecb86d82-de0e-474c-9942-a8dff1f8739b" containerName="init" Mar 18 09:26:15 crc kubenswrapper[4778]: E0318 09:26:15.565028 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecb86d82-de0e-474c-9942-a8dff1f8739b" containerName="dnsmasq-dns" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.565036 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecb86d82-de0e-474c-9942-a8dff1f8739b" containerName="dnsmasq-dns" Mar 18 09:26:15 crc kubenswrapper[4778]: E0318 09:26:15.565049 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="proxy-httpd" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.565054 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="proxy-httpd" Mar 18 09:26:15 crc kubenswrapper[4778]: E0318 09:26:15.565072 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="sg-core" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.565078 4778 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="sg-core" Mar 18 09:26:15 crc kubenswrapper[4778]: E0318 09:26:15.565090 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="ceilometer-central-agent" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.565096 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="ceilometer-central-agent" Mar 18 09:26:15 crc kubenswrapper[4778]: E0318 09:26:15.565110 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="ceilometer-notification-agent" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.565118 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="ceilometer-notification-agent" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.565361 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="ceilometer-central-agent" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.565376 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="sg-core" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.565388 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="ceilometer-notification-agent" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.565396 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" containerName="proxy-httpd" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.565410 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecb86d82-de0e-474c-9942-a8dff1f8739b" containerName="dnsmasq-dns" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 
09:26:15.569779 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.574912 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.575068 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.575891 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.622329 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.673707 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-config-data\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.673822 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.673870 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7799h\" (UniqueName: \"kubernetes.io/projected/d6f19125-b59e-49f9-8819-0cad52114840-kube-api-access-7799h\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.673944 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6f19125-b59e-49f9-8819-0cad52114840-run-httpd\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.674001 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.674301 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6f19125-b59e-49f9-8819-0cad52114840-log-httpd\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.674376 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-scripts\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.674535 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.776094 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-scripts\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.776171 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.776218 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-config-data\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.776247 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.776269 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7799h\" (UniqueName: \"kubernetes.io/projected/d6f19125-b59e-49f9-8819-0cad52114840-kube-api-access-7799h\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.776295 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6f19125-b59e-49f9-8819-0cad52114840-run-httpd\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: 
I0318 09:26:15.776318 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.776375 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6f19125-b59e-49f9-8819-0cad52114840-log-httpd\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.776790 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6f19125-b59e-49f9-8819-0cad52114840-log-httpd\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.780370 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6f19125-b59e-49f9-8819-0cad52114840-run-httpd\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.781401 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.782542 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-config-data\") pod \"ceilometer-0\" (UID: 
\"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.784283 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.784625 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.789473 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-scripts\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.800608 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7799h\" (UniqueName: \"kubernetes.io/projected/d6f19125-b59e-49f9-8819-0cad52114840-kube-api-access-7799h\") pod \"ceilometer-0\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " pod="openstack/ceilometer-0" Mar 18 09:26:15 crc kubenswrapper[4778]: I0318 09:26:15.909837 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:26:16 crc kubenswrapper[4778]: I0318 09:26:16.199604 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52df8ed1-aa17-446a-a3b4-e641f38a409d" path="/var/lib/kubelet/pods/52df8ed1-aa17-446a-a3b4-e641f38a409d/volumes" Mar 18 09:26:16 crc kubenswrapper[4778]: I0318 09:26:16.201279 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecb86d82-de0e-474c-9942-a8dff1f8739b" path="/var/lib/kubelet/pods/ecb86d82-de0e-474c-9942-a8dff1f8739b/volumes" Mar 18 09:26:16 crc kubenswrapper[4778]: I0318 09:26:16.397442 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:26:17 crc kubenswrapper[4778]: I0318 09:26:17.212299 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6f19125-b59e-49f9-8819-0cad52114840","Type":"ContainerStarted","Data":"72617aca4aa8400e7af4a60ff63ae0a42982c759ce5821a76119f425b2752b6a"} Mar 18 09:26:17 crc kubenswrapper[4778]: I0318 09:26:17.212382 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6f19125-b59e-49f9-8819-0cad52114840","Type":"ContainerStarted","Data":"3887f968114761a1654028b0a71448b233a4c1f413820ca59edf51160a07bebd"} Mar 18 09:26:18 crc kubenswrapper[4778]: I0318 09:26:18.223962 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6f19125-b59e-49f9-8819-0cad52114840","Type":"ContainerStarted","Data":"d0ab691768395396ab9cd2b17a7818f8101a3f17ea12d03f3697e6b6b5c5a0b8"} Mar 18 09:26:18 crc kubenswrapper[4778]: I0318 09:26:18.226069 4778 generic.go:334] "Generic (PLEG): container finished" podID="32eb800e-69e8-4e39-ae5b-74a5eec87b00" containerID="3858148ddf213daa44ce8f206664d3360023f6d6f91e24bcfce11a24c0f0213c" exitCode=0 Mar 18 09:26:18 crc kubenswrapper[4778]: I0318 09:26:18.226110 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-cell-mapping-2nml6" event={"ID":"32eb800e-69e8-4e39-ae5b-74a5eec87b00","Type":"ContainerDied","Data":"3858148ddf213daa44ce8f206664d3360023f6d6f91e24bcfce11a24c0f0213c"}
Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.308317 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6f19125-b59e-49f9-8819-0cad52114840","Type":"ContainerStarted","Data":"28ea8f018593b4e5ea693b13884510308b8fe6c9f56dc77f633042aa7b3c2142"}
Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.662431 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2nml6"
Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.810016 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-scripts\") pod \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") "
Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.810442 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kf8g\" (UniqueName: \"kubernetes.io/projected/32eb800e-69e8-4e39-ae5b-74a5eec87b00-kube-api-access-7kf8g\") pod \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") "
Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.810989 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-combined-ca-bundle\") pod \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") "
Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.811114 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-config-data\") pod \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\" (UID: \"32eb800e-69e8-4e39-ae5b-74a5eec87b00\") "
Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.817325 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-scripts" (OuterVolumeSpecName: "scripts") pod "32eb800e-69e8-4e39-ae5b-74a5eec87b00" (UID: "32eb800e-69e8-4e39-ae5b-74a5eec87b00"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.820424 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32eb800e-69e8-4e39-ae5b-74a5eec87b00-kube-api-access-7kf8g" (OuterVolumeSpecName: "kube-api-access-7kf8g") pod "32eb800e-69e8-4e39-ae5b-74a5eec87b00" (UID: "32eb800e-69e8-4e39-ae5b-74a5eec87b00"). InnerVolumeSpecName "kube-api-access-7kf8g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.844345 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32eb800e-69e8-4e39-ae5b-74a5eec87b00" (UID: "32eb800e-69e8-4e39-ae5b-74a5eec87b00"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.867408 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-config-data" (OuterVolumeSpecName: "config-data") pod "32eb800e-69e8-4e39-ae5b-74a5eec87b00" (UID: "32eb800e-69e8-4e39-ae5b-74a5eec87b00"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.916065 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kf8g\" (UniqueName: \"kubernetes.io/projected/32eb800e-69e8-4e39-ae5b-74a5eec87b00-kube-api-access-7kf8g\") on node \"crc\" DevicePath \"\""
Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.916122 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.916133 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 09:26:19 crc kubenswrapper[4778]: I0318 09:26:19.916144 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32eb800e-69e8-4e39-ae5b-74a5eec87b00-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.326063 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2nml6" event={"ID":"32eb800e-69e8-4e39-ae5b-74a5eec87b00","Type":"ContainerDied","Data":"2ebc375f292c2ec65bb43218ef25687c949a64e0554485c66d56f633736ab6dd"}
Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.326502 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ebc375f292c2ec65bb43218ef25687c949a64e0554485c66d56f633736ab6dd"
Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.326128 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2nml6"
Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.329407 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6f19125-b59e-49f9-8819-0cad52114840","Type":"ContainerStarted","Data":"345ccf529d04d20baa9e9e06ea792464e3dc120ee1042052fe45190d2b838b33"}
Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.329847 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.471028 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.7986296149999998 podStartE2EDuration="5.470998417s" podCreationTimestamp="2026-03-18 09:26:15 +0000 UTC" firstStartedPulling="2026-03-18 09:26:16.402448062 +0000 UTC m=+1442.977192902" lastFinishedPulling="2026-03-18 09:26:20.074816864 +0000 UTC m=+1446.649561704" observedRunningTime="2026-03-18 09:26:20.35403479 +0000 UTC m=+1446.928779650" watchObservedRunningTime="2026-03-18 09:26:20.470998417 +0000 UTC m=+1447.045743257"
Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.472150 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.472670 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7208979b-1773-4741-8dab-00c621897016" containerName="nova-api-log" containerID="cri-o://91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465" gracePeriod=30
Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.472777 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7208979b-1773-4741-8dab-00c621897016" containerName="nova-api-api" containerID="cri-o://6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f" gracePeriod=30
Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.522157 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.522566 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" containerName="nova-metadata-log" containerID="cri-o://1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2" gracePeriod=30
Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.522716 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" containerName="nova-metadata-metadata" containerID="cri-o://b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1" gracePeriod=30
Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.540272 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 09:26:20 crc kubenswrapper[4778]: I0318 09:26:20.540630 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="dd2218f5-0310-4e4c-8edc-d13c25707ea5" containerName="nova-scheduler-scheduler" containerID="cri-o://cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413" gracePeriod=30
Mar 18 09:26:21 crc kubenswrapper[4778]: E0318 09:26:21.047883 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 18 09:26:21 crc kubenswrapper[4778]: E0318 09:26:21.072750 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 18 09:26:21 crc kubenswrapper[4778]: E0318 09:26:21.077680 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 18 09:26:21 crc kubenswrapper[4778]: E0318 09:26:21.077752 4778 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="dd2218f5-0310-4e4c-8edc-d13c25707ea5" containerName="nova-scheduler-scheduler"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.278578 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.349848 4778 generic.go:334] "Generic (PLEG): container finished" podID="7208979b-1773-4741-8dab-00c621897016" containerID="6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f" exitCode=0
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.349879 4778 generic.go:334] "Generic (PLEG): container finished" podID="7208979b-1773-4741-8dab-00c621897016" containerID="91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465" exitCode=143
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.349909 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7208979b-1773-4741-8dab-00c621897016","Type":"ContainerDied","Data":"6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f"}
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.349950 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.349968 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7208979b-1773-4741-8dab-00c621897016","Type":"ContainerDied","Data":"91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465"}
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.349983 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7208979b-1773-4741-8dab-00c621897016","Type":"ContainerDied","Data":"5094a9653f4ef25b7c722e95f4034590d8e493b9399c47f1c2173d365ce1a395"}
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.350005 4778 scope.go:117] "RemoveContainer" containerID="6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.351800 4778 generic.go:334] "Generic (PLEG): container finished" podID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" containerID="1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2" exitCode=143
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.351946 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e","Type":"ContainerDied","Data":"1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2"}
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.372132 4778 scope.go:117] "RemoveContainer" containerID="91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.391379 4778 scope.go:117] "RemoveContainer" containerID="6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f"
Mar 18 09:26:21 crc kubenswrapper[4778]: E0318 09:26:21.392059 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f\": container with ID starting with 6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f not found: ID does not exist" containerID="6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.392092 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f"} err="failed to get container status \"6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f\": rpc error: code = NotFound desc = could not find container \"6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f\": container with ID starting with 6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f not found: ID does not exist"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.392115 4778 scope.go:117] "RemoveContainer" containerID="91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465"
Mar 18 09:26:21 crc kubenswrapper[4778]: E0318 09:26:21.392806 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465\": container with ID starting with 91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465 not found: ID does not exist" containerID="91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.392868 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465"} err="failed to get container status \"91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465\": rpc error: code = NotFound desc = could not find container \"91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465\": container with ID starting with 91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465 not found: ID does not exist"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.392910 4778 scope.go:117] "RemoveContainer" containerID="6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.393508 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f"} err="failed to get container status \"6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f\": rpc error: code = NotFound desc = could not find container \"6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f\": container with ID starting with 6f9d20607700754788b81370a642cc7806a3f88b3f6d207f961897c9c95f7a7f not found: ID does not exist"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.393555 4778 scope.go:117] "RemoveContainer" containerID="91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.394005 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465"} err="failed to get container status \"91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465\": rpc error: code = NotFound desc = could not find container \"91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465\": container with ID starting with 91e2269bb8dab77613ee947b5a18c795bfc1129b5f63274191efa669473cf465 not found: ID does not exist"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.450881 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-internal-tls-certs\") pod \"7208979b-1773-4741-8dab-00c621897016\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") "
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.451003 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-public-tls-certs\") pod \"7208979b-1773-4741-8dab-00c621897016\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") "
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.451175 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-combined-ca-bundle\") pod \"7208979b-1773-4741-8dab-00c621897016\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") "
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.451256 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7208979b-1773-4741-8dab-00c621897016-logs\") pod \"7208979b-1773-4741-8dab-00c621897016\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") "
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.451342 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-config-data\") pod \"7208979b-1773-4741-8dab-00c621897016\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") "
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.451429 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72wv5\" (UniqueName: \"kubernetes.io/projected/7208979b-1773-4741-8dab-00c621897016-kube-api-access-72wv5\") pod \"7208979b-1773-4741-8dab-00c621897016\" (UID: \"7208979b-1773-4741-8dab-00c621897016\") "
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.452107 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7208979b-1773-4741-8dab-00c621897016-logs" (OuterVolumeSpecName: "logs") pod "7208979b-1773-4741-8dab-00c621897016" (UID: "7208979b-1773-4741-8dab-00c621897016"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.453038 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7208979b-1773-4741-8dab-00c621897016-logs\") on node \"crc\" DevicePath \"\""
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.457563 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7208979b-1773-4741-8dab-00c621897016-kube-api-access-72wv5" (OuterVolumeSpecName: "kube-api-access-72wv5") pod "7208979b-1773-4741-8dab-00c621897016" (UID: "7208979b-1773-4741-8dab-00c621897016"). InnerVolumeSpecName "kube-api-access-72wv5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.480237 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7208979b-1773-4741-8dab-00c621897016" (UID: "7208979b-1773-4741-8dab-00c621897016"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.486190 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-config-data" (OuterVolumeSpecName: "config-data") pod "7208979b-1773-4741-8dab-00c621897016" (UID: "7208979b-1773-4741-8dab-00c621897016"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.516417 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7208979b-1773-4741-8dab-00c621897016" (UID: "7208979b-1773-4741-8dab-00c621897016"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.529799 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7208979b-1773-4741-8dab-00c621897016" (UID: "7208979b-1773-4741-8dab-00c621897016"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.554912 4778 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.554946 4778 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.554954 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.554964 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7208979b-1773-4741-8dab-00c621897016-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.554973 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72wv5\" (UniqueName: \"kubernetes.io/projected/7208979b-1773-4741-8dab-00c621897016-kube-api-access-72wv5\") on node \"crc\" DevicePath \"\""
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.680552 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.695556 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.706984 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 18 09:26:21 crc kubenswrapper[4778]: E0318 09:26:21.707436 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7208979b-1773-4741-8dab-00c621897016" containerName="nova-api-log"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.707454 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7208979b-1773-4741-8dab-00c621897016" containerName="nova-api-log"
Mar 18 09:26:21 crc kubenswrapper[4778]: E0318 09:26:21.707479 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32eb800e-69e8-4e39-ae5b-74a5eec87b00" containerName="nova-manage"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.707487 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="32eb800e-69e8-4e39-ae5b-74a5eec87b00" containerName="nova-manage"
Mar 18 09:26:21 crc kubenswrapper[4778]: E0318 09:26:21.707496 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7208979b-1773-4741-8dab-00c621897016" containerName="nova-api-api"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.707501 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7208979b-1773-4741-8dab-00c621897016" containerName="nova-api-api"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.707711 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7208979b-1773-4741-8dab-00c621897016" containerName="nova-api-log"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.707737 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="32eb800e-69e8-4e39-ae5b-74a5eec87b00" containerName="nova-manage"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.707752 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7208979b-1773-4741-8dab-00c621897016" containerName="nova-api-api"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.708846 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.711392 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.711493 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.711675 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.716035 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.871367 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a702c51-b7a6-4094-9d34-519102e1cf91-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.872346 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a702c51-b7a6-4094-9d34-519102e1cf91-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.872499 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a702c51-b7a6-4094-9d34-519102e1cf91-logs\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.872600 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxhbn\" (UniqueName: \"kubernetes.io/projected/8a702c51-b7a6-4094-9d34-519102e1cf91-kube-api-access-lxhbn\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.872797 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a702c51-b7a6-4094-9d34-519102e1cf91-config-data\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.872926 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a702c51-b7a6-4094-9d34-519102e1cf91-public-tls-certs\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.974904 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a702c51-b7a6-4094-9d34-519102e1cf91-public-tls-certs\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.975455 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a702c51-b7a6-4094-9d34-519102e1cf91-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.975613 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a702c51-b7a6-4094-9d34-519102e1cf91-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.975753 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a702c51-b7a6-4094-9d34-519102e1cf91-logs\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.975875 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxhbn\" (UniqueName: \"kubernetes.io/projected/8a702c51-b7a6-4094-9d34-519102e1cf91-kube-api-access-lxhbn\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.976024 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a702c51-b7a6-4094-9d34-519102e1cf91-config-data\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.976286 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a702c51-b7a6-4094-9d34-519102e1cf91-logs\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.980126 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a702c51-b7a6-4094-9d34-519102e1cf91-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.981222 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a702c51-b7a6-4094-9d34-519102e1cf91-config-data\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.982018 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a702c51-b7a6-4094-9d34-519102e1cf91-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.989733 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a702c51-b7a6-4094-9d34-519102e1cf91-public-tls-certs\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0"
Mar 18 09:26:21 crc kubenswrapper[4778]: I0318 09:26:21.997700 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxhbn\" (UniqueName: \"kubernetes.io/projected/8a702c51-b7a6-4094-9d34-519102e1cf91-kube-api-access-lxhbn\") pod \"nova-api-0\" (UID: \"8a702c51-b7a6-4094-9d34-519102e1cf91\") " pod="openstack/nova-api-0"
Mar 18 09:26:22 crc kubenswrapper[4778]: I0318 09:26:22.077431 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 09:26:22 crc kubenswrapper[4778]: I0318 09:26:22.212116 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7208979b-1773-4741-8dab-00c621897016" path="/var/lib/kubelet/pods/7208979b-1773-4741-8dab-00c621897016/volumes"
Mar 18 09:26:22 crc kubenswrapper[4778]: I0318 09:26:22.544760 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 18 09:26:22 crc kubenswrapper[4778]: W0318 09:26:22.553091 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a702c51_b7a6_4094_9d34_519102e1cf91.slice/crio-7715785cb5d43f547a06695d17c8835cd5212c03e6fa259322ada2a1dd76c24b WatchSource:0}: Error finding container 7715785cb5d43f547a06695d17c8835cd5212c03e6fa259322ada2a1dd76c24b: Status 404 returned error can't find the container with id 7715785cb5d43f547a06695d17c8835cd5212c03e6fa259322ada2a1dd76c24b
Mar 18 09:26:23 crc kubenswrapper[4778]: I0318 09:26:23.382789 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a702c51-b7a6-4094-9d34-519102e1cf91","Type":"ContainerStarted","Data":"4cf481707b208cd4f8da4862b8ab7b0cc662181c8e30447acc8ec31cd4d1bfa4"}
Mar 18 09:26:23 crc kubenswrapper[4778]: I0318 09:26:23.383181 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a702c51-b7a6-4094-9d34-519102e1cf91","Type":"ContainerStarted","Data":"5b977842c7b022e09540c13c0c1623d4faf93b8ba5a2f00023b87208c2df8aa4"}
Mar 18 09:26:23 crc kubenswrapper[4778]: I0318 09:26:23.383220 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a702c51-b7a6-4094-9d34-519102e1cf91","Type":"ContainerStarted","Data":"7715785cb5d43f547a06695d17c8835cd5212c03e6fa259322ada2a1dd76c24b"}
Mar 18 09:26:23 crc kubenswrapper[4778]: I0318 09:26:23.415652 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.415628426 podStartE2EDuration="2.415628426s" podCreationTimestamp="2026-03-18 09:26:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:26:23.403748841 +0000 UTC m=+1449.978493711" watchObservedRunningTime="2026-03-18 09:26:23.415628426 +0000 UTC m=+1449.990373276"
Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.152827 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.224803 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-logs\") pod \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") "
Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.225311 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-logs" (OuterVolumeSpecName: "logs") pod "2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" (UID: "2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.225770 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ljtb\" (UniqueName: \"kubernetes.io/projected/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-kube-api-access-7ljtb\") pod \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") "
Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.225851 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-config-data\") pod \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") "
Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.226354 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-combined-ca-bundle\") pod \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") "
Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.226410 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-nova-metadata-tls-certs\") pod \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\" (UID: \"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e\") "
Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.227419 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-logs\") on node \"crc\" DevicePath \"\""
Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.238036 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-kube-api-access-7ljtb" (OuterVolumeSpecName: "kube-api-access-7ljtb") pod "2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" (UID: "2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e"). InnerVolumeSpecName "kube-api-access-7ljtb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.267760 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" (UID: "2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.294881 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-config-data" (OuterVolumeSpecName: "config-data") pod "2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" (UID: "2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.337133 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ljtb\" (UniqueName: \"kubernetes.io/projected/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-kube-api-access-7ljtb\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.337166 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.337176 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.373645 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" (UID: "2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.394428 4778 generic.go:334] "Generic (PLEG): container finished" podID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" containerID="b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1" exitCode=0 Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.395403 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.397379 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e","Type":"ContainerDied","Data":"b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1"} Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.397426 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e","Type":"ContainerDied","Data":"290b792dbc94b49540da6dec52821c0018cd2340a491c925285465d74334b24e"} Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.397448 4778 scope.go:117] "RemoveContainer" containerID="b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.432952 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.439035 4778 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.441377 4778 scope.go:117] "RemoveContainer" containerID="1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.455056 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.465677 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:26:24 crc kubenswrapper[4778]: E0318 09:26:24.466081 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" containerName="nova-metadata-metadata" Mar 18 09:26:24 crc kubenswrapper[4778]: 
I0318 09:26:24.466106 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" containerName="nova-metadata-metadata" Mar 18 09:26:24 crc kubenswrapper[4778]: E0318 09:26:24.466115 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" containerName="nova-metadata-log" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.466121 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" containerName="nova-metadata-log" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.466306 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" containerName="nova-metadata-log" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.466321 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" containerName="nova-metadata-metadata" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.467251 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.473317 4778 scope.go:117] "RemoveContainer" containerID="b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.473734 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.473791 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 09:26:24 crc kubenswrapper[4778]: E0318 09:26:24.474207 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1\": container with ID starting with b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1 not found: ID does not exist" containerID="b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.474241 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1"} err="failed to get container status \"b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1\": rpc error: code = NotFound desc = could not find container \"b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1\": container with ID starting with b5b91cc4190125f9b14c1253a9363421456fa8936df8639c9caa224025f25cf1 not found: ID does not exist" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.474260 4778 scope.go:117] "RemoveContainer" containerID="1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2" Mar 18 09:26:24 crc kubenswrapper[4778]: E0318 09:26:24.474918 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2\": container with ID starting with 1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2 not found: ID does not exist" containerID="1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.474941 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2"} err="failed to get container status \"1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2\": rpc error: code = NotFound desc = could not find container \"1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2\": container with ID starting with 1dd5bf6bf4082a825b2076d19f6dfb792b399fc297c1934b2930da60d1c3b5a2 not found: ID does not exist" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.482320 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.642777 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f01ca6-f7d2-4de3-9aa9-256803533b80-config-data\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.642862 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/28f01ca6-f7d2-4de3-9aa9-256803533b80-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.642932 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx5lz\" 
(UniqueName: \"kubernetes.io/projected/28f01ca6-f7d2-4de3-9aa9-256803533b80-kube-api-access-xx5lz\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.642986 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f01ca6-f7d2-4de3-9aa9-256803533b80-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.643049 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28f01ca6-f7d2-4de3-9aa9-256803533b80-logs\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.744382 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f01ca6-f7d2-4de3-9aa9-256803533b80-config-data\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.744439 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/28f01ca6-f7d2-4de3-9aa9-256803533b80-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.744478 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx5lz\" (UniqueName: \"kubernetes.io/projected/28f01ca6-f7d2-4de3-9aa9-256803533b80-kube-api-access-xx5lz\") pod \"nova-metadata-0\" (UID: 
\"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.744507 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f01ca6-f7d2-4de3-9aa9-256803533b80-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.744540 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28f01ca6-f7d2-4de3-9aa9-256803533b80-logs\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.745002 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28f01ca6-f7d2-4de3-9aa9-256803533b80-logs\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.749956 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f01ca6-f7d2-4de3-9aa9-256803533b80-config-data\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.750471 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/28f01ca6-f7d2-4de3-9aa9-256803533b80-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.751615 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/28f01ca6-f7d2-4de3-9aa9-256803533b80-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.764891 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx5lz\" (UniqueName: \"kubernetes.io/projected/28f01ca6-f7d2-4de3-9aa9-256803533b80-kube-api-access-xx5lz\") pod \"nova-metadata-0\" (UID: \"28f01ca6-f7d2-4de3-9aa9-256803533b80\") " pod="openstack/nova-metadata-0" Mar 18 09:26:24 crc kubenswrapper[4778]: I0318 09:26:24.796487 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.277283 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.356983 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl9dz\" (UniqueName: \"kubernetes.io/projected/dd2218f5-0310-4e4c-8edc-d13c25707ea5-kube-api-access-vl9dz\") pod \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\" (UID: \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\") " Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.357071 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2218f5-0310-4e4c-8edc-d13c25707ea5-config-data\") pod \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\" (UID: \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\") " Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.357135 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2218f5-0310-4e4c-8edc-d13c25707ea5-combined-ca-bundle\") pod \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\" (UID: \"dd2218f5-0310-4e4c-8edc-d13c25707ea5\") " Mar 18 
09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.404826 4778 generic.go:334] "Generic (PLEG): container finished" podID="dd2218f5-0310-4e4c-8edc-d13c25707ea5" containerID="cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413" exitCode=0 Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.404887 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dd2218f5-0310-4e4c-8edc-d13c25707ea5","Type":"ContainerDied","Data":"cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413"} Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.404917 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dd2218f5-0310-4e4c-8edc-d13c25707ea5","Type":"ContainerDied","Data":"b346fc3122696affb646e267887f25b4c0332b9b1cc46fa78a99f5366632ff21"} Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.404933 4778 scope.go:117] "RemoveContainer" containerID="cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.404961 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.411642 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd2218f5-0310-4e4c-8edc-d13c25707ea5-kube-api-access-vl9dz" (OuterVolumeSpecName: "kube-api-access-vl9dz") pod "dd2218f5-0310-4e4c-8edc-d13c25707ea5" (UID: "dd2218f5-0310-4e4c-8edc-d13c25707ea5"). InnerVolumeSpecName "kube-api-access-vl9dz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.416660 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd2218f5-0310-4e4c-8edc-d13c25707ea5-config-data" (OuterVolumeSpecName: "config-data") pod "dd2218f5-0310-4e4c-8edc-d13c25707ea5" (UID: "dd2218f5-0310-4e4c-8edc-d13c25707ea5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.417005 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd2218f5-0310-4e4c-8edc-d13c25707ea5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd2218f5-0310-4e4c-8edc-d13c25707ea5" (UID: "dd2218f5-0310-4e4c-8edc-d13c25707ea5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.435841 4778 scope.go:117] "RemoveContainer" containerID="cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413" Mar 18 09:26:25 crc kubenswrapper[4778]: E0318 09:26:25.442772 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413\": container with ID starting with cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413 not found: ID does not exist" containerID="cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.442877 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413"} err="failed to get container status \"cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413\": rpc error: code = NotFound desc = could not find container 
\"cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413\": container with ID starting with cfc0918450fc9a50162d3d3d50000ac14e273b163d0cfed7ac25065871803413 not found: ID does not exist" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.470036 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2218f5-0310-4e4c-8edc-d13c25707ea5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.470092 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vl9dz\" (UniqueName: \"kubernetes.io/projected/dd2218f5-0310-4e4c-8edc-d13c25707ea5-kube-api-access-vl9dz\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.470112 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2218f5-0310-4e4c-8edc-d13c25707ea5-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.472639 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.767233 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.781792 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.791831 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:26:25 crc kubenswrapper[4778]: E0318 09:26:25.792404 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd2218f5-0310-4e4c-8edc-d13c25707ea5" containerName="nova-scheduler-scheduler" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.792432 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd2218f5-0310-4e4c-8edc-d13c25707ea5" 
containerName="nova-scheduler-scheduler" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.792624 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd2218f5-0310-4e4c-8edc-d13c25707ea5" containerName="nova-scheduler-scheduler" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.793563 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.796801 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.807047 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.878130 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1623d1-2084-419e-b36a-80930113a280-config-data\") pod \"nova-scheduler-0\" (UID: \"9b1623d1-2084-419e-b36a-80930113a280\") " pod="openstack/nova-scheduler-0" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.878486 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1623d1-2084-419e-b36a-80930113a280-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9b1623d1-2084-419e-b36a-80930113a280\") " pod="openstack/nova-scheduler-0" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.878713 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpq4z\" (UniqueName: \"kubernetes.io/projected/9b1623d1-2084-419e-b36a-80930113a280-kube-api-access-mpq4z\") pod \"nova-scheduler-0\" (UID: \"9b1623d1-2084-419e-b36a-80930113a280\") " pod="openstack/nova-scheduler-0" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.981342 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1623d1-2084-419e-b36a-80930113a280-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9b1623d1-2084-419e-b36a-80930113a280\") " pod="openstack/nova-scheduler-0" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.982075 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpq4z\" (UniqueName: \"kubernetes.io/projected/9b1623d1-2084-419e-b36a-80930113a280-kube-api-access-mpq4z\") pod \"nova-scheduler-0\" (UID: \"9b1623d1-2084-419e-b36a-80930113a280\") " pod="openstack/nova-scheduler-0" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.982289 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1623d1-2084-419e-b36a-80930113a280-config-data\") pod \"nova-scheduler-0\" (UID: \"9b1623d1-2084-419e-b36a-80930113a280\") " pod="openstack/nova-scheduler-0" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.986516 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1623d1-2084-419e-b36a-80930113a280-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9b1623d1-2084-419e-b36a-80930113a280\") " pod="openstack/nova-scheduler-0" Mar 18 09:26:25 crc kubenswrapper[4778]: I0318 09:26:25.987537 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1623d1-2084-419e-b36a-80930113a280-config-data\") pod \"nova-scheduler-0\" (UID: \"9b1623d1-2084-419e-b36a-80930113a280\") " pod="openstack/nova-scheduler-0" Mar 18 09:26:26 crc kubenswrapper[4778]: I0318 09:26:26.008914 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpq4z\" (UniqueName: 
\"kubernetes.io/projected/9b1623d1-2084-419e-b36a-80930113a280-kube-api-access-mpq4z\") pod \"nova-scheduler-0\" (UID: \"9b1623d1-2084-419e-b36a-80930113a280\") " pod="openstack/nova-scheduler-0" Mar 18 09:26:26 crc kubenswrapper[4778]: I0318 09:26:26.110371 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 09:26:26 crc kubenswrapper[4778]: I0318 09:26:26.209171 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e" path="/var/lib/kubelet/pods/2e0b68ca-5b25-4377-81f6-c4ccd5d7b11e/volumes" Mar 18 09:26:26 crc kubenswrapper[4778]: I0318 09:26:26.210353 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd2218f5-0310-4e4c-8edc-d13c25707ea5" path="/var/lib/kubelet/pods/dd2218f5-0310-4e4c-8edc-d13c25707ea5/volumes" Mar 18 09:26:26 crc kubenswrapper[4778]: I0318 09:26:26.423221 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"28f01ca6-f7d2-4de3-9aa9-256803533b80","Type":"ContainerStarted","Data":"cb52cc57268289cd695bd462776bcac79b728c35c7228ef7b1b59ad2d9aa1d33"} Mar 18 09:26:26 crc kubenswrapper[4778]: I0318 09:26:26.423610 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"28f01ca6-f7d2-4de3-9aa9-256803533b80","Type":"ContainerStarted","Data":"45f9eb07f3611369455207e143d7f684b92e8fb675c65c647a3e90225a551a0f"} Mar 18 09:26:26 crc kubenswrapper[4778]: I0318 09:26:26.423623 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"28f01ca6-f7d2-4de3-9aa9-256803533b80","Type":"ContainerStarted","Data":"2c451f78461244dc8809e7e9b61abbdcb51053b49e02275879d48a3bf93a2146"} Mar 18 09:26:26 crc kubenswrapper[4778]: I0318 09:26:26.443752 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.443732519 
podStartE2EDuration="2.443732519s" podCreationTimestamp="2026-03-18 09:26:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:26:26.441629371 +0000 UTC m=+1453.016374211" watchObservedRunningTime="2026-03-18 09:26:26.443732519 +0000 UTC m=+1453.018477359" Mar 18 09:26:26 crc kubenswrapper[4778]: I0318 09:26:26.629563 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:26:27 crc kubenswrapper[4778]: I0318 09:26:27.436161 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9b1623d1-2084-419e-b36a-80930113a280","Type":"ContainerStarted","Data":"ac69936b9763ba949a97946cdc99dfe2182bce7d03dd02c1f5ec30e63347dd8e"} Mar 18 09:26:27 crc kubenswrapper[4778]: I0318 09:26:27.436731 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9b1623d1-2084-419e-b36a-80930113a280","Type":"ContainerStarted","Data":"a5bf04c00c907bf04465611a4b915a3a63e88e8b73433bf7e23c0b7d7f8aebb9"} Mar 18 09:26:27 crc kubenswrapper[4778]: I0318 09:26:27.469005 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.4689710160000002 podStartE2EDuration="2.468971016s" podCreationTimestamp="2026-03-18 09:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:26:27.453574786 +0000 UTC m=+1454.028319636" watchObservedRunningTime="2026-03-18 09:26:27.468971016 +0000 UTC m=+1454.043715866" Mar 18 09:26:28 crc kubenswrapper[4778]: I0318 09:26:28.068347 4778 scope.go:117] "RemoveContainer" containerID="3c3567d850d5fbfcade4077c9139b7f651174e9261a4f7a1ab2f40e22fce3000" Mar 18 09:26:30 crc kubenswrapper[4778]: I0318 09:26:30.147627 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 09:26:30 crc kubenswrapper[4778]: I0318 09:26:30.148112 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 09:26:30 crc kubenswrapper[4778]: I0318 09:26:30.148181 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7"
Mar 18 09:26:30 crc kubenswrapper[4778]: I0318 09:26:30.149367 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"84af870879787d3b66c88494739078983637b86ff0a5bba3998c98c4487f8853"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 09:26:30 crc kubenswrapper[4778]: I0318 09:26:30.149465 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://84af870879787d3b66c88494739078983637b86ff0a5bba3998c98c4487f8853" gracePeriod=600
Mar 18 09:26:30 crc kubenswrapper[4778]: I0318 09:26:30.473740 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="84af870879787d3b66c88494739078983637b86ff0a5bba3998c98c4487f8853" exitCode=0
Mar 18 09:26:30 crc kubenswrapper[4778]: I0318 09:26:30.473841 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"84af870879787d3b66c88494739078983637b86ff0a5bba3998c98c4487f8853"}
Mar 18 09:26:30 crc kubenswrapper[4778]: I0318 09:26:30.474121 4778 scope.go:117] "RemoveContainer" containerID="7154b13c7f4cd2402d0304ed7b86d22cac3e7b544f6222d5ad0d8d8ac0463fe2"
Mar 18 09:26:31 crc kubenswrapper[4778]: I0318 09:26:31.111588 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 18 09:26:31 crc kubenswrapper[4778]: I0318 09:26:31.491173 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492"}
Mar 18 09:26:32 crc kubenswrapper[4778]: I0318 09:26:32.078188 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 18 09:26:32 crc kubenswrapper[4778]: I0318 09:26:32.078555 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 18 09:26:33 crc kubenswrapper[4778]: I0318 09:26:33.099620 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8a702c51-b7a6-4094-9d34-519102e1cf91" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 09:26:33 crc kubenswrapper[4778]: I0318 09:26:33.099622 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8a702c51-b7a6-4094-9d34-519102e1cf91" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 09:26:34 crc kubenswrapper[4778]: I0318 09:26:34.797795 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 18 09:26:34 crc kubenswrapper[4778]: I0318 09:26:34.798335 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 18 09:26:35 crc kubenswrapper[4778]: I0318 09:26:35.810507 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="28f01ca6-f7d2-4de3-9aa9-256803533b80" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 09:26:35 crc kubenswrapper[4778]: I0318 09:26:35.810531 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="28f01ca6-f7d2-4de3-9aa9-256803533b80" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 09:26:36 crc kubenswrapper[4778]: I0318 09:26:36.111617 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 18 09:26:36 crc kubenswrapper[4778]: I0318 09:26:36.137800 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 18 09:26:36 crc kubenswrapper[4778]: I0318 09:26:36.585192 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 18 09:26:40 crc kubenswrapper[4778]: I0318 09:26:40.077725 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 18 09:26:40 crc kubenswrapper[4778]: I0318 09:26:40.078692 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 18 09:26:42 crc kubenswrapper[4778]: I0318 09:26:42.089769 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 18 09:26:42 crc kubenswrapper[4778]: I0318 09:26:42.092167 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 18 09:26:42 crc kubenswrapper[4778]: I0318 09:26:42.100599 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 18 09:26:42 crc kubenswrapper[4778]: I0318 09:26:42.621774 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 18 09:26:42 crc kubenswrapper[4778]: I0318 09:26:42.797403 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 18 09:26:42 crc kubenswrapper[4778]: I0318 09:26:42.797453 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 18 09:26:44 crc kubenswrapper[4778]: I0318 09:26:44.807359 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 18 09:26:44 crc kubenswrapper[4778]: I0318 09:26:44.809771 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 18 09:26:44 crc kubenswrapper[4778]: I0318 09:26:44.819428 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 18 09:26:45 crc kubenswrapper[4778]: I0318 09:26:45.655123 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 18 09:26:45 crc kubenswrapper[4778]: I0318 09:26:45.921825 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 18 09:26:54 crc kubenswrapper[4778]: I0318 09:26:54.409275 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 18 09:26:56 crc kubenswrapper[4778]: I0318 09:26:56.204422 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 18 09:26:58 crc kubenswrapper[4778]: I0318 09:26:58.500408 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="57955df9-f0c5-4cfc-91fd-135771be7ed2" containerName="rabbitmq" containerID="cri-o://29a1b3eb3844657a811126fd222acecd2b3045c5c60889965b169deb2aec2c9a" gracePeriod=604796
Mar 18 09:27:00 crc kubenswrapper[4778]: I0318 09:27:00.078835 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="57955df9-f0c5-4cfc-91fd-135771be7ed2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused"
Mar 18 09:27:00 crc kubenswrapper[4778]: I0318 09:27:00.707224 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="671fe1be-f3dd-475e-8c48-a1d1db510aef" containerName="rabbitmq" containerID="cri-o://4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48" gracePeriod=604796
Mar 18 09:27:04 crc kubenswrapper[4778]: I0318 09:27:04.882729 4778 generic.go:334] "Generic (PLEG): container finished" podID="57955df9-f0c5-4cfc-91fd-135771be7ed2" containerID="29a1b3eb3844657a811126fd222acecd2b3045c5c60889965b169deb2aec2c9a" exitCode=0
Mar 18 09:27:04 crc kubenswrapper[4778]: I0318 09:27:04.882840 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"57955df9-f0c5-4cfc-91fd-135771be7ed2","Type":"ContainerDied","Data":"29a1b3eb3844657a811126fd222acecd2b3045c5c60889965b169deb2aec2c9a"}
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.148126 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.173703 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-config-data\") pod \"57955df9-f0c5-4cfc-91fd-135771be7ed2\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") "
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.173787 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-confd\") pod \"57955df9-f0c5-4cfc-91fd-135771be7ed2\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") "
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.173824 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-plugins\") pod \"57955df9-f0c5-4cfc-91fd-135771be7ed2\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") "
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.173859 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-plugins-conf\") pod \"57955df9-f0c5-4cfc-91fd-135771be7ed2\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") "
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.173883 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-tls\") pod \"57955df9-f0c5-4cfc-91fd-135771be7ed2\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") "
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.173925 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trc2k\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-kube-api-access-trc2k\") pod \"57955df9-f0c5-4cfc-91fd-135771be7ed2\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") "
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.174039 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/57955df9-f0c5-4cfc-91fd-135771be7ed2-pod-info\") pod \"57955df9-f0c5-4cfc-91fd-135771be7ed2\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") "
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.174114 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"57955df9-f0c5-4cfc-91fd-135771be7ed2\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") "
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.174144 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-erlang-cookie\") pod \"57955df9-f0c5-4cfc-91fd-135771be7ed2\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") "
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.174177 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/57955df9-f0c5-4cfc-91fd-135771be7ed2-erlang-cookie-secret\") pod \"57955df9-f0c5-4cfc-91fd-135771be7ed2\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") "
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.174268 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-server-conf\") pod \"57955df9-f0c5-4cfc-91fd-135771be7ed2\" (UID: \"57955df9-f0c5-4cfc-91fd-135771be7ed2\") "
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.184576 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "57955df9-f0c5-4cfc-91fd-135771be7ed2" (UID: "57955df9-f0c5-4cfc-91fd-135771be7ed2"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.184905 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "57955df9-f0c5-4cfc-91fd-135771be7ed2" (UID: "57955df9-f0c5-4cfc-91fd-135771be7ed2"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.185809 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "57955df9-f0c5-4cfc-91fd-135771be7ed2" (UID: "57955df9-f0c5-4cfc-91fd-135771be7ed2"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.186449 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "57955df9-f0c5-4cfc-91fd-135771be7ed2" (UID: "57955df9-f0c5-4cfc-91fd-135771be7ed2"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.187140 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "57955df9-f0c5-4cfc-91fd-135771be7ed2" (UID: "57955df9-f0c5-4cfc-91fd-135771be7ed2"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.193173 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-kube-api-access-trc2k" (OuterVolumeSpecName: "kube-api-access-trc2k") pod "57955df9-f0c5-4cfc-91fd-135771be7ed2" (UID: "57955df9-f0c5-4cfc-91fd-135771be7ed2"). InnerVolumeSpecName "kube-api-access-trc2k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.198028 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57955df9-f0c5-4cfc-91fd-135771be7ed2-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "57955df9-f0c5-4cfc-91fd-135771be7ed2" (UID: "57955df9-f0c5-4cfc-91fd-135771be7ed2"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.204943 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/57955df9-f0c5-4cfc-91fd-135771be7ed2-pod-info" (OuterVolumeSpecName: "pod-info") pod "57955df9-f0c5-4cfc-91fd-135771be7ed2" (UID: "57955df9-f0c5-4cfc-91fd-135771be7ed2"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.227323 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-config-data" (OuterVolumeSpecName: "config-data") pod "57955df9-f0c5-4cfc-91fd-135771be7ed2" (UID: "57955df9-f0c5-4cfc-91fd-135771be7ed2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.276847 4778 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.276880 4778 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.276890 4778 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/57955df9-f0c5-4cfc-91fd-135771be7ed2-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.276899 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.276908 4778 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.276917 4778 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.276926 4778 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.276935 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trc2k\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-kube-api-access-trc2k\") on node \"crc\" DevicePath \"\""
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.276944 4778 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/57955df9-f0c5-4cfc-91fd-135771be7ed2-pod-info\") on node \"crc\" DevicePath \"\""
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.279993 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-server-conf" (OuterVolumeSpecName: "server-conf") pod "57955df9-f0c5-4cfc-91fd-135771be7ed2" (UID: "57955df9-f0c5-4cfc-91fd-135771be7ed2"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.300945 4778 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.366783 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "57955df9-f0c5-4cfc-91fd-135771be7ed2" (UID: "57955df9-f0c5-4cfc-91fd-135771be7ed2"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.378512 4778 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/57955df9-f0c5-4cfc-91fd-135771be7ed2-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.378543 4778 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.378553 4778 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/57955df9-f0c5-4cfc-91fd-135771be7ed2-server-conf\") on node \"crc\" DevicePath \"\""
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.894913 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"57955df9-f0c5-4cfc-91fd-135771be7ed2","Type":"ContainerDied","Data":"9908701ec8146f186284a44b30a5e2c02918471c220798a76b04c8de271c7850"}
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.895500 4778 scope.go:117] "RemoveContainer" containerID="29a1b3eb3844657a811126fd222acecd2b3045c5c60889965b169deb2aec2c9a"
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.894956 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.942410 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.951782 4778 scope.go:117] "RemoveContainer" containerID="3698e124eba53a15e3f16dfe6346805545443e4b8ce94d12254d508326508979"
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.952364 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.987786 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 18 09:27:05 crc kubenswrapper[4778]: E0318 09:27:05.988217 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57955df9-f0c5-4cfc-91fd-135771be7ed2" containerName="rabbitmq"
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.988240 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="57955df9-f0c5-4cfc-91fd-135771be7ed2" containerName="rabbitmq"
Mar 18 09:27:05 crc kubenswrapper[4778]: E0318 09:27:05.988287 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57955df9-f0c5-4cfc-91fd-135771be7ed2" containerName="setup-container"
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.988297 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="57955df9-f0c5-4cfc-91fd-135771be7ed2" containerName="setup-container"
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.988511 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="57955df9-f0c5-4cfc-91fd-135771be7ed2" containerName="rabbitmq"
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.989751 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.994289 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.994761 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.996144 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.996365 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.996144 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.998427 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-b2npt"
Mar 18 09:27:05 crc kubenswrapper[4778]: I0318 09:27:05.998630 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.011767 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.191471 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0191a745-1fe2-4a1c-b007-96525ad39787-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.191519 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0191a745-1fe2-4a1c-b007-96525ad39787-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.191583 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0191a745-1fe2-4a1c-b007-96525ad39787-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.191609 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0191a745-1fe2-4a1c-b007-96525ad39787-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.191648 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.191676 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0191a745-1fe2-4a1c-b007-96525ad39787-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.191697 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0191a745-1fe2-4a1c-b007-96525ad39787-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.191714 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0191a745-1fe2-4a1c-b007-96525ad39787-config-data\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.191742 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0191a745-1fe2-4a1c-b007-96525ad39787-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.192043 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0191a745-1fe2-4a1c-b007-96525ad39787-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.192223 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v4n2\" (UniqueName: \"kubernetes.io/projected/0191a745-1fe2-4a1c-b007-96525ad39787-kube-api-access-9v4n2\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.201249 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57955df9-f0c5-4cfc-91fd-135771be7ed2" path="/var/lib/kubelet/pods/57955df9-f0c5-4cfc-91fd-135771be7ed2/volumes"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.294478 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0191a745-1fe2-4a1c-b007-96525ad39787-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.294549 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0191a745-1fe2-4a1c-b007-96525ad39787-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.294607 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0191a745-1fe2-4a1c-b007-96525ad39787-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.294645 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0191a745-1fe2-4a1c-b007-96525ad39787-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.294674 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.294711 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0191a745-1fe2-4a1c-b007-96525ad39787-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.294736 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0191a745-1fe2-4a1c-b007-96525ad39787-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.294759 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0191a745-1fe2-4a1c-b007-96525ad39787-config-data\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.294795 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0191a745-1fe2-4a1c-b007-96525ad39787-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.294855 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0191a745-1fe2-4a1c-b007-96525ad39787-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.294922 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v4n2\" (UniqueName: \"kubernetes.io/projected/0191a745-1fe2-4a1c-b007-96525ad39787-kube-api-access-9v4n2\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.295698 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0191a745-1fe2-4a1c-b007-96525ad39787-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.295945 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0191a745-1fe2-4a1c-b007-96525ad39787-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.296489 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0191a745-1fe2-4a1c-b007-96525ad39787-config-data\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.296690 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.296864 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0191a745-1fe2-4a1c-b007-96525ad39787-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.297588 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0191a745-1fe2-4a1c-b007-96525ad39787-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.300268 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0191a745-1fe2-4a1c-b007-96525ad39787-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.304233 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0191a745-1fe2-4a1c-b007-96525ad39787-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.304527 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0191a745-1fe2-4a1c-b007-96525ad39787-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.306974 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0191a745-1fe2-4a1c-b007-96525ad39787-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0"
Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.322216 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v4n2\" (UniqueName: \"kubernetes.io/projected/0191a745-1fe2-4a1c-b007-96525ad39787-kube-api-access-9v4n2\") pod
\"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.332258 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"0191a745-1fe2-4a1c-b007-96525ad39787\") " pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.358644 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 09:27:06 crc kubenswrapper[4778]: I0318 09:27:06.883714 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.258074 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.417781 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/671fe1be-f3dd-475e-8c48-a1d1db510aef-erlang-cookie-secret\") pod \"671fe1be-f3dd-475e-8c48-a1d1db510aef\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.417899 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"671fe1be-f3dd-475e-8c48-a1d1db510aef\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.417946 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/671fe1be-f3dd-475e-8c48-a1d1db510aef-pod-info\") pod \"671fe1be-f3dd-475e-8c48-a1d1db510aef\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " Mar 18 09:27:07 
crc kubenswrapper[4778]: I0318 09:27:07.418008 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-erlang-cookie\") pod \"671fe1be-f3dd-475e-8c48-a1d1db510aef\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.418028 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-plugins\") pod \"671fe1be-f3dd-475e-8c48-a1d1db510aef\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.418049 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-config-data\") pod \"671fe1be-f3dd-475e-8c48-a1d1db510aef\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.418071 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-tls\") pod \"671fe1be-f3dd-475e-8c48-a1d1db510aef\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.418106 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-confd\") pod \"671fe1be-f3dd-475e-8c48-a1d1db510aef\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.418146 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhk6k\" (UniqueName: 
\"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-kube-api-access-mhk6k\") pod \"671fe1be-f3dd-475e-8c48-a1d1db510aef\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.418175 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-plugins-conf\") pod \"671fe1be-f3dd-475e-8c48-a1d1db510aef\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.418191 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-server-conf\") pod \"671fe1be-f3dd-475e-8c48-a1d1db510aef\" (UID: \"671fe1be-f3dd-475e-8c48-a1d1db510aef\") " Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.418909 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "671fe1be-f3dd-475e-8c48-a1d1db510aef" (UID: "671fe1be-f3dd-475e-8c48-a1d1db510aef"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.420176 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "671fe1be-f3dd-475e-8c48-a1d1db510aef" (UID: "671fe1be-f3dd-475e-8c48-a1d1db510aef"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.420691 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "671fe1be-f3dd-475e-8c48-a1d1db510aef" (UID: "671fe1be-f3dd-475e-8c48-a1d1db510aef"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.424168 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "671fe1be-f3dd-475e-8c48-a1d1db510aef" (UID: "671fe1be-f3dd-475e-8c48-a1d1db510aef"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.425781 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671fe1be-f3dd-475e-8c48-a1d1db510aef-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "671fe1be-f3dd-475e-8c48-a1d1db510aef" (UID: "671fe1be-f3dd-475e-8c48-a1d1db510aef"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.425796 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "671fe1be-f3dd-475e-8c48-a1d1db510aef" (UID: "671fe1be-f3dd-475e-8c48-a1d1db510aef"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.427331 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-kube-api-access-mhk6k" (OuterVolumeSpecName: "kube-api-access-mhk6k") pod "671fe1be-f3dd-475e-8c48-a1d1db510aef" (UID: "671fe1be-f3dd-475e-8c48-a1d1db510aef"). InnerVolumeSpecName "kube-api-access-mhk6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.429439 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/671fe1be-f3dd-475e-8c48-a1d1db510aef-pod-info" (OuterVolumeSpecName: "pod-info") pod "671fe1be-f3dd-475e-8c48-a1d1db510aef" (UID: "671fe1be-f3dd-475e-8c48-a1d1db510aef"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.463849 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-config-data" (OuterVolumeSpecName: "config-data") pod "671fe1be-f3dd-475e-8c48-a1d1db510aef" (UID: "671fe1be-f3dd-475e-8c48-a1d1db510aef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.496909 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-server-conf" (OuterVolumeSpecName: "server-conf") pod "671fe1be-f3dd-475e-8c48-a1d1db510aef" (UID: "671fe1be-f3dd-475e-8c48-a1d1db510aef"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.520451 4778 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/671fe1be-f3dd-475e-8c48-a1d1db510aef-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.520504 4778 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.520515 4778 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/671fe1be-f3dd-475e-8c48-a1d1db510aef-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.520538 4778 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.520549 4778 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.520560 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.520568 4778 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.520576 4778 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhk6k\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-kube-api-access-mhk6k\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.520587 4778 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.520596 4778 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/671fe1be-f3dd-475e-8c48-a1d1db510aef-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.540680 4778 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.554779 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "671fe1be-f3dd-475e-8c48-a1d1db510aef" (UID: "671fe1be-f3dd-475e-8c48-a1d1db510aef"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.622565 4778 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.622949 4778 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/671fe1be-f3dd-475e-8c48-a1d1db510aef-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.913214 4778 generic.go:334] "Generic (PLEG): container finished" podID="671fe1be-f3dd-475e-8c48-a1d1db510aef" containerID="4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48" exitCode=0 Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.913790 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"671fe1be-f3dd-475e-8c48-a1d1db510aef","Type":"ContainerDied","Data":"4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48"} Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.913887 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"671fe1be-f3dd-475e-8c48-a1d1db510aef","Type":"ContainerDied","Data":"fd11d5ce698fa7411014a038329cc43022813b8f74df72dacf0e69cfc2473b23"} Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.913962 4778 scope.go:117] "RemoveContainer" containerID="4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.914132 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:07 crc kubenswrapper[4778]: I0318 09:27:07.933796 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0191a745-1fe2-4a1c-b007-96525ad39787","Type":"ContainerStarted","Data":"75903085f19cb6ef05c70fe9495d00471dd6c2b8a106f236b36f593571d1a81d"} Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.071586 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.083036 4778 scope.go:117] "RemoveContainer" containerID="fe05be7f27af52f386341c802d98dc00dc2ec8f77ecc656619edc3c4420a9be7" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.090861 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 09:27:08 crc kubenswrapper[4778]: E0318 09:27:08.107830 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod671fe1be_f3dd_475e_8c48_a1d1db510aef.slice/crio-fd11d5ce698fa7411014a038329cc43022813b8f74df72dacf0e69cfc2473b23\": RecentStats: unable to find data in memory cache]" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.109845 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 09:27:08 crc kubenswrapper[4778]: E0318 09:27:08.110376 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="671fe1be-f3dd-475e-8c48-a1d1db510aef" containerName="rabbitmq" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.110397 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="671fe1be-f3dd-475e-8c48-a1d1db510aef" containerName="rabbitmq" Mar 18 09:27:08 crc kubenswrapper[4778]: E0318 09:27:08.110415 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="671fe1be-f3dd-475e-8c48-a1d1db510aef" 
containerName="setup-container" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.110424 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="671fe1be-f3dd-475e-8c48-a1d1db510aef" containerName="setup-container" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.110672 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="671fe1be-f3dd-475e-8c48-a1d1db510aef" containerName="rabbitmq" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.111863 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.119944 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.120108 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.120451 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.120604 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.120763 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.120917 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.122221 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-7f9jg" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.128990 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 09:27:08 crc 
kubenswrapper[4778]: I0318 09:27:08.197134 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="671fe1be-f3dd-475e-8c48-a1d1db510aef" path="/var/lib/kubelet/pods/671fe1be-f3dd-475e-8c48-a1d1db510aef/volumes" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.204980 4778 scope.go:117] "RemoveContainer" containerID="4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48" Mar 18 09:27:08 crc kubenswrapper[4778]: E0318 09:27:08.205754 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48\": container with ID starting with 4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48 not found: ID does not exist" containerID="4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.205785 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48"} err="failed to get container status \"4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48\": rpc error: code = NotFound desc = could not find container \"4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48\": container with ID starting with 4dbdd45745541c6f09d910104ce92d6e5576a73c255f6946dff2cb710ee0ef48 not found: ID does not exist" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.205805 4778 scope.go:117] "RemoveContainer" containerID="fe05be7f27af52f386341c802d98dc00dc2ec8f77ecc656619edc3c4420a9be7" Mar 18 09:27:08 crc kubenswrapper[4778]: E0318 09:27:08.206085 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe05be7f27af52f386341c802d98dc00dc2ec8f77ecc656619edc3c4420a9be7\": container with ID starting with 
fe05be7f27af52f386341c802d98dc00dc2ec8f77ecc656619edc3c4420a9be7 not found: ID does not exist" containerID="fe05be7f27af52f386341c802d98dc00dc2ec8f77ecc656619edc3c4420a9be7" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.206104 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe05be7f27af52f386341c802d98dc00dc2ec8f77ecc656619edc3c4420a9be7"} err="failed to get container status \"fe05be7f27af52f386341c802d98dc00dc2ec8f77ecc656619edc3c4420a9be7\": rpc error: code = NotFound desc = could not find container \"fe05be7f27af52f386341c802d98dc00dc2ec8f77ecc656619edc3c4420a9be7\": container with ID starting with fe05be7f27af52f386341c802d98dc00dc2ec8f77ecc656619edc3c4420a9be7 not found: ID does not exist" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.237499 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f9428cb3-4fdf-4b01-9368-28b413ecf82f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.237563 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f9428cb3-4fdf-4b01-9368-28b413ecf82f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.237594 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.237732 
4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f9428cb3-4fdf-4b01-9368-28b413ecf82f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.237964 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8f74\" (UniqueName: \"kubernetes.io/projected/f9428cb3-4fdf-4b01-9368-28b413ecf82f-kube-api-access-n8f74\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.238079 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f9428cb3-4fdf-4b01-9368-28b413ecf82f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.238112 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f9428cb3-4fdf-4b01-9368-28b413ecf82f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.238136 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f9428cb3-4fdf-4b01-9368-28b413ecf82f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.238213 
4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f9428cb3-4fdf-4b01-9368-28b413ecf82f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.238249 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f9428cb3-4fdf-4b01-9368-28b413ecf82f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.238274 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f9428cb3-4fdf-4b01-9368-28b413ecf82f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.340192 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f9428cb3-4fdf-4b01-9368-28b413ecf82f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.340434 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8f74\" (UniqueName: \"kubernetes.io/projected/f9428cb3-4fdf-4b01-9368-28b413ecf82f-kube-api-access-n8f74\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.340514 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f9428cb3-4fdf-4b01-9368-28b413ecf82f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.340544 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f9428cb3-4fdf-4b01-9368-28b413ecf82f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.340614 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f9428cb3-4fdf-4b01-9368-28b413ecf82f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.340682 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f9428cb3-4fdf-4b01-9368-28b413ecf82f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.340760 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f9428cb3-4fdf-4b01-9368-28b413ecf82f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.340792 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/f9428cb3-4fdf-4b01-9368-28b413ecf82f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.340886 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f9428cb3-4fdf-4b01-9368-28b413ecf82f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.340892 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f9428cb3-4fdf-4b01-9368-28b413ecf82f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.340955 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f9428cb3-4fdf-4b01-9368-28b413ecf82f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.341038 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.341527 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f9428cb3-4fdf-4b01-9368-28b413ecf82f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.341632 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.341766 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f9428cb3-4fdf-4b01-9368-28b413ecf82f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.342094 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f9428cb3-4fdf-4b01-9368-28b413ecf82f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.345161 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f9428cb3-4fdf-4b01-9368-28b413ecf82f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.346675 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f9428cb3-4fdf-4b01-9368-28b413ecf82f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: 
I0318 09:27:08.362360 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8f74\" (UniqueName: \"kubernetes.io/projected/f9428cb3-4fdf-4b01-9368-28b413ecf82f-kube-api-access-n8f74\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.392698 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f9428cb3-4fdf-4b01-9368-28b413ecf82f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.392724 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f9428cb3-4fdf-4b01-9368-28b413ecf82f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.410104 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f9428cb3-4fdf-4b01-9368-28b413ecf82f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.438974 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f9428cb3-4fdf-4b01-9368-28b413ecf82f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.479269 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.944661 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0191a745-1fe2-4a1c-b007-96525ad39787","Type":"ContainerStarted","Data":"0396d45944ee78a263eb77715d857aabaf7376175957a0a8bd08d08293d88300"} Mar 18 09:27:08 crc kubenswrapper[4778]: I0318 09:27:08.962481 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 09:27:08 crc kubenswrapper[4778]: W0318 09:27:08.972467 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9428cb3_4fdf_4b01_9368_28b413ecf82f.slice/crio-b3c9d4d88b999e63ea6962a213c1a22ebbce9fbc98a3be4a390c792ef9b67b0f WatchSource:0}: Error finding container b3c9d4d88b999e63ea6962a213c1a22ebbce9fbc98a3be4a390c792ef9b67b0f: Status 404 returned error can't find the container with id b3c9d4d88b999e63ea6962a213c1a22ebbce9fbc98a3be4a390c792ef9b67b0f Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.075759 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-vmtdf"] Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.077591 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.079360 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.100937 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-vmtdf"] Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.158104 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-dns-svc\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.158151 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-openstack-edpm-ipam\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.158173 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbl7b\" (UniqueName: \"kubernetes.io/projected/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-kube-api-access-dbl7b\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.158279 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-ovsdbserver-nb\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " 
pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.158316 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-ovsdbserver-sb\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.158462 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-config\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.260304 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-dns-svc\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.260369 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-openstack-edpm-ipam\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.260415 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbl7b\" (UniqueName: \"kubernetes.io/projected/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-kube-api-access-dbl7b\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " 
pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.260537 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-ovsdbserver-nb\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.261576 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-dns-svc\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.261618 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-ovsdbserver-nb\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.261722 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-ovsdbserver-sb\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.261752 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-openstack-edpm-ipam\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc 
kubenswrapper[4778]: I0318 09:27:09.262279 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-ovsdbserver-sb\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.262673 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-config\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.263549 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-config\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.289898 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbl7b\" (UniqueName: \"kubernetes.io/projected/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-kube-api-access-dbl7b\") pod \"dnsmasq-dns-578b8d767c-vmtdf\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.537958 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.957932 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f9428cb3-4fdf-4b01-9368-28b413ecf82f","Type":"ContainerStarted","Data":"b3c9d4d88b999e63ea6962a213c1a22ebbce9fbc98a3be4a390c792ef9b67b0f"} Mar 18 09:27:09 crc kubenswrapper[4778]: I0318 09:27:09.996466 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-vmtdf"] Mar 18 09:27:10 crc kubenswrapper[4778]: W0318 09:27:10.099881 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod974eb9b0_6a47_42f5_a01d_4abc2c7ea9db.slice/crio-73dd95d61a24a7d79ef30ee5ee880b92ec53e03c716ada9313ef6fb1715ea360 WatchSource:0}: Error finding container 73dd95d61a24a7d79ef30ee5ee880b92ec53e03c716ada9313ef6fb1715ea360: Status 404 returned error can't find the container with id 73dd95d61a24a7d79ef30ee5ee880b92ec53e03c716ada9313ef6fb1715ea360 Mar 18 09:27:10 crc kubenswrapper[4778]: I0318 09:27:10.974979 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f9428cb3-4fdf-4b01-9368-28b413ecf82f","Type":"ContainerStarted","Data":"8500c305081e16a099a488523dbfb64e89d7b5eed8698351067cc8182c26be12"} Mar 18 09:27:10 crc kubenswrapper[4778]: I0318 09:27:10.977170 4778 generic.go:334] "Generic (PLEG): container finished" podID="974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" containerID="bb6fddc74fb77836587c6c361f18d4c91b25b5b0b236c261f677eca1ad0a9af5" exitCode=0 Mar 18 09:27:10 crc kubenswrapper[4778]: I0318 09:27:10.977244 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" event={"ID":"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db","Type":"ContainerDied","Data":"bb6fddc74fb77836587c6c361f18d4c91b25b5b0b236c261f677eca1ad0a9af5"} Mar 18 09:27:10 crc 
kubenswrapper[4778]: I0318 09:27:10.977283 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" event={"ID":"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db","Type":"ContainerStarted","Data":"73dd95d61a24a7d79ef30ee5ee880b92ec53e03c716ada9313ef6fb1715ea360"} Mar 18 09:27:11 crc kubenswrapper[4778]: I0318 09:27:11.993480 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" event={"ID":"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db","Type":"ContainerStarted","Data":"0abbb14494f4856a4a551565083f0ffeae0b96885bc4209c98f8326f13f1e0ce"} Mar 18 09:27:12 crc kubenswrapper[4778]: I0318 09:27:12.046624 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" podStartSLOduration=3.046585733 podStartE2EDuration="3.046585733s" podCreationTimestamp="2026-03-18 09:27:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:27:12.019066753 +0000 UTC m=+1498.593811663" watchObservedRunningTime="2026-03-18 09:27:12.046585733 +0000 UTC m=+1498.621330613" Mar 18 09:27:13 crc kubenswrapper[4778]: I0318 09:27:13.003981 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.453650 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qb7km"] Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.456653 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.467503 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb7km"] Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.559115 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9acf58d-8699-44d0-8478-bec0c033dbd1-utilities\") pod \"redhat-marketplace-qb7km\" (UID: \"c9acf58d-8699-44d0-8478-bec0c033dbd1\") " pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.559648 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9acf58d-8699-44d0-8478-bec0c033dbd1-catalog-content\") pod \"redhat-marketplace-qb7km\" (UID: \"c9acf58d-8699-44d0-8478-bec0c033dbd1\") " pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.559697 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m898n\" (UniqueName: \"kubernetes.io/projected/c9acf58d-8699-44d0-8478-bec0c033dbd1-kube-api-access-m898n\") pod \"redhat-marketplace-qb7km\" (UID: \"c9acf58d-8699-44d0-8478-bec0c033dbd1\") " pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.661624 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9acf58d-8699-44d0-8478-bec0c033dbd1-utilities\") pod \"redhat-marketplace-qb7km\" (UID: \"c9acf58d-8699-44d0-8478-bec0c033dbd1\") " pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.662126 4778 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9acf58d-8699-44d0-8478-bec0c033dbd1-catalog-content\") pod \"redhat-marketplace-qb7km\" (UID: \"c9acf58d-8699-44d0-8478-bec0c033dbd1\") " pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.662271 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m898n\" (UniqueName: \"kubernetes.io/projected/c9acf58d-8699-44d0-8478-bec0c033dbd1-kube-api-access-m898n\") pod \"redhat-marketplace-qb7km\" (UID: \"c9acf58d-8699-44d0-8478-bec0c033dbd1\") " pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.662360 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9acf58d-8699-44d0-8478-bec0c033dbd1-utilities\") pod \"redhat-marketplace-qb7km\" (UID: \"c9acf58d-8699-44d0-8478-bec0c033dbd1\") " pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.663018 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9acf58d-8699-44d0-8478-bec0c033dbd1-catalog-content\") pod \"redhat-marketplace-qb7km\" (UID: \"c9acf58d-8699-44d0-8478-bec0c033dbd1\") " pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.686929 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m898n\" (UniqueName: \"kubernetes.io/projected/c9acf58d-8699-44d0-8478-bec0c033dbd1-kube-api-access-m898n\") pod \"redhat-marketplace-qb7km\" (UID: \"c9acf58d-8699-44d0-8478-bec0c033dbd1\") " pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:16 crc kubenswrapper[4778]: I0318 09:27:16.796655 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:17 crc kubenswrapper[4778]: I0318 09:27:17.289770 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb7km"] Mar 18 09:27:17 crc kubenswrapper[4778]: W0318 09:27:17.296138 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9acf58d_8699_44d0_8478_bec0c033dbd1.slice/crio-92d63e3e33616fc4f77cfb3a4148d6d69c68dfad28fb31cbacf24a70ff5a7286 WatchSource:0}: Error finding container 92d63e3e33616fc4f77cfb3a4148d6d69c68dfad28fb31cbacf24a70ff5a7286: Status 404 returned error can't find the container with id 92d63e3e33616fc4f77cfb3a4148d6d69c68dfad28fb31cbacf24a70ff5a7286 Mar 18 09:27:18 crc kubenswrapper[4778]: I0318 09:27:18.080352 4778 generic.go:334] "Generic (PLEG): container finished" podID="c9acf58d-8699-44d0-8478-bec0c033dbd1" containerID="335b59ee041cfb14be1175a1de597268e6dc30205d3c70b690bd7aed2b52ad55" exitCode=0 Mar 18 09:27:18 crc kubenswrapper[4778]: I0318 09:27:18.080475 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb7km" event={"ID":"c9acf58d-8699-44d0-8478-bec0c033dbd1","Type":"ContainerDied","Data":"335b59ee041cfb14be1175a1de597268e6dc30205d3c70b690bd7aed2b52ad55"} Mar 18 09:27:18 crc kubenswrapper[4778]: I0318 09:27:18.080965 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb7km" event={"ID":"c9acf58d-8699-44d0-8478-bec0c033dbd1","Type":"ContainerStarted","Data":"92d63e3e33616fc4f77cfb3a4148d6d69c68dfad28fb31cbacf24a70ff5a7286"} Mar 18 09:27:18 crc kubenswrapper[4778]: I0318 09:27:18.083897 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.090225 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-qb7km" event={"ID":"c9acf58d-8699-44d0-8478-bec0c033dbd1","Type":"ContainerStarted","Data":"34c889e042c920633eb7f97988ef43ab5ee338a70cb4c91a758a40598693edc1"} Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.540380 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.626885 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-hzxbf"] Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.627182 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" podUID="31b962b8-c7e1-495c-a52d-f4fb63e884ca" containerName="dnsmasq-dns" containerID="cri-o://76b7c61aa2a7be4fd8a7e4d8e9e84698c3d85d0b2b4f2c826eef7fac6ce91214" gracePeriod=10 Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.774556 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-6qq8b"] Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.776781 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.783042 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-6qq8b"] Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.836569 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2gn6\" (UniqueName: \"kubernetes.io/projected/5778826c-0b71-4dad-af9c-c7ec7f04aa36-kube-api-access-q2gn6\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.836663 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-dns-svc\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.836727 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-ovsdbserver-nb\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.836772 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-openstack-edpm-ipam\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.836786 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-ovsdbserver-sb\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.836803 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-config\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.939668 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-openstack-edpm-ipam\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.939723 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-ovsdbserver-sb\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.939753 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-config\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.939915 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-q2gn6\" (UniqueName: \"kubernetes.io/projected/5778826c-0b71-4dad-af9c-c7ec7f04aa36-kube-api-access-q2gn6\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.939999 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-dns-svc\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.940079 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-ovsdbserver-nb\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.940781 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-config\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.940852 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-ovsdbserver-nb\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.941041 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-openstack-edpm-ipam\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.941497 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-dns-svc\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.942039 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-ovsdbserver-sb\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:19 crc kubenswrapper[4778]: I0318 09:27:19.968839 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2gn6\" (UniqueName: \"kubernetes.io/projected/5778826c-0b71-4dad-af9c-c7ec7f04aa36-kube-api-access-q2gn6\") pod \"dnsmasq-dns-fbc59fbb7-6qq8b\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.116611 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.168366 4778 generic.go:334] "Generic (PLEG): container finished" podID="c9acf58d-8699-44d0-8478-bec0c033dbd1" containerID="34c889e042c920633eb7f97988ef43ab5ee338a70cb4c91a758a40598693edc1" exitCode=0 Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.168456 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb7km" event={"ID":"c9acf58d-8699-44d0-8478-bec0c033dbd1","Type":"ContainerDied","Data":"34c889e042c920633eb7f97988ef43ab5ee338a70cb4c91a758a40598693edc1"} Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.188588 4778 generic.go:334] "Generic (PLEG): container finished" podID="31b962b8-c7e1-495c-a52d-f4fb63e884ca" containerID="76b7c61aa2a7be4fd8a7e4d8e9e84698c3d85d0b2b4f2c826eef7fac6ce91214" exitCode=0 Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.188639 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" event={"ID":"31b962b8-c7e1-495c-a52d-f4fb63e884ca","Type":"ContainerDied","Data":"76b7c61aa2a7be4fd8a7e4d8e9e84698c3d85d0b2b4f2c826eef7fac6ce91214"} Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.290982 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.354610 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-dns-svc\") pod \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.355171 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts5rq\" (UniqueName: \"kubernetes.io/projected/31b962b8-c7e1-495c-a52d-f4fb63e884ca-kube-api-access-ts5rq\") pod \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.355323 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-ovsdbserver-nb\") pod \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.355467 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-ovsdbserver-sb\") pod \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.355546 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-config\") pod \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\" (UID: \"31b962b8-c7e1-495c-a52d-f4fb63e884ca\") " Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.363142 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/31b962b8-c7e1-495c-a52d-f4fb63e884ca-kube-api-access-ts5rq" (OuterVolumeSpecName: "kube-api-access-ts5rq") pod "31b962b8-c7e1-495c-a52d-f4fb63e884ca" (UID: "31b962b8-c7e1-495c-a52d-f4fb63e884ca"). InnerVolumeSpecName "kube-api-access-ts5rq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.422905 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "31b962b8-c7e1-495c-a52d-f4fb63e884ca" (UID: "31b962b8-c7e1-495c-a52d-f4fb63e884ca"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.425455 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "31b962b8-c7e1-495c-a52d-f4fb63e884ca" (UID: "31b962b8-c7e1-495c-a52d-f4fb63e884ca"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.437702 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-config" (OuterVolumeSpecName: "config") pod "31b962b8-c7e1-495c-a52d-f4fb63e884ca" (UID: "31b962b8-c7e1-495c-a52d-f4fb63e884ca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.443821 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "31b962b8-c7e1-495c-a52d-f4fb63e884ca" (UID: "31b962b8-c7e1-495c-a52d-f4fb63e884ca"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.458030 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.458062 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.458074 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.458085 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31b962b8-c7e1-495c-a52d-f4fb63e884ca-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.458096 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ts5rq\" (UniqueName: \"kubernetes.io/projected/31b962b8-c7e1-495c-a52d-f4fb63e884ca-kube-api-access-ts5rq\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:20 crc kubenswrapper[4778]: I0318 09:27:20.743543 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-6qq8b"] Mar 18 09:27:21 crc kubenswrapper[4778]: I0318 09:27:21.202763 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" event={"ID":"31b962b8-c7e1-495c-a52d-f4fb63e884ca","Type":"ContainerDied","Data":"f1b064a501bbcb63fd4dc86edbad3f121ced055deab0d7f76a0a08acbb649b7a"} Mar 18 09:27:21 crc kubenswrapper[4778]: I0318 09:27:21.202811 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-hzxbf" Mar 18 09:27:21 crc kubenswrapper[4778]: I0318 09:27:21.203327 4778 scope.go:117] "RemoveContainer" containerID="76b7c61aa2a7be4fd8a7e4d8e9e84698c3d85d0b2b4f2c826eef7fac6ce91214" Mar 18 09:27:21 crc kubenswrapper[4778]: I0318 09:27:21.205060 4778 generic.go:334] "Generic (PLEG): container finished" podID="5778826c-0b71-4dad-af9c-c7ec7f04aa36" containerID="959408f0646af1976d94cc538a0b42d120c9b802bebf63b681de404e7a6632a0" exitCode=0 Mar 18 09:27:21 crc kubenswrapper[4778]: I0318 09:27:21.205123 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" event={"ID":"5778826c-0b71-4dad-af9c-c7ec7f04aa36","Type":"ContainerDied","Data":"959408f0646af1976d94cc538a0b42d120c9b802bebf63b681de404e7a6632a0"} Mar 18 09:27:21 crc kubenswrapper[4778]: I0318 09:27:21.206297 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" event={"ID":"5778826c-0b71-4dad-af9c-c7ec7f04aa36","Type":"ContainerStarted","Data":"c164a0a62f55fc2358e88475eedc2afcbd5a8934bf23dc30fb39c81fc6c685f2"} Mar 18 09:27:21 crc kubenswrapper[4778]: I0318 09:27:21.209480 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb7km" event={"ID":"c9acf58d-8699-44d0-8478-bec0c033dbd1","Type":"ContainerStarted","Data":"3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990"} Mar 18 09:27:21 crc kubenswrapper[4778]: I0318 09:27:21.232951 4778 scope.go:117] "RemoveContainer" containerID="bda48f5cd3722d61bee57d8aacf12b0b775a391dd2ad3e1c8999cc5bb624e15c" Mar 18 09:27:21 crc kubenswrapper[4778]: I0318 09:27:21.246042 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qb7km" podStartSLOduration=2.692884403 podStartE2EDuration="5.246025759s" podCreationTimestamp="2026-03-18 09:27:16 +0000 UTC" firstStartedPulling="2026-03-18 09:27:18.083537555 +0000 
UTC m=+1504.658282395" lastFinishedPulling="2026-03-18 09:27:20.636678911 +0000 UTC m=+1507.211423751" observedRunningTime="2026-03-18 09:27:21.236228172 +0000 UTC m=+1507.810973012" watchObservedRunningTime="2026-03-18 09:27:21.246025759 +0000 UTC m=+1507.820770599" Mar 18 09:27:21 crc kubenswrapper[4778]: I0318 09:27:21.443561 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-hzxbf"] Mar 18 09:27:21 crc kubenswrapper[4778]: I0318 09:27:21.455501 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-hzxbf"] Mar 18 09:27:22 crc kubenswrapper[4778]: I0318 09:27:22.200056 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31b962b8-c7e1-495c-a52d-f4fb63e884ca" path="/var/lib/kubelet/pods/31b962b8-c7e1-495c-a52d-f4fb63e884ca/volumes" Mar 18 09:27:22 crc kubenswrapper[4778]: I0318 09:27:22.223534 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" event={"ID":"5778826c-0b71-4dad-af9c-c7ec7f04aa36","Type":"ContainerStarted","Data":"fd53655c1355ee9e239be226351d5c5537c6593f18ffcbefe53a7ddaf1ea2816"} Mar 18 09:27:22 crc kubenswrapper[4778]: I0318 09:27:22.225337 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:22 crc kubenswrapper[4778]: I0318 09:27:22.280990 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" podStartSLOduration=3.28096625 podStartE2EDuration="3.28096625s" podCreationTimestamp="2026-03-18 09:27:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:27:22.24573298 +0000 UTC m=+1508.820477820" watchObservedRunningTime="2026-03-18 09:27:22.28096625 +0000 UTC m=+1508.855711080" Mar 18 09:27:26 crc kubenswrapper[4778]: I0318 09:27:26.797149 4778 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:26 crc kubenswrapper[4778]: I0318 09:27:26.797737 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:26 crc kubenswrapper[4778]: I0318 09:27:26.857477 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:27 crc kubenswrapper[4778]: I0318 09:27:27.362112 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:27 crc kubenswrapper[4778]: I0318 09:27:27.426883 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb7km"] Mar 18 09:27:29 crc kubenswrapper[4778]: I0318 09:27:29.299796 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qb7km" podUID="c9acf58d-8699-44d0-8478-bec0c033dbd1" containerName="registry-server" containerID="cri-o://3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990" gracePeriod=2 Mar 18 09:27:29 crc kubenswrapper[4778]: I0318 09:27:29.841391 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:29 crc kubenswrapper[4778]: I0318 09:27:29.955267 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9acf58d-8699-44d0-8478-bec0c033dbd1-utilities\") pod \"c9acf58d-8699-44d0-8478-bec0c033dbd1\" (UID: \"c9acf58d-8699-44d0-8478-bec0c033dbd1\") " Mar 18 09:27:29 crc kubenswrapper[4778]: I0318 09:27:29.955386 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m898n\" (UniqueName: \"kubernetes.io/projected/c9acf58d-8699-44d0-8478-bec0c033dbd1-kube-api-access-m898n\") pod \"c9acf58d-8699-44d0-8478-bec0c033dbd1\" (UID: \"c9acf58d-8699-44d0-8478-bec0c033dbd1\") " Mar 18 09:27:29 crc kubenswrapper[4778]: I0318 09:27:29.955476 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9acf58d-8699-44d0-8478-bec0c033dbd1-catalog-content\") pod \"c9acf58d-8699-44d0-8478-bec0c033dbd1\" (UID: \"c9acf58d-8699-44d0-8478-bec0c033dbd1\") " Mar 18 09:27:29 crc kubenswrapper[4778]: I0318 09:27:29.956259 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9acf58d-8699-44d0-8478-bec0c033dbd1-utilities" (OuterVolumeSpecName: "utilities") pod "c9acf58d-8699-44d0-8478-bec0c033dbd1" (UID: "c9acf58d-8699-44d0-8478-bec0c033dbd1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:27:29 crc kubenswrapper[4778]: I0318 09:27:29.959811 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9acf58d-8699-44d0-8478-bec0c033dbd1-kube-api-access-m898n" (OuterVolumeSpecName: "kube-api-access-m898n") pod "c9acf58d-8699-44d0-8478-bec0c033dbd1" (UID: "c9acf58d-8699-44d0-8478-bec0c033dbd1"). InnerVolumeSpecName "kube-api-access-m898n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:27:29 crc kubenswrapper[4778]: I0318 09:27:29.992185 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9acf58d-8699-44d0-8478-bec0c033dbd1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9acf58d-8699-44d0-8478-bec0c033dbd1" (UID: "c9acf58d-8699-44d0-8478-bec0c033dbd1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.057754 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9acf58d-8699-44d0-8478-bec0c033dbd1-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.057807 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m898n\" (UniqueName: \"kubernetes.io/projected/c9acf58d-8699-44d0-8478-bec0c033dbd1-kube-api-access-m898n\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.057829 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9acf58d-8699-44d0-8478-bec0c033dbd1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.119942 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.185003 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-vmtdf"] Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.185282 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" podUID="974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" containerName="dnsmasq-dns" containerID="cri-o://0abbb14494f4856a4a551565083f0ffeae0b96885bc4209c98f8326f13f1e0ce" 
gracePeriod=10 Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.330795 4778 generic.go:334] "Generic (PLEG): container finished" podID="c9acf58d-8699-44d0-8478-bec0c033dbd1" containerID="3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990" exitCode=0 Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.330922 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb7km" event={"ID":"c9acf58d-8699-44d0-8478-bec0c033dbd1","Type":"ContainerDied","Data":"3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990"} Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.330953 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb7km" event={"ID":"c9acf58d-8699-44d0-8478-bec0c033dbd1","Type":"ContainerDied","Data":"92d63e3e33616fc4f77cfb3a4148d6d69c68dfad28fb31cbacf24a70ff5a7286"} Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.330971 4778 scope.go:117] "RemoveContainer" containerID="3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.331136 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qb7km" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.336895 4778 generic.go:334] "Generic (PLEG): container finished" podID="974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" containerID="0abbb14494f4856a4a551565083f0ffeae0b96885bc4209c98f8326f13f1e0ce" exitCode=0 Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.336957 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" event={"ID":"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db","Type":"ContainerDied","Data":"0abbb14494f4856a4a551565083f0ffeae0b96885bc4209c98f8326f13f1e0ce"} Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.375492 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb7km"] Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.378494 4778 scope.go:117] "RemoveContainer" containerID="34c889e042c920633eb7f97988ef43ab5ee338a70cb4c91a758a40598693edc1" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.397183 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb7km"] Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.422626 4778 scope.go:117] "RemoveContainer" containerID="335b59ee041cfb14be1175a1de597268e6dc30205d3c70b690bd7aed2b52ad55" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.476478 4778 scope.go:117] "RemoveContainer" containerID="3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990" Mar 18 09:27:30 crc kubenswrapper[4778]: E0318 09:27:30.476976 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990\": container with ID starting with 3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990 not found: ID does not exist" containerID="3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990" Mar 18 
09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.477005 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990"} err="failed to get container status \"3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990\": rpc error: code = NotFound desc = could not find container \"3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990\": container with ID starting with 3c5a021a8aec905bc3d2c55fdfa238766e897c773d2c6b230128a84364ca4990 not found: ID does not exist" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.477025 4778 scope.go:117] "RemoveContainer" containerID="34c889e042c920633eb7f97988ef43ab5ee338a70cb4c91a758a40598693edc1" Mar 18 09:27:30 crc kubenswrapper[4778]: E0318 09:27:30.477272 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34c889e042c920633eb7f97988ef43ab5ee338a70cb4c91a758a40598693edc1\": container with ID starting with 34c889e042c920633eb7f97988ef43ab5ee338a70cb4c91a758a40598693edc1 not found: ID does not exist" containerID="34c889e042c920633eb7f97988ef43ab5ee338a70cb4c91a758a40598693edc1" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.477289 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34c889e042c920633eb7f97988ef43ab5ee338a70cb4c91a758a40598693edc1"} err="failed to get container status \"34c889e042c920633eb7f97988ef43ab5ee338a70cb4c91a758a40598693edc1\": rpc error: code = NotFound desc = could not find container \"34c889e042c920633eb7f97988ef43ab5ee338a70cb4c91a758a40598693edc1\": container with ID starting with 34c889e042c920633eb7f97988ef43ab5ee338a70cb4c91a758a40598693edc1 not found: ID does not exist" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.477301 4778 scope.go:117] "RemoveContainer" 
containerID="335b59ee041cfb14be1175a1de597268e6dc30205d3c70b690bd7aed2b52ad55" Mar 18 09:27:30 crc kubenswrapper[4778]: E0318 09:27:30.477542 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"335b59ee041cfb14be1175a1de597268e6dc30205d3c70b690bd7aed2b52ad55\": container with ID starting with 335b59ee041cfb14be1175a1de597268e6dc30205d3c70b690bd7aed2b52ad55 not found: ID does not exist" containerID="335b59ee041cfb14be1175a1de597268e6dc30205d3c70b690bd7aed2b52ad55" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.477563 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"335b59ee041cfb14be1175a1de597268e6dc30205d3c70b690bd7aed2b52ad55"} err="failed to get container status \"335b59ee041cfb14be1175a1de597268e6dc30205d3c70b690bd7aed2b52ad55\": rpc error: code = NotFound desc = could not find container \"335b59ee041cfb14be1175a1de597268e6dc30205d3c70b690bd7aed2b52ad55\": container with ID starting with 335b59ee041cfb14be1175a1de597268e6dc30205d3c70b690bd7aed2b52ad55 not found: ID does not exist" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.730563 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.773841 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-dns-svc\") pod \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.773891 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbl7b\" (UniqueName: \"kubernetes.io/projected/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-kube-api-access-dbl7b\") pod \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.774044 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-ovsdbserver-nb\") pod \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.774085 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-ovsdbserver-sb\") pod \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.774125 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-config\") pod \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.774144 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-openstack-edpm-ipam\") pod \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\" (UID: \"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db\") " Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.791469 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-kube-api-access-dbl7b" (OuterVolumeSpecName: "kube-api-access-dbl7b") pod "974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" (UID: "974eb9b0-6a47-42f5-a01d-4abc2c7ea9db"). InnerVolumeSpecName "kube-api-access-dbl7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.826685 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" (UID: "974eb9b0-6a47-42f5-a01d-4abc2c7ea9db"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.835289 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" (UID: "974eb9b0-6a47-42f5-a01d-4abc2c7ea9db"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.836007 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" (UID: "974eb9b0-6a47-42f5-a01d-4abc2c7ea9db"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.853758 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" (UID: "974eb9b0-6a47-42f5-a01d-4abc2c7ea9db"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.860322 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-config" (OuterVolumeSpecName: "config") pod "974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" (UID: "974eb9b0-6a47-42f5-a01d-4abc2c7ea9db"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.876020 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.876057 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.876071 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.876082 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:30 
crc kubenswrapper[4778]: I0318 09:27:30.876093 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:30 crc kubenswrapper[4778]: I0318 09:27:30.876103 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbl7b\" (UniqueName: \"kubernetes.io/projected/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db-kube-api-access-dbl7b\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:31 crc kubenswrapper[4778]: I0318 09:27:31.348284 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" event={"ID":"974eb9b0-6a47-42f5-a01d-4abc2c7ea9db","Type":"ContainerDied","Data":"73dd95d61a24a7d79ef30ee5ee880b92ec53e03c716ada9313ef6fb1715ea360"} Mar 18 09:27:31 crc kubenswrapper[4778]: I0318 09:27:31.348350 4778 scope.go:117] "RemoveContainer" containerID="0abbb14494f4856a4a551565083f0ffeae0b96885bc4209c98f8326f13f1e0ce" Mar 18 09:27:31 crc kubenswrapper[4778]: I0318 09:27:31.348386 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-vmtdf" Mar 18 09:27:31 crc kubenswrapper[4778]: I0318 09:27:31.369733 4778 scope.go:117] "RemoveContainer" containerID="bb6fddc74fb77836587c6c361f18d4c91b25b5b0b236c261f677eca1ad0a9af5" Mar 18 09:27:31 crc kubenswrapper[4778]: I0318 09:27:31.395178 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-vmtdf"] Mar 18 09:27:31 crc kubenswrapper[4778]: I0318 09:27:31.402777 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-vmtdf"] Mar 18 09:27:32 crc kubenswrapper[4778]: I0318 09:27:32.196905 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" path="/var/lib/kubelet/pods/974eb9b0-6a47-42f5-a01d-4abc2c7ea9db/volumes" Mar 18 09:27:32 crc kubenswrapper[4778]: I0318 09:27:32.197776 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9acf58d-8699-44d0-8478-bec0c033dbd1" path="/var/lib/kubelet/pods/c9acf58d-8699-44d0-8478-bec0c033dbd1/volumes" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.197694 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p"] Mar 18 09:27:40 crc kubenswrapper[4778]: E0318 09:27:40.198672 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9acf58d-8699-44d0-8478-bec0c033dbd1" containerName="extract-utilities" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.198686 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9acf58d-8699-44d0-8478-bec0c033dbd1" containerName="extract-utilities" Mar 18 09:27:40 crc kubenswrapper[4778]: E0318 09:27:40.198699 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9acf58d-8699-44d0-8478-bec0c033dbd1" containerName="extract-content" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.198706 4778 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c9acf58d-8699-44d0-8478-bec0c033dbd1" containerName="extract-content" Mar 18 09:27:40 crc kubenswrapper[4778]: E0318 09:27:40.198722 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31b962b8-c7e1-495c-a52d-f4fb63e884ca" containerName="init" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.198728 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="31b962b8-c7e1-495c-a52d-f4fb63e884ca" containerName="init" Mar 18 09:27:40 crc kubenswrapper[4778]: E0318 09:27:40.198747 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9acf58d-8699-44d0-8478-bec0c033dbd1" containerName="registry-server" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.198752 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9acf58d-8699-44d0-8478-bec0c033dbd1" containerName="registry-server" Mar 18 09:27:40 crc kubenswrapper[4778]: E0318 09:27:40.198766 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" containerName="init" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.198772 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" containerName="init" Mar 18 09:27:40 crc kubenswrapper[4778]: E0318 09:27:40.198782 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" containerName="dnsmasq-dns" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.198788 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" containerName="dnsmasq-dns" Mar 18 09:27:40 crc kubenswrapper[4778]: E0318 09:27:40.198803 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31b962b8-c7e1-495c-a52d-f4fb63e884ca" containerName="dnsmasq-dns" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.198809 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="31b962b8-c7e1-495c-a52d-f4fb63e884ca" 
containerName="dnsmasq-dns" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.198977 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9acf58d-8699-44d0-8478-bec0c033dbd1" containerName="registry-server" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.199007 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="974eb9b0-6a47-42f5-a01d-4abc2c7ea9db" containerName="dnsmasq-dns" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.199018 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="31b962b8-c7e1-495c-a52d-f4fb63e884ca" containerName="dnsmasq-dns" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.199681 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.201766 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.203054 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.203455 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.204442 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.210519 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p"] Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.304815 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqc9n\" (UniqueName: 
\"kubernetes.io/projected/154a89df-1c2e-4f86-bbf3-827d6443c04a-kube-api-access-zqc9n\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.304889 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.305303 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.305581 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.407654 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-ssh-key-openstack-edpm-ipam\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.407744 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.407844 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqc9n\" (UniqueName: \"kubernetes.io/projected/154a89df-1c2e-4f86-bbf3-827d6443c04a-kube-api-access-zqc9n\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.407889 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.416867 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc 
kubenswrapper[4778]: I0318 09:27:40.416886 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.417942 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.432607 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqc9n\" (UniqueName: \"kubernetes.io/projected/154a89df-1c2e-4f86-bbf3-827d6443c04a-kube-api-access-zqc9n\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:40 crc kubenswrapper[4778]: I0318 09:27:40.538116 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:27:41 crc kubenswrapper[4778]: I0318 09:27:41.136745 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p"] Mar 18 09:27:41 crc kubenswrapper[4778]: I0318 09:27:41.465547 4778 generic.go:334] "Generic (PLEG): container finished" podID="0191a745-1fe2-4a1c-b007-96525ad39787" containerID="0396d45944ee78a263eb77715d857aabaf7376175957a0a8bd08d08293d88300" exitCode=0 Mar 18 09:27:41 crc kubenswrapper[4778]: I0318 09:27:41.465624 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0191a745-1fe2-4a1c-b007-96525ad39787","Type":"ContainerDied","Data":"0396d45944ee78a263eb77715d857aabaf7376175957a0a8bd08d08293d88300"} Mar 18 09:27:41 crc kubenswrapper[4778]: I0318 09:27:41.467737 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" event={"ID":"154a89df-1c2e-4f86-bbf3-827d6443c04a","Type":"ContainerStarted","Data":"a6279dd667e196403d3c5c22ceabeb3e0818bd4665ef38b4085c12358feff855"} Mar 18 09:27:43 crc kubenswrapper[4778]: I0318 09:27:42.479366 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0191a745-1fe2-4a1c-b007-96525ad39787","Type":"ContainerStarted","Data":"f3b54915bc8b60b3c34537d45c4915fceb3c19874748837f3a8ee823c2999484"} Mar 18 09:27:43 crc kubenswrapper[4778]: I0318 09:27:42.481109 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 09:27:43 crc kubenswrapper[4778]: I0318 09:27:42.526479 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.526457451 podStartE2EDuration="37.526457451s" podCreationTimestamp="2026-03-18 09:27:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:27:42.51690295 +0000 UTC m=+1529.091647810" watchObservedRunningTime="2026-03-18 09:27:42.526457451 +0000 UTC m=+1529.101202291" Mar 18 09:27:44 crc kubenswrapper[4778]: I0318 09:27:44.501022 4778 generic.go:334] "Generic (PLEG): container finished" podID="f9428cb3-4fdf-4b01-9368-28b413ecf82f" containerID="8500c305081e16a099a488523dbfb64e89d7b5eed8698351067cc8182c26be12" exitCode=0 Mar 18 09:27:44 crc kubenswrapper[4778]: I0318 09:27:44.501090 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f9428cb3-4fdf-4b01-9368-28b413ecf82f","Type":"ContainerDied","Data":"8500c305081e16a099a488523dbfb64e89d7b5eed8698351067cc8182c26be12"} Mar 18 09:27:51 crc kubenswrapper[4778]: I0318 09:27:51.591128 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f9428cb3-4fdf-4b01-9368-28b413ecf82f","Type":"ContainerStarted","Data":"326d191845db7deb2003a53c8c81cda883199fe6e269581c5a42595124899cb3"} Mar 18 09:27:51 crc kubenswrapper[4778]: I0318 09:27:51.594218 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:27:51 crc kubenswrapper[4778]: I0318 09:27:51.595806 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" event={"ID":"154a89df-1c2e-4f86-bbf3-827d6443c04a","Type":"ContainerStarted","Data":"a0e4df9a818fec5131c555c760ba72656483292d6411393207bbb36928547cc0"} Mar 18 09:27:51 crc kubenswrapper[4778]: I0318 09:27:51.630346 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=43.630330972 podStartE2EDuration="43.630330972s" podCreationTimestamp="2026-03-18 09:27:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-18 09:27:51.628709569 +0000 UTC m=+1538.203454419" watchObservedRunningTime="2026-03-18 09:27:51.630330972 +0000 UTC m=+1538.205075812" Mar 18 09:27:51 crc kubenswrapper[4778]: I0318 09:27:51.670832 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" podStartSLOduration=1.858760423 podStartE2EDuration="11.670807335s" podCreationTimestamp="2026-03-18 09:27:40 +0000 UTC" firstStartedPulling="2026-03-18 09:27:41.160726069 +0000 UTC m=+1527.735470919" lastFinishedPulling="2026-03-18 09:27:50.972772991 +0000 UTC m=+1537.547517831" observedRunningTime="2026-03-18 09:27:51.662719295 +0000 UTC m=+1538.237464145" watchObservedRunningTime="2026-03-18 09:27:51.670807335 +0000 UTC m=+1538.245552175" Mar 18 09:27:51 crc kubenswrapper[4778]: I0318 09:27:51.875230 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6rqfc"] Mar 18 09:27:51 crc kubenswrapper[4778]: I0318 09:27:51.877586 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:27:51 crc kubenswrapper[4778]: I0318 09:27:51.893977 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6rqfc"] Mar 18 09:27:52 crc kubenswrapper[4778]: I0318 09:27:52.044805 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c267m\" (UniqueName: \"kubernetes.io/projected/60a043ee-4047-4c6e-9c4d-eaa8272648f6-kube-api-access-c267m\") pod \"community-operators-6rqfc\" (UID: \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\") " pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:27:52 crc kubenswrapper[4778]: I0318 09:27:52.044861 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a043ee-4047-4c6e-9c4d-eaa8272648f6-utilities\") pod \"community-operators-6rqfc\" (UID: \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\") " pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:27:52 crc kubenswrapper[4778]: I0318 09:27:52.044888 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a043ee-4047-4c6e-9c4d-eaa8272648f6-catalog-content\") pod \"community-operators-6rqfc\" (UID: \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\") " pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:27:52 crc kubenswrapper[4778]: I0318 09:27:52.146277 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c267m\" (UniqueName: \"kubernetes.io/projected/60a043ee-4047-4c6e-9c4d-eaa8272648f6-kube-api-access-c267m\") pod \"community-operators-6rqfc\" (UID: \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\") " pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:27:52 crc kubenswrapper[4778]: I0318 09:27:52.146342 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a043ee-4047-4c6e-9c4d-eaa8272648f6-utilities\") pod \"community-operators-6rqfc\" (UID: \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\") " pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:27:52 crc kubenswrapper[4778]: I0318 09:27:52.146375 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a043ee-4047-4c6e-9c4d-eaa8272648f6-catalog-content\") pod \"community-operators-6rqfc\" (UID: \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\") " pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:27:52 crc kubenswrapper[4778]: I0318 09:27:52.147001 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a043ee-4047-4c6e-9c4d-eaa8272648f6-catalog-content\") pod \"community-operators-6rqfc\" (UID: \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\") " pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:27:52 crc kubenswrapper[4778]: I0318 09:27:52.147551 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a043ee-4047-4c6e-9c4d-eaa8272648f6-utilities\") pod \"community-operators-6rqfc\" (UID: \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\") " pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:27:52 crc kubenswrapper[4778]: I0318 09:27:52.167075 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c267m\" (UniqueName: \"kubernetes.io/projected/60a043ee-4047-4c6e-9c4d-eaa8272648f6-kube-api-access-c267m\") pod \"community-operators-6rqfc\" (UID: \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\") " pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:27:52 crc kubenswrapper[4778]: I0318 09:27:52.240331 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:27:52 crc kubenswrapper[4778]: I0318 09:27:52.772220 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6rqfc"] Mar 18 09:27:53 crc kubenswrapper[4778]: I0318 09:27:53.615072 4778 generic.go:334] "Generic (PLEG): container finished" podID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" containerID="368780356a6ea029a190f79b18f4dc98d05a779ebb542e11a3d9ae9ec1164a62" exitCode=0 Mar 18 09:27:53 crc kubenswrapper[4778]: I0318 09:27:53.615422 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6rqfc" event={"ID":"60a043ee-4047-4c6e-9c4d-eaa8272648f6","Type":"ContainerDied","Data":"368780356a6ea029a190f79b18f4dc98d05a779ebb542e11a3d9ae9ec1164a62"} Mar 18 09:27:53 crc kubenswrapper[4778]: I0318 09:27:53.615461 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6rqfc" event={"ID":"60a043ee-4047-4c6e-9c4d-eaa8272648f6","Type":"ContainerStarted","Data":"90e09b0c12b6393363e62abca4fa79678465054f2984b57ee714525b9a8b228d"} Mar 18 09:27:55 crc kubenswrapper[4778]: I0318 09:27:55.650260 4778 generic.go:334] "Generic (PLEG): container finished" podID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" containerID="04472b35671fac6b11bde92fbb0f35c02a6fd97b862b7456f7bbd8d152a69ad6" exitCode=0 Mar 18 09:27:55 crc kubenswrapper[4778]: I0318 09:27:55.650571 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6rqfc" event={"ID":"60a043ee-4047-4c6e-9c4d-eaa8272648f6","Type":"ContainerDied","Data":"04472b35671fac6b11bde92fbb0f35c02a6fd97b862b7456f7bbd8d152a69ad6"} Mar 18 09:27:56 crc kubenswrapper[4778]: I0318 09:27:56.363439 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 09:27:56 crc kubenswrapper[4778]: I0318 09:27:56.660752 4778 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-6rqfc" event={"ID":"60a043ee-4047-4c6e-9c4d-eaa8272648f6","Type":"ContainerStarted","Data":"10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e"} Mar 18 09:27:56 crc kubenswrapper[4778]: I0318 09:27:56.708524 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6rqfc" podStartSLOduration=3.212776716 podStartE2EDuration="5.708504548s" podCreationTimestamp="2026-03-18 09:27:51 +0000 UTC" firstStartedPulling="2026-03-18 09:27:53.617263365 +0000 UTC m=+1540.192008215" lastFinishedPulling="2026-03-18 09:27:56.112991207 +0000 UTC m=+1542.687736047" observedRunningTime="2026-03-18 09:27:56.706075402 +0000 UTC m=+1543.280820242" watchObservedRunningTime="2026-03-18 09:27:56.708504548 +0000 UTC m=+1543.283249388" Mar 18 09:28:00 crc kubenswrapper[4778]: I0318 09:28:00.157880 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563768-4z27c"] Mar 18 09:28:00 crc kubenswrapper[4778]: I0318 09:28:00.162896 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563768-4z27c" Mar 18 09:28:00 crc kubenswrapper[4778]: I0318 09:28:00.164775 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:28:00 crc kubenswrapper[4778]: I0318 09:28:00.164999 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:28:00 crc kubenswrapper[4778]: I0318 09:28:00.166991 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:28:00 crc kubenswrapper[4778]: I0318 09:28:00.170035 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563768-4z27c"] Mar 18 09:28:00 crc kubenswrapper[4778]: I0318 09:28:00.296100 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht6bc\" (UniqueName: \"kubernetes.io/projected/ec16e337-91fc-40c5-b3d4-87b5243e5a73-kube-api-access-ht6bc\") pod \"auto-csr-approver-29563768-4z27c\" (UID: \"ec16e337-91fc-40c5-b3d4-87b5243e5a73\") " pod="openshift-infra/auto-csr-approver-29563768-4z27c" Mar 18 09:28:00 crc kubenswrapper[4778]: I0318 09:28:00.398824 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht6bc\" (UniqueName: \"kubernetes.io/projected/ec16e337-91fc-40c5-b3d4-87b5243e5a73-kube-api-access-ht6bc\") pod \"auto-csr-approver-29563768-4z27c\" (UID: \"ec16e337-91fc-40c5-b3d4-87b5243e5a73\") " pod="openshift-infra/auto-csr-approver-29563768-4z27c" Mar 18 09:28:00 crc kubenswrapper[4778]: I0318 09:28:00.420856 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht6bc\" (UniqueName: \"kubernetes.io/projected/ec16e337-91fc-40c5-b3d4-87b5243e5a73-kube-api-access-ht6bc\") pod \"auto-csr-approver-29563768-4z27c\" (UID: \"ec16e337-91fc-40c5-b3d4-87b5243e5a73\") " 
pod="openshift-infra/auto-csr-approver-29563768-4z27c" Mar 18 09:28:00 crc kubenswrapper[4778]: I0318 09:28:00.484086 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563768-4z27c" Mar 18 09:28:00 crc kubenswrapper[4778]: W0318 09:28:00.975250 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec16e337_91fc_40c5_b3d4_87b5243e5a73.slice/crio-11eb68ad70b16e075622c19b845c0f9b2eb54e4b6f5459e3950e9528763c3016 WatchSource:0}: Error finding container 11eb68ad70b16e075622c19b845c0f9b2eb54e4b6f5459e3950e9528763c3016: Status 404 returned error can't find the container with id 11eb68ad70b16e075622c19b845c0f9b2eb54e4b6f5459e3950e9528763c3016 Mar 18 09:28:00 crc kubenswrapper[4778]: I0318 09:28:00.982014 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563768-4z27c"] Mar 18 09:28:01 crc kubenswrapper[4778]: I0318 09:28:01.711495 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563768-4z27c" event={"ID":"ec16e337-91fc-40c5-b3d4-87b5243e5a73","Type":"ContainerStarted","Data":"11eb68ad70b16e075622c19b845c0f9b2eb54e4b6f5459e3950e9528763c3016"} Mar 18 09:28:02 crc kubenswrapper[4778]: I0318 09:28:02.241430 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:28:02 crc kubenswrapper[4778]: I0318 09:28:02.241842 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:28:02 crc kubenswrapper[4778]: I0318 09:28:02.721350 4778 generic.go:334] "Generic (PLEG): container finished" podID="154a89df-1c2e-4f86-bbf3-827d6443c04a" containerID="a0e4df9a818fec5131c555c760ba72656483292d6411393207bbb36928547cc0" exitCode=0 Mar 18 09:28:02 crc kubenswrapper[4778]: I0318 09:28:02.721398 4778 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" event={"ID":"154a89df-1c2e-4f86-bbf3-827d6443c04a","Type":"ContainerDied","Data":"a0e4df9a818fec5131c555c760ba72656483292d6411393207bbb36928547cc0"} Mar 18 09:28:02 crc kubenswrapper[4778]: I0318 09:28:02.723005 4778 generic.go:334] "Generic (PLEG): container finished" podID="ec16e337-91fc-40c5-b3d4-87b5243e5a73" containerID="201dd8b3293289bdbf9f29c3749f98499b07694d8d80e9df99ed62c3075ec93f" exitCode=0 Mar 18 09:28:02 crc kubenswrapper[4778]: I0318 09:28:02.723035 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563768-4z27c" event={"ID":"ec16e337-91fc-40c5-b3d4-87b5243e5a73","Type":"ContainerDied","Data":"201dd8b3293289bdbf9f29c3749f98499b07694d8d80e9df99ed62c3075ec93f"} Mar 18 09:28:03 crc kubenswrapper[4778]: I0318 09:28:03.300192 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-6rqfc" podUID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" containerName="registry-server" probeResult="failure" output=< Mar 18 09:28:03 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 09:28:03 crc kubenswrapper[4778]: > Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.085589 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563768-4z27c" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.181729 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht6bc\" (UniqueName: \"kubernetes.io/projected/ec16e337-91fc-40c5-b3d4-87b5243e5a73-kube-api-access-ht6bc\") pod \"ec16e337-91fc-40c5-b3d4-87b5243e5a73\" (UID: \"ec16e337-91fc-40c5-b3d4-87b5243e5a73\") " Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.188164 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec16e337-91fc-40c5-b3d4-87b5243e5a73-kube-api-access-ht6bc" (OuterVolumeSpecName: "kube-api-access-ht6bc") pod "ec16e337-91fc-40c5-b3d4-87b5243e5a73" (UID: "ec16e337-91fc-40c5-b3d4-87b5243e5a73"). InnerVolumeSpecName "kube-api-access-ht6bc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.282349 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.283745 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht6bc\" (UniqueName: \"kubernetes.io/projected/ec16e337-91fc-40c5-b3d4-87b5243e5a73-kube-api-access-ht6bc\") on node \"crc\" DevicePath \"\"" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.385494 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqc9n\" (UniqueName: \"kubernetes.io/projected/154a89df-1c2e-4f86-bbf3-827d6443c04a-kube-api-access-zqc9n\") pod \"154a89df-1c2e-4f86-bbf3-827d6443c04a\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.385778 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-repo-setup-combined-ca-bundle\") pod \"154a89df-1c2e-4f86-bbf3-827d6443c04a\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.385858 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-ssh-key-openstack-edpm-ipam\") pod \"154a89df-1c2e-4f86-bbf3-827d6443c04a\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.385973 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-inventory\") pod \"154a89df-1c2e-4f86-bbf3-827d6443c04a\" (UID: \"154a89df-1c2e-4f86-bbf3-827d6443c04a\") " Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.389395 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/154a89df-1c2e-4f86-bbf3-827d6443c04a-kube-api-access-zqc9n" (OuterVolumeSpecName: "kube-api-access-zqc9n") pod "154a89df-1c2e-4f86-bbf3-827d6443c04a" (UID: "154a89df-1c2e-4f86-bbf3-827d6443c04a"). InnerVolumeSpecName "kube-api-access-zqc9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.390168 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "154a89df-1c2e-4f86-bbf3-827d6443c04a" (UID: "154a89df-1c2e-4f86-bbf3-827d6443c04a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.409540 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "154a89df-1c2e-4f86-bbf3-827d6443c04a" (UID: "154a89df-1c2e-4f86-bbf3-827d6443c04a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.415954 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-inventory" (OuterVolumeSpecName: "inventory") pod "154a89df-1c2e-4f86-bbf3-827d6443c04a" (UID: "154a89df-1c2e-4f86-bbf3-827d6443c04a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.488713 4778 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.488750 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.488764 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/154a89df-1c2e-4f86-bbf3-827d6443c04a-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.488776 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqc9n\" (UniqueName: \"kubernetes.io/projected/154a89df-1c2e-4f86-bbf3-827d6443c04a-kube-api-access-zqc9n\") on node \"crc\" DevicePath \"\"" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.743004 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563768-4z27c" event={"ID":"ec16e337-91fc-40c5-b3d4-87b5243e5a73","Type":"ContainerDied","Data":"11eb68ad70b16e075622c19b845c0f9b2eb54e4b6f5459e3950e9528763c3016"} Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.743046 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563768-4z27c" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.743065 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11eb68ad70b16e075622c19b845c0f9b2eb54e4b6f5459e3950e9528763c3016" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.745377 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" event={"ID":"154a89df-1c2e-4f86-bbf3-827d6443c04a","Type":"ContainerDied","Data":"a6279dd667e196403d3c5c22ceabeb3e0818bd4665ef38b4085c12358feff855"} Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.745431 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6279dd667e196403d3c5c22ceabeb3e0818bd4665ef38b4085c12358feff855" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.745462 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.857190 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z"] Mar 18 09:28:04 crc kubenswrapper[4778]: E0318 09:28:04.857693 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="154a89df-1c2e-4f86-bbf3-827d6443c04a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.857716 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="154a89df-1c2e-4f86-bbf3-827d6443c04a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 09:28:04 crc kubenswrapper[4778]: E0318 09:28:04.857742 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec16e337-91fc-40c5-b3d4-87b5243e5a73" containerName="oc" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.857750 4778 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="ec16e337-91fc-40c5-b3d4-87b5243e5a73" containerName="oc" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.857950 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec16e337-91fc-40c5-b3d4-87b5243e5a73" containerName="oc" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.857981 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="154a89df-1c2e-4f86-bbf3-827d6443c04a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.858716 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.862502 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.862709 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.863008 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.863337 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.885489 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z"] Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.912485 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.912611 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.912645 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:04 crc kubenswrapper[4778]: I0318 09:28:04.912728 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbscl\" (UniqueName: \"kubernetes.io/projected/b989f767-d1ba-49fe-aebb-6aef120e0e22-kube-api-access-kbscl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 09:28:05.014768 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbscl\" (UniqueName: \"kubernetes.io/projected/b989f767-d1ba-49fe-aebb-6aef120e0e22-kube-api-access-kbscl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 
09:28:05.014898 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 09:28:05.014938 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 09:28:05.014957 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 09:28:05.018871 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 09:28:05.018969 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-inventory\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 09:28:05.022951 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 09:28:05.038134 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbscl\" (UniqueName: \"kubernetes.io/projected/b989f767-d1ba-49fe-aebb-6aef120e0e22-kube-api-access-kbscl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 09:28:05.162816 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563762-nk868"] Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 09:28:05.170812 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563762-nk868"] Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 09:28:05.176996 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 09:28:05.730699 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z"] Mar 18 09:28:05 crc kubenswrapper[4778]: W0318 09:28:05.739066 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb989f767_d1ba_49fe_aebb_6aef120e0e22.slice/crio-b5bc788ebd6b1883d8a8fc2b7f9f391272c9ac7329e6e4d8ed1d79fca68aaffa WatchSource:0}: Error finding container b5bc788ebd6b1883d8a8fc2b7f9f391272c9ac7329e6e4d8ed1d79fca68aaffa: Status 404 returned error can't find the container with id b5bc788ebd6b1883d8a8fc2b7f9f391272c9ac7329e6e4d8ed1d79fca68aaffa Mar 18 09:28:05 crc kubenswrapper[4778]: I0318 09:28:05.757553 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" event={"ID":"b989f767-d1ba-49fe-aebb-6aef120e0e22","Type":"ContainerStarted","Data":"b5bc788ebd6b1883d8a8fc2b7f9f391272c9ac7329e6e4d8ed1d79fca68aaffa"} Mar 18 09:28:06 crc kubenswrapper[4778]: I0318 09:28:06.197439 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bcf145e-ae6a-4674-9f79-b6486ec2fa9d" path="/var/lib/kubelet/pods/7bcf145e-ae6a-4674-9f79-b6486ec2fa9d/volumes" Mar 18 09:28:06 crc kubenswrapper[4778]: I0318 09:28:06.770140 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" event={"ID":"b989f767-d1ba-49fe-aebb-6aef120e0e22","Type":"ContainerStarted","Data":"df748c22cbd9cfd719213cf439a446ed8f2c405ec832bdc5f38d5aacebbce9a9"} Mar 18 09:28:06 crc kubenswrapper[4778]: I0318 09:28:06.796372 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" podStartSLOduration=2.225744699 
podStartE2EDuration="2.796350902s" podCreationTimestamp="2026-03-18 09:28:04 +0000 UTC" firstStartedPulling="2026-03-18 09:28:05.743799052 +0000 UTC m=+1552.318543932" lastFinishedPulling="2026-03-18 09:28:06.314405295 +0000 UTC m=+1552.889150135" observedRunningTime="2026-03-18 09:28:06.794311347 +0000 UTC m=+1553.369056207" watchObservedRunningTime="2026-03-18 09:28:06.796350902 +0000 UTC m=+1553.371095742" Mar 18 09:28:08 crc kubenswrapper[4778]: I0318 09:28:08.483440 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 09:28:12 crc kubenswrapper[4778]: I0318 09:28:12.314860 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:28:12 crc kubenswrapper[4778]: I0318 09:28:12.388703 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:28:12 crc kubenswrapper[4778]: I0318 09:28:12.570725 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6rqfc"] Mar 18 09:28:13 crc kubenswrapper[4778]: I0318 09:28:13.842372 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6rqfc" podUID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" containerName="registry-server" containerID="cri-o://10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e" gracePeriod=2 Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.328798 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.434757 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a043ee-4047-4c6e-9c4d-eaa8272648f6-catalog-content\") pod \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\" (UID: \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\") " Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.434899 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c267m\" (UniqueName: \"kubernetes.io/projected/60a043ee-4047-4c6e-9c4d-eaa8272648f6-kube-api-access-c267m\") pod \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\" (UID: \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\") " Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.434986 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a043ee-4047-4c6e-9c4d-eaa8272648f6-utilities\") pod \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\" (UID: \"60a043ee-4047-4c6e-9c4d-eaa8272648f6\") " Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.441390 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60a043ee-4047-4c6e-9c4d-eaa8272648f6-utilities" (OuterVolumeSpecName: "utilities") pod "60a043ee-4047-4c6e-9c4d-eaa8272648f6" (UID: "60a043ee-4047-4c6e-9c4d-eaa8272648f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.441700 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60a043ee-4047-4c6e-9c4d-eaa8272648f6-kube-api-access-c267m" (OuterVolumeSpecName: "kube-api-access-c267m") pod "60a043ee-4047-4c6e-9c4d-eaa8272648f6" (UID: "60a043ee-4047-4c6e-9c4d-eaa8272648f6"). InnerVolumeSpecName "kube-api-access-c267m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.521978 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60a043ee-4047-4c6e-9c4d-eaa8272648f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60a043ee-4047-4c6e-9c4d-eaa8272648f6" (UID: "60a043ee-4047-4c6e-9c4d-eaa8272648f6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.536770 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c267m\" (UniqueName: \"kubernetes.io/projected/60a043ee-4047-4c6e-9c4d-eaa8272648f6-kube-api-access-c267m\") on node \"crc\" DevicePath \"\"" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.536811 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a043ee-4047-4c6e-9c4d-eaa8272648f6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.536823 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a043ee-4047-4c6e-9c4d-eaa8272648f6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.857603 4778 generic.go:334] "Generic (PLEG): container finished" podID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" containerID="10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e" exitCode=0 Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.857725 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6rqfc" event={"ID":"60a043ee-4047-4c6e-9c4d-eaa8272648f6","Type":"ContainerDied","Data":"10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e"} Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.857755 4778 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-6rqfc" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.857791 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6rqfc" event={"ID":"60a043ee-4047-4c6e-9c4d-eaa8272648f6","Type":"ContainerDied","Data":"90e09b0c12b6393363e62abca4fa79678465054f2984b57ee714525b9a8b228d"} Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.857814 4778 scope.go:117] "RemoveContainer" containerID="10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.886651 4778 scope.go:117] "RemoveContainer" containerID="04472b35671fac6b11bde92fbb0f35c02a6fd97b862b7456f7bbd8d152a69ad6" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.906410 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6rqfc"] Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.918014 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6rqfc"] Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.941978 4778 scope.go:117] "RemoveContainer" containerID="368780356a6ea029a190f79b18f4dc98d05a779ebb542e11a3d9ae9ec1164a62" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.965891 4778 scope.go:117] "RemoveContainer" containerID="10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e" Mar 18 09:28:14 crc kubenswrapper[4778]: E0318 09:28:14.966284 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e\": container with ID starting with 10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e not found: ID does not exist" containerID="10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.966309 
4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e"} err="failed to get container status \"10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e\": rpc error: code = NotFound desc = could not find container \"10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e\": container with ID starting with 10bc9187c58411f8bd503ceac23a1b877a1087233cc5e913165b72b8b08c718e not found: ID does not exist" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.966327 4778 scope.go:117] "RemoveContainer" containerID="04472b35671fac6b11bde92fbb0f35c02a6fd97b862b7456f7bbd8d152a69ad6" Mar 18 09:28:14 crc kubenswrapper[4778]: E0318 09:28:14.966885 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04472b35671fac6b11bde92fbb0f35c02a6fd97b862b7456f7bbd8d152a69ad6\": container with ID starting with 04472b35671fac6b11bde92fbb0f35c02a6fd97b862b7456f7bbd8d152a69ad6 not found: ID does not exist" containerID="04472b35671fac6b11bde92fbb0f35c02a6fd97b862b7456f7bbd8d152a69ad6" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.967048 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04472b35671fac6b11bde92fbb0f35c02a6fd97b862b7456f7bbd8d152a69ad6"} err="failed to get container status \"04472b35671fac6b11bde92fbb0f35c02a6fd97b862b7456f7bbd8d152a69ad6\": rpc error: code = NotFound desc = could not find container \"04472b35671fac6b11bde92fbb0f35c02a6fd97b862b7456f7bbd8d152a69ad6\": container with ID starting with 04472b35671fac6b11bde92fbb0f35c02a6fd97b862b7456f7bbd8d152a69ad6 not found: ID does not exist" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.967113 4778 scope.go:117] "RemoveContainer" containerID="368780356a6ea029a190f79b18f4dc98d05a779ebb542e11a3d9ae9ec1164a62" Mar 18 09:28:14 crc kubenswrapper[4778]: E0318 
09:28:14.967417 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"368780356a6ea029a190f79b18f4dc98d05a779ebb542e11a3d9ae9ec1164a62\": container with ID starting with 368780356a6ea029a190f79b18f4dc98d05a779ebb542e11a3d9ae9ec1164a62 not found: ID does not exist" containerID="368780356a6ea029a190f79b18f4dc98d05a779ebb542e11a3d9ae9ec1164a62" Mar 18 09:28:14 crc kubenswrapper[4778]: I0318 09:28:14.967440 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"368780356a6ea029a190f79b18f4dc98d05a779ebb542e11a3d9ae9ec1164a62"} err="failed to get container status \"368780356a6ea029a190f79b18f4dc98d05a779ebb542e11a3d9ae9ec1164a62\": rpc error: code = NotFound desc = could not find container \"368780356a6ea029a190f79b18f4dc98d05a779ebb542e11a3d9ae9ec1164a62\": container with ID starting with 368780356a6ea029a190f79b18f4dc98d05a779ebb542e11a3d9ae9ec1164a62 not found: ID does not exist" Mar 18 09:28:16 crc kubenswrapper[4778]: I0318 09:28:16.203086 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" path="/var/lib/kubelet/pods/60a043ee-4047-4c6e-9c4d-eaa8272648f6/volumes" Mar 18 09:28:28 crc kubenswrapper[4778]: I0318 09:28:28.549713 4778 scope.go:117] "RemoveContainer" containerID="d4dc8e8b710d4699b5bf32fb7126e48abd2e3887409f25b1985152911c6485a4" Mar 18 09:28:28 crc kubenswrapper[4778]: I0318 09:28:28.577413 4778 scope.go:117] "RemoveContainer" containerID="1fd591e2b660a21c69fd30f836286a21f17c533f4da08c01dd7acbea44c0d5f9" Mar 18 09:28:28 crc kubenswrapper[4778]: I0318 09:28:28.647241 4778 scope.go:117] "RemoveContainer" containerID="a91ae12327b49c9ef11425b6654b97316a81c22c90fc2a91ca50368382d18bdb" Mar 18 09:28:28 crc kubenswrapper[4778]: I0318 09:28:28.723821 4778 scope.go:117] "RemoveContainer" containerID="6d2d7a8a11e12ed4124ccde5834d93c0bc78bbc0a9c88b791847bb709dbbb116" Mar 18 09:28:30 crc 
kubenswrapper[4778]: I0318 09:28:30.147867 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:28:30 crc kubenswrapper[4778]: I0318 09:28:30.148414 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:29:00 crc kubenswrapper[4778]: I0318 09:29:00.147549 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:29:00 crc kubenswrapper[4778]: I0318 09:29:00.148254 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.046931 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tpgxl"] Mar 18 09:29:04 crc kubenswrapper[4778]: E0318 09:29:04.048673 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" containerName="extract-utilities" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.048701 4778 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" containerName="extract-utilities" Mar 18 09:29:04 crc kubenswrapper[4778]: E0318 09:29:04.048741 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" containerName="extract-content" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.048753 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" containerName="extract-content" Mar 18 09:29:04 crc kubenswrapper[4778]: E0318 09:29:04.048775 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" containerName="registry-server" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.048786 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" containerName="registry-server" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.049062 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="60a043ee-4047-4c6e-9c4d-eaa8272648f6" containerName="registry-server" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.051086 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.077509 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tpgxl"] Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.156680 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/418d5e86-72b6-4030-b9e6-9d9402174c5c-catalog-content\") pod \"redhat-operators-tpgxl\" (UID: \"418d5e86-72b6-4030-b9e6-9d9402174c5c\") " pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.156724 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md686\" (UniqueName: \"kubernetes.io/projected/418d5e86-72b6-4030-b9e6-9d9402174c5c-kube-api-access-md686\") pod \"redhat-operators-tpgxl\" (UID: \"418d5e86-72b6-4030-b9e6-9d9402174c5c\") " pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.156758 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/418d5e86-72b6-4030-b9e6-9d9402174c5c-utilities\") pod \"redhat-operators-tpgxl\" (UID: \"418d5e86-72b6-4030-b9e6-9d9402174c5c\") " pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.258283 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/418d5e86-72b6-4030-b9e6-9d9402174c5c-catalog-content\") pod \"redhat-operators-tpgxl\" (UID: \"418d5e86-72b6-4030-b9e6-9d9402174c5c\") " pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.258322 4778 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-md686\" (UniqueName: \"kubernetes.io/projected/418d5e86-72b6-4030-b9e6-9d9402174c5c-kube-api-access-md686\") pod \"redhat-operators-tpgxl\" (UID: \"418d5e86-72b6-4030-b9e6-9d9402174c5c\") " pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.258350 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/418d5e86-72b6-4030-b9e6-9d9402174c5c-utilities\") pod \"redhat-operators-tpgxl\" (UID: \"418d5e86-72b6-4030-b9e6-9d9402174c5c\") " pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.258792 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/418d5e86-72b6-4030-b9e6-9d9402174c5c-utilities\") pod \"redhat-operators-tpgxl\" (UID: \"418d5e86-72b6-4030-b9e6-9d9402174c5c\") " pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.259091 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/418d5e86-72b6-4030-b9e6-9d9402174c5c-catalog-content\") pod \"redhat-operators-tpgxl\" (UID: \"418d5e86-72b6-4030-b9e6-9d9402174c5c\") " pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.278824 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md686\" (UniqueName: \"kubernetes.io/projected/418d5e86-72b6-4030-b9e6-9d9402174c5c-kube-api-access-md686\") pod \"redhat-operators-tpgxl\" (UID: \"418d5e86-72b6-4030-b9e6-9d9402174c5c\") " pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.391013 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:04 crc kubenswrapper[4778]: I0318 09:29:04.867903 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tpgxl"] Mar 18 09:29:05 crc kubenswrapper[4778]: I0318 09:29:05.448923 4778 generic.go:334] "Generic (PLEG): container finished" podID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerID="cafd12ccce572175a54b37aead67ccc7ce0bc3a1ce1e3447068163f9ec8746fc" exitCode=0 Mar 18 09:29:05 crc kubenswrapper[4778]: I0318 09:29:05.449050 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpgxl" event={"ID":"418d5e86-72b6-4030-b9e6-9d9402174c5c","Type":"ContainerDied","Data":"cafd12ccce572175a54b37aead67ccc7ce0bc3a1ce1e3447068163f9ec8746fc"} Mar 18 09:29:05 crc kubenswrapper[4778]: I0318 09:29:05.449326 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpgxl" event={"ID":"418d5e86-72b6-4030-b9e6-9d9402174c5c","Type":"ContainerStarted","Data":"46e2104e09d595fc9eab76f4e7ebdca7f02555a48577717019a50c200d0c3244"} Mar 18 09:29:06 crc kubenswrapper[4778]: I0318 09:29:06.460944 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpgxl" event={"ID":"418d5e86-72b6-4030-b9e6-9d9402174c5c","Type":"ContainerStarted","Data":"14490c2c148760714da83f347e931d8c1f14c80beeb64b62aaba231567738380"} Mar 18 09:29:09 crc kubenswrapper[4778]: I0318 09:29:09.513523 4778 generic.go:334] "Generic (PLEG): container finished" podID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerID="14490c2c148760714da83f347e931d8c1f14c80beeb64b62aaba231567738380" exitCode=0 Mar 18 09:29:09 crc kubenswrapper[4778]: I0318 09:29:09.513592 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpgxl" 
event={"ID":"418d5e86-72b6-4030-b9e6-9d9402174c5c","Type":"ContainerDied","Data":"14490c2c148760714da83f347e931d8c1f14c80beeb64b62aaba231567738380"} Mar 18 09:29:10 crc kubenswrapper[4778]: I0318 09:29:10.527574 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpgxl" event={"ID":"418d5e86-72b6-4030-b9e6-9d9402174c5c","Type":"ContainerStarted","Data":"f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8"} Mar 18 09:29:10 crc kubenswrapper[4778]: I0318 09:29:10.555459 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tpgxl" podStartSLOduration=2.052914615 podStartE2EDuration="6.555432121s" podCreationTimestamp="2026-03-18 09:29:04 +0000 UTC" firstStartedPulling="2026-03-18 09:29:05.451231686 +0000 UTC m=+1612.025976536" lastFinishedPulling="2026-03-18 09:29:09.953749202 +0000 UTC m=+1616.528494042" observedRunningTime="2026-03-18 09:29:10.55426695 +0000 UTC m=+1617.129011790" watchObservedRunningTime="2026-03-18 09:29:10.555432121 +0000 UTC m=+1617.130176981" Mar 18 09:29:14 crc kubenswrapper[4778]: I0318 09:29:14.391756 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:14 crc kubenswrapper[4778]: I0318 09:29:14.393039 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:15 crc kubenswrapper[4778]: I0318 09:29:15.465098 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tpgxl" podUID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerName="registry-server" probeResult="failure" output=< Mar 18 09:29:15 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 09:29:15 crc kubenswrapper[4778]: > Mar 18 09:29:25 crc kubenswrapper[4778]: I0318 09:29:25.439924 4778 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-tpgxl" podUID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerName="registry-server" probeResult="failure" output=< Mar 18 09:29:25 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 09:29:25 crc kubenswrapper[4778]: > Mar 18 09:29:28 crc kubenswrapper[4778]: I0318 09:29:28.914174 4778 scope.go:117] "RemoveContainer" containerID="b11383263b8711730419b8679c1b55038dd677778bfced742aebd89591fb64cf" Mar 18 09:29:28 crc kubenswrapper[4778]: I0318 09:29:28.982158 4778 scope.go:117] "RemoveContainer" containerID="14bdd0eaf1dbaf2dd412641a551329bb6a350b27bbb5a0fca3b8cd2009883565" Mar 18 09:29:29 crc kubenswrapper[4778]: I0318 09:29:29.023836 4778 scope.go:117] "RemoveContainer" containerID="a2a49acb877ac3d5291587410f4bdad35a27b8e7dc386fa78d21020a20cbe78c" Mar 18 09:29:30 crc kubenswrapper[4778]: I0318 09:29:30.147407 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:29:30 crc kubenswrapper[4778]: I0318 09:29:30.147854 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:29:30 crc kubenswrapper[4778]: I0318 09:29:30.147915 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:29:30 crc kubenswrapper[4778]: I0318 09:29:30.149026 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:29:30 crc kubenswrapper[4778]: I0318 09:29:30.149132 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" gracePeriod=600 Mar 18 09:29:30 crc kubenswrapper[4778]: E0318 09:29:30.277343 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:29:30 crc kubenswrapper[4778]: I0318 09:29:30.738359 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" exitCode=0 Mar 18 09:29:30 crc kubenswrapper[4778]: I0318 09:29:30.738416 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492"} Mar 18 09:29:30 crc kubenswrapper[4778]: I0318 09:29:30.738469 4778 scope.go:117] "RemoveContainer" containerID="84af870879787d3b66c88494739078983637b86ff0a5bba3998c98c4487f8853" Mar 18 09:29:30 crc kubenswrapper[4778]: I0318 09:29:30.742781 4778 
scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:29:30 crc kubenswrapper[4778]: E0318 09:29:30.744241 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:29:34 crc kubenswrapper[4778]: I0318 09:29:34.458956 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:34 crc kubenswrapper[4778]: I0318 09:29:34.532563 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:35 crc kubenswrapper[4778]: I0318 09:29:35.244889 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tpgxl"] Mar 18 09:29:35 crc kubenswrapper[4778]: I0318 09:29:35.796068 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tpgxl" podUID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerName="registry-server" containerID="cri-o://f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8" gracePeriod=2 Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.356075 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.492614 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/418d5e86-72b6-4030-b9e6-9d9402174c5c-catalog-content\") pod \"418d5e86-72b6-4030-b9e6-9d9402174c5c\" (UID: \"418d5e86-72b6-4030-b9e6-9d9402174c5c\") " Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.492749 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/418d5e86-72b6-4030-b9e6-9d9402174c5c-utilities\") pod \"418d5e86-72b6-4030-b9e6-9d9402174c5c\" (UID: \"418d5e86-72b6-4030-b9e6-9d9402174c5c\") " Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.492781 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md686\" (UniqueName: \"kubernetes.io/projected/418d5e86-72b6-4030-b9e6-9d9402174c5c-kube-api-access-md686\") pod \"418d5e86-72b6-4030-b9e6-9d9402174c5c\" (UID: \"418d5e86-72b6-4030-b9e6-9d9402174c5c\") " Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.493490 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/418d5e86-72b6-4030-b9e6-9d9402174c5c-utilities" (OuterVolumeSpecName: "utilities") pod "418d5e86-72b6-4030-b9e6-9d9402174c5c" (UID: "418d5e86-72b6-4030-b9e6-9d9402174c5c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.502811 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/418d5e86-72b6-4030-b9e6-9d9402174c5c-kube-api-access-md686" (OuterVolumeSpecName: "kube-api-access-md686") pod "418d5e86-72b6-4030-b9e6-9d9402174c5c" (UID: "418d5e86-72b6-4030-b9e6-9d9402174c5c"). InnerVolumeSpecName "kube-api-access-md686". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.594673 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/418d5e86-72b6-4030-b9e6-9d9402174c5c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.594732 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md686\" (UniqueName: \"kubernetes.io/projected/418d5e86-72b6-4030-b9e6-9d9402174c5c-kube-api-access-md686\") on node \"crc\" DevicePath \"\"" Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.632889 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/418d5e86-72b6-4030-b9e6-9d9402174c5c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "418d5e86-72b6-4030-b9e6-9d9402174c5c" (UID: "418d5e86-72b6-4030-b9e6-9d9402174c5c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.696691 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/418d5e86-72b6-4030-b9e6-9d9402174c5c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.813069 4778 generic.go:334] "Generic (PLEG): container finished" podID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerID="f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8" exitCode=0 Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.813120 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpgxl" event={"ID":"418d5e86-72b6-4030-b9e6-9d9402174c5c","Type":"ContainerDied","Data":"f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8"} Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.813150 4778 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-tpgxl" event={"ID":"418d5e86-72b6-4030-b9e6-9d9402174c5c","Type":"ContainerDied","Data":"46e2104e09d595fc9eab76f4e7ebdca7f02555a48577717019a50c200d0c3244"} Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.813173 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tpgxl" Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.813182 4778 scope.go:117] "RemoveContainer" containerID="f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8" Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.864549 4778 scope.go:117] "RemoveContainer" containerID="14490c2c148760714da83f347e931d8c1f14c80beeb64b62aaba231567738380" Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.894573 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tpgxl"] Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.912540 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tpgxl"] Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.943087 4778 scope.go:117] "RemoveContainer" containerID="cafd12ccce572175a54b37aead67ccc7ce0bc3a1ce1e3447068163f9ec8746fc" Mar 18 09:29:36 crc kubenswrapper[4778]: I0318 09:29:36.997494 4778 scope.go:117] "RemoveContainer" containerID="f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8" Mar 18 09:29:37 crc kubenswrapper[4778]: E0318 09:29:37.012357 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8\": container with ID starting with f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8 not found: ID does not exist" containerID="f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8" Mar 18 09:29:37 crc kubenswrapper[4778]: I0318 09:29:37.012415 4778 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8"} err="failed to get container status \"f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8\": rpc error: code = NotFound desc = could not find container \"f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8\": container with ID starting with f9d6fef0260d248366ef8560f4c9cd199662b966ac245e3a3ebc72de0fd3b5b8 not found: ID does not exist" Mar 18 09:29:37 crc kubenswrapper[4778]: I0318 09:29:37.012440 4778 scope.go:117] "RemoveContainer" containerID="14490c2c148760714da83f347e931d8c1f14c80beeb64b62aaba231567738380" Mar 18 09:29:37 crc kubenswrapper[4778]: E0318 09:29:37.021353 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14490c2c148760714da83f347e931d8c1f14c80beeb64b62aaba231567738380\": container with ID starting with 14490c2c148760714da83f347e931d8c1f14c80beeb64b62aaba231567738380 not found: ID does not exist" containerID="14490c2c148760714da83f347e931d8c1f14c80beeb64b62aaba231567738380" Mar 18 09:29:37 crc kubenswrapper[4778]: I0318 09:29:37.021603 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14490c2c148760714da83f347e931d8c1f14c80beeb64b62aaba231567738380"} err="failed to get container status \"14490c2c148760714da83f347e931d8c1f14c80beeb64b62aaba231567738380\": rpc error: code = NotFound desc = could not find container \"14490c2c148760714da83f347e931d8c1f14c80beeb64b62aaba231567738380\": container with ID starting with 14490c2c148760714da83f347e931d8c1f14c80beeb64b62aaba231567738380 not found: ID does not exist" Mar 18 09:29:37 crc kubenswrapper[4778]: I0318 09:29:37.021689 4778 scope.go:117] "RemoveContainer" containerID="cafd12ccce572175a54b37aead67ccc7ce0bc3a1ce1e3447068163f9ec8746fc" Mar 18 09:29:37 crc kubenswrapper[4778]: E0318 
09:29:37.023396 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cafd12ccce572175a54b37aead67ccc7ce0bc3a1ce1e3447068163f9ec8746fc\": container with ID starting with cafd12ccce572175a54b37aead67ccc7ce0bc3a1ce1e3447068163f9ec8746fc not found: ID does not exist" containerID="cafd12ccce572175a54b37aead67ccc7ce0bc3a1ce1e3447068163f9ec8746fc" Mar 18 09:29:37 crc kubenswrapper[4778]: I0318 09:29:37.023489 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cafd12ccce572175a54b37aead67ccc7ce0bc3a1ce1e3447068163f9ec8746fc"} err="failed to get container status \"cafd12ccce572175a54b37aead67ccc7ce0bc3a1ce1e3447068163f9ec8746fc\": rpc error: code = NotFound desc = could not find container \"cafd12ccce572175a54b37aead67ccc7ce0bc3a1ce1e3447068163f9ec8746fc\": container with ID starting with cafd12ccce572175a54b37aead67ccc7ce0bc3a1ce1e3447068163f9ec8746fc not found: ID does not exist" Mar 18 09:29:38 crc kubenswrapper[4778]: I0318 09:29:38.197921 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="418d5e86-72b6-4030-b9e6-9d9402174c5c" path="/var/lib/kubelet/pods/418d5e86-72b6-4030-b9e6-9d9402174c5c/volumes" Mar 18 09:29:45 crc kubenswrapper[4778]: I0318 09:29:45.186844 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:29:45 crc kubenswrapper[4778]: E0318 09:29:45.187786 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:29:59 crc kubenswrapper[4778]: I0318 09:29:59.188065 
4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:29:59 crc kubenswrapper[4778]: E0318 09:29:59.189295 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.163487 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq"] Mar 18 09:30:00 crc kubenswrapper[4778]: E0318 09:30:00.164177 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerName="extract-utilities" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.164243 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerName="extract-utilities" Mar 18 09:30:00 crc kubenswrapper[4778]: E0318 09:30:00.164287 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerName="extract-content" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.164302 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerName="extract-content" Mar 18 09:30:00 crc kubenswrapper[4778]: E0318 09:30:00.164339 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerName="registry-server" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.164354 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerName="registry-server" Mar 
18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.164731 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="418d5e86-72b6-4030-b9e6-9d9402174c5c" containerName="registry-server" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.165842 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.172013 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.172133 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.176651 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq"] Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.214536 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563770-5f9th"] Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.215931 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563770-5f9th" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.218896 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.219319 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.219499 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.237282 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563770-5f9th"] Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.254695 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6b2bddb-d94d-426e-bc18-8b864785e323-secret-volume\") pod \"collect-profiles-29563770-zldnq\" (UID: \"b6b2bddb-d94d-426e-bc18-8b864785e323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.254848 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlhxl\" (UniqueName: \"kubernetes.io/projected/b6b2bddb-d94d-426e-bc18-8b864785e323-kube-api-access-xlhxl\") pod \"collect-profiles-29563770-zldnq\" (UID: \"b6b2bddb-d94d-426e-bc18-8b864785e323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.254947 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6b2bddb-d94d-426e-bc18-8b864785e323-config-volume\") pod \"collect-profiles-29563770-zldnq\" (UID: 
\"b6b2bddb-d94d-426e-bc18-8b864785e323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.254975 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdclz\" (UniqueName: \"kubernetes.io/projected/d5229065-e84e-4d42-870f-1ee468bff359-kube-api-access-sdclz\") pod \"auto-csr-approver-29563770-5f9th\" (UID: \"d5229065-e84e-4d42-870f-1ee468bff359\") " pod="openshift-infra/auto-csr-approver-29563770-5f9th" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.356008 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlhxl\" (UniqueName: \"kubernetes.io/projected/b6b2bddb-d94d-426e-bc18-8b864785e323-kube-api-access-xlhxl\") pod \"collect-profiles-29563770-zldnq\" (UID: \"b6b2bddb-d94d-426e-bc18-8b864785e323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.356105 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6b2bddb-d94d-426e-bc18-8b864785e323-config-volume\") pod \"collect-profiles-29563770-zldnq\" (UID: \"b6b2bddb-d94d-426e-bc18-8b864785e323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.356140 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdclz\" (UniqueName: \"kubernetes.io/projected/d5229065-e84e-4d42-870f-1ee468bff359-kube-api-access-sdclz\") pod \"auto-csr-approver-29563770-5f9th\" (UID: \"d5229065-e84e-4d42-870f-1ee468bff359\") " pod="openshift-infra/auto-csr-approver-29563770-5f9th" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.356247 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/b6b2bddb-d94d-426e-bc18-8b864785e323-secret-volume\") pod \"collect-profiles-29563770-zldnq\" (UID: \"b6b2bddb-d94d-426e-bc18-8b864785e323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.357123 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6b2bddb-d94d-426e-bc18-8b864785e323-config-volume\") pod \"collect-profiles-29563770-zldnq\" (UID: \"b6b2bddb-d94d-426e-bc18-8b864785e323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.366510 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6b2bddb-d94d-426e-bc18-8b864785e323-secret-volume\") pod \"collect-profiles-29563770-zldnq\" (UID: \"b6b2bddb-d94d-426e-bc18-8b864785e323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.374754 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdclz\" (UniqueName: \"kubernetes.io/projected/d5229065-e84e-4d42-870f-1ee468bff359-kube-api-access-sdclz\") pod \"auto-csr-approver-29563770-5f9th\" (UID: \"d5229065-e84e-4d42-870f-1ee468bff359\") " pod="openshift-infra/auto-csr-approver-29563770-5f9th" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.381395 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlhxl\" (UniqueName: \"kubernetes.io/projected/b6b2bddb-d94d-426e-bc18-8b864785e323-kube-api-access-xlhxl\") pod \"collect-profiles-29563770-zldnq\" (UID: \"b6b2bddb-d94d-426e-bc18-8b864785e323\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.492072 4778 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:00 crc kubenswrapper[4778]: I0318 09:30:00.534455 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563770-5f9th" Mar 18 09:30:01 crc kubenswrapper[4778]: I0318 09:30:01.014072 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq"] Mar 18 09:30:01 crc kubenswrapper[4778]: I0318 09:30:01.083123 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" event={"ID":"b6b2bddb-d94d-426e-bc18-8b864785e323","Type":"ContainerStarted","Data":"188edaa43cc00dcfb6862af68da42938b2b5d43a074ba7244be6903f72115021"} Mar 18 09:30:01 crc kubenswrapper[4778]: I0318 09:30:01.083762 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563770-5f9th"] Mar 18 09:30:01 crc kubenswrapper[4778]: W0318 09:30:01.095887 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5229065_e84e_4d42_870f_1ee468bff359.slice/crio-dcd92a70f4238a237b5e0cd6ca40b1127dff72982d1f0f31b6ea951fbacd43e8 WatchSource:0}: Error finding container dcd92a70f4238a237b5e0cd6ca40b1127dff72982d1f0f31b6ea951fbacd43e8: Status 404 returned error can't find the container with id dcd92a70f4238a237b5e0cd6ca40b1127dff72982d1f0f31b6ea951fbacd43e8 Mar 18 09:30:02 crc kubenswrapper[4778]: I0318 09:30:02.099382 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563770-5f9th" event={"ID":"d5229065-e84e-4d42-870f-1ee468bff359","Type":"ContainerStarted","Data":"dcd92a70f4238a237b5e0cd6ca40b1127dff72982d1f0f31b6ea951fbacd43e8"} Mar 18 09:30:02 crc kubenswrapper[4778]: I0318 09:30:02.102021 4778 generic.go:334] "Generic 
(PLEG): container finished" podID="b6b2bddb-d94d-426e-bc18-8b864785e323" containerID="04359ca445cb3566112d245be577eaabe4ab24e27a18fca03074e13b6e3b403f" exitCode=0 Mar 18 09:30:02 crc kubenswrapper[4778]: I0318 09:30:02.102065 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" event={"ID":"b6b2bddb-d94d-426e-bc18-8b864785e323","Type":"ContainerDied","Data":"04359ca445cb3566112d245be577eaabe4ab24e27a18fca03074e13b6e3b403f"} Mar 18 09:30:03 crc kubenswrapper[4778]: I0318 09:30:03.489660 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:03 crc kubenswrapper[4778]: I0318 09:30:03.543777 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6b2bddb-d94d-426e-bc18-8b864785e323-config-volume\") pod \"b6b2bddb-d94d-426e-bc18-8b864785e323\" (UID: \"b6b2bddb-d94d-426e-bc18-8b864785e323\") " Mar 18 09:30:03 crc kubenswrapper[4778]: I0318 09:30:03.544168 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6b2bddb-d94d-426e-bc18-8b864785e323-secret-volume\") pod \"b6b2bddb-d94d-426e-bc18-8b864785e323\" (UID: \"b6b2bddb-d94d-426e-bc18-8b864785e323\") " Mar 18 09:30:03 crc kubenswrapper[4778]: I0318 09:30:03.544411 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlhxl\" (UniqueName: \"kubernetes.io/projected/b6b2bddb-d94d-426e-bc18-8b864785e323-kube-api-access-xlhxl\") pod \"b6b2bddb-d94d-426e-bc18-8b864785e323\" (UID: \"b6b2bddb-d94d-426e-bc18-8b864785e323\") " Mar 18 09:30:03 crc kubenswrapper[4778]: I0318 09:30:03.546381 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/b6b2bddb-d94d-426e-bc18-8b864785e323-config-volume" (OuterVolumeSpecName: "config-volume") pod "b6b2bddb-d94d-426e-bc18-8b864785e323" (UID: "b6b2bddb-d94d-426e-bc18-8b864785e323"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:30:03 crc kubenswrapper[4778]: I0318 09:30:03.550522 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6b2bddb-d94d-426e-bc18-8b864785e323-kube-api-access-xlhxl" (OuterVolumeSpecName: "kube-api-access-xlhxl") pod "b6b2bddb-d94d-426e-bc18-8b864785e323" (UID: "b6b2bddb-d94d-426e-bc18-8b864785e323"). InnerVolumeSpecName "kube-api-access-xlhxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:30:03 crc kubenswrapper[4778]: I0318 09:30:03.575311 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6b2bddb-d94d-426e-bc18-8b864785e323-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b6b2bddb-d94d-426e-bc18-8b864785e323" (UID: "b6b2bddb-d94d-426e-bc18-8b864785e323"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:30:03 crc kubenswrapper[4778]: I0318 09:30:03.647154 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlhxl\" (UniqueName: \"kubernetes.io/projected/b6b2bddb-d94d-426e-bc18-8b864785e323-kube-api-access-xlhxl\") on node \"crc\" DevicePath \"\"" Mar 18 09:30:03 crc kubenswrapper[4778]: I0318 09:30:03.647179 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6b2bddb-d94d-426e-bc18-8b864785e323-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 09:30:03 crc kubenswrapper[4778]: I0318 09:30:03.647188 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6b2bddb-d94d-426e-bc18-8b864785e323-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 09:30:04 crc kubenswrapper[4778]: I0318 09:30:04.121618 4778 generic.go:334] "Generic (PLEG): container finished" podID="d5229065-e84e-4d42-870f-1ee468bff359" containerID="8faf9c7a656879007008e10d6b7f5d22a002ddd8fac9065c9f561e0e336487fd" exitCode=0 Mar 18 09:30:04 crc kubenswrapper[4778]: I0318 09:30:04.121948 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563770-5f9th" event={"ID":"d5229065-e84e-4d42-870f-1ee468bff359","Type":"ContainerDied","Data":"8faf9c7a656879007008e10d6b7f5d22a002ddd8fac9065c9f561e0e336487fd"} Mar 18 09:30:04 crc kubenswrapper[4778]: I0318 09:30:04.124679 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" event={"ID":"b6b2bddb-d94d-426e-bc18-8b864785e323","Type":"ContainerDied","Data":"188edaa43cc00dcfb6862af68da42938b2b5d43a074ba7244be6903f72115021"} Mar 18 09:30:04 crc kubenswrapper[4778]: I0318 09:30:04.124724 4778 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="188edaa43cc00dcfb6862af68da42938b2b5d43a074ba7244be6903f72115021" Mar 18 09:30:04 crc kubenswrapper[4778]: I0318 09:30:04.124824 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq" Mar 18 09:30:05 crc kubenswrapper[4778]: I0318 09:30:05.489022 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563770-5f9th" Mar 18 09:30:05 crc kubenswrapper[4778]: I0318 09:30:05.581020 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdclz\" (UniqueName: \"kubernetes.io/projected/d5229065-e84e-4d42-870f-1ee468bff359-kube-api-access-sdclz\") pod \"d5229065-e84e-4d42-870f-1ee468bff359\" (UID: \"d5229065-e84e-4d42-870f-1ee468bff359\") " Mar 18 09:30:05 crc kubenswrapper[4778]: I0318 09:30:05.588002 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5229065-e84e-4d42-870f-1ee468bff359-kube-api-access-sdclz" (OuterVolumeSpecName: "kube-api-access-sdclz") pod "d5229065-e84e-4d42-870f-1ee468bff359" (UID: "d5229065-e84e-4d42-870f-1ee468bff359"). InnerVolumeSpecName "kube-api-access-sdclz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:30:05 crc kubenswrapper[4778]: I0318 09:30:05.683814 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdclz\" (UniqueName: \"kubernetes.io/projected/d5229065-e84e-4d42-870f-1ee468bff359-kube-api-access-sdclz\") on node \"crc\" DevicePath \"\"" Mar 18 09:30:06 crc kubenswrapper[4778]: I0318 09:30:06.153138 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563770-5f9th" event={"ID":"d5229065-e84e-4d42-870f-1ee468bff359","Type":"ContainerDied","Data":"dcd92a70f4238a237b5e0cd6ca40b1127dff72982d1f0f31b6ea951fbacd43e8"} Mar 18 09:30:06 crc kubenswrapper[4778]: I0318 09:30:06.153185 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563770-5f9th" Mar 18 09:30:06 crc kubenswrapper[4778]: I0318 09:30:06.153188 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcd92a70f4238a237b5e0cd6ca40b1127dff72982d1f0f31b6ea951fbacd43e8" Mar 18 09:30:06 crc kubenswrapper[4778]: I0318 09:30:06.577685 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563764-s4crc"] Mar 18 09:30:06 crc kubenswrapper[4778]: I0318 09:30:06.589749 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563764-s4crc"] Mar 18 09:30:08 crc kubenswrapper[4778]: I0318 09:30:08.200668 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb3b2d75-fc85-48dc-8533-18ecd8c75187" path="/var/lib/kubelet/pods/bb3b2d75-fc85-48dc-8533-18ecd8c75187/volumes" Mar 18 09:30:13 crc kubenswrapper[4778]: I0318 09:30:13.187690 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:30:13 crc kubenswrapper[4778]: E0318 09:30:13.188495 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:30:26 crc kubenswrapper[4778]: I0318 09:30:26.189186 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:30:26 crc kubenswrapper[4778]: E0318 09:30:26.190154 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:30:29 crc kubenswrapper[4778]: I0318 09:30:29.108169 4778 scope.go:117] "RemoveContainer" containerID="2ec42f2618fc279e3b11e295de9307609aec937968481eb47ae40bf89eeec176" Mar 18 09:30:39 crc kubenswrapper[4778]: I0318 09:30:39.187968 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:30:39 crc kubenswrapper[4778]: E0318 09:30:39.188935 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:30:52 crc kubenswrapper[4778]: I0318 09:30:52.188700 4778 scope.go:117] "RemoveContainer" 
containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:30:52 crc kubenswrapper[4778]: E0318 09:30:52.189393 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:31:04 crc kubenswrapper[4778]: I0318 09:31:04.770577 4778 generic.go:334] "Generic (PLEG): container finished" podID="b989f767-d1ba-49fe-aebb-6aef120e0e22" containerID="df748c22cbd9cfd719213cf439a446ed8f2c405ec832bdc5f38d5aacebbce9a9" exitCode=0 Mar 18 09:31:04 crc kubenswrapper[4778]: I0318 09:31:04.770659 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" event={"ID":"b989f767-d1ba-49fe-aebb-6aef120e0e22","Type":"ContainerDied","Data":"df748c22cbd9cfd719213cf439a446ed8f2c405ec832bdc5f38d5aacebbce9a9"} Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.223948 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.385339 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-bootstrap-combined-ca-bundle\") pod \"b989f767-d1ba-49fe-aebb-6aef120e0e22\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.385402 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbscl\" (UniqueName: \"kubernetes.io/projected/b989f767-d1ba-49fe-aebb-6aef120e0e22-kube-api-access-kbscl\") pod \"b989f767-d1ba-49fe-aebb-6aef120e0e22\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.385494 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-inventory\") pod \"b989f767-d1ba-49fe-aebb-6aef120e0e22\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.385527 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-ssh-key-openstack-edpm-ipam\") pod \"b989f767-d1ba-49fe-aebb-6aef120e0e22\" (UID: \"b989f767-d1ba-49fe-aebb-6aef120e0e22\") " Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.393317 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "b989f767-d1ba-49fe-aebb-6aef120e0e22" (UID: "b989f767-d1ba-49fe-aebb-6aef120e0e22"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.405679 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b989f767-d1ba-49fe-aebb-6aef120e0e22-kube-api-access-kbscl" (OuterVolumeSpecName: "kube-api-access-kbscl") pod "b989f767-d1ba-49fe-aebb-6aef120e0e22" (UID: "b989f767-d1ba-49fe-aebb-6aef120e0e22"). InnerVolumeSpecName "kube-api-access-kbscl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.414410 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-inventory" (OuterVolumeSpecName: "inventory") pod "b989f767-d1ba-49fe-aebb-6aef120e0e22" (UID: "b989f767-d1ba-49fe-aebb-6aef120e0e22"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.432904 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b989f767-d1ba-49fe-aebb-6aef120e0e22" (UID: "b989f767-d1ba-49fe-aebb-6aef120e0e22"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.486981 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.487014 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.487025 4778 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b989f767-d1ba-49fe-aebb-6aef120e0e22-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.487034 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbscl\" (UniqueName: \"kubernetes.io/projected/b989f767-d1ba-49fe-aebb-6aef120e0e22-kube-api-access-kbscl\") on node \"crc\" DevicePath \"\"" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.801804 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" event={"ID":"b989f767-d1ba-49fe-aebb-6aef120e0e22","Type":"ContainerDied","Data":"b5bc788ebd6b1883d8a8fc2b7f9f391272c9ac7329e6e4d8ed1d79fca68aaffa"} Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.801862 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5bc788ebd6b1883d8a8fc2b7f9f391272c9ac7329e6e4d8ed1d79fca68aaffa" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.801926 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.917515 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf"] Mar 18 09:31:06 crc kubenswrapper[4778]: E0318 09:31:06.918067 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5229065-e84e-4d42-870f-1ee468bff359" containerName="oc" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.918098 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5229065-e84e-4d42-870f-1ee468bff359" containerName="oc" Mar 18 09:31:06 crc kubenswrapper[4778]: E0318 09:31:06.918136 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b989f767-d1ba-49fe-aebb-6aef120e0e22" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.918152 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b989f767-d1ba-49fe-aebb-6aef120e0e22" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 09:31:06 crc kubenswrapper[4778]: E0318 09:31:06.918187 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b2bddb-d94d-426e-bc18-8b864785e323" containerName="collect-profiles" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.918224 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b2bddb-d94d-426e-bc18-8b864785e323" containerName="collect-profiles" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.919388 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b989f767-d1ba-49fe-aebb-6aef120e0e22" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.919444 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6b2bddb-d94d-426e-bc18-8b864785e323" containerName="collect-profiles" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 
09:31:06.919476 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5229065-e84e-4d42-870f-1ee468bff359" containerName="oc" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.920648 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.926792 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.926924 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.927727 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.928103 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.937527 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf"] Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.996839 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fe04bef-41cb-47c4-8031-141f8809e8cb-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf\" (UID: \"2fe04bef-41cb-47c4-8031-141f8809e8cb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.996989 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlmz6\" (UniqueName: 
\"kubernetes.io/projected/2fe04bef-41cb-47c4-8031-141f8809e8cb-kube-api-access-wlmz6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf\" (UID: \"2fe04bef-41cb-47c4-8031-141f8809e8cb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:31:06 crc kubenswrapper[4778]: I0318 09:31:06.997163 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fe04bef-41cb-47c4-8031-141f8809e8cb-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf\" (UID: \"2fe04bef-41cb-47c4-8031-141f8809e8cb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:31:07 crc kubenswrapper[4778]: I0318 09:31:07.099426 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fe04bef-41cb-47c4-8031-141f8809e8cb-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf\" (UID: \"2fe04bef-41cb-47c4-8031-141f8809e8cb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:31:07 crc kubenswrapper[4778]: I0318 09:31:07.099537 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlmz6\" (UniqueName: \"kubernetes.io/projected/2fe04bef-41cb-47c4-8031-141f8809e8cb-kube-api-access-wlmz6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf\" (UID: \"2fe04bef-41cb-47c4-8031-141f8809e8cb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:31:07 crc kubenswrapper[4778]: I0318 09:31:07.099634 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fe04bef-41cb-47c4-8031-141f8809e8cb-ssh-key-openstack-edpm-ipam\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf\" (UID: \"2fe04bef-41cb-47c4-8031-141f8809e8cb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:31:07 crc kubenswrapper[4778]: I0318 09:31:07.106073 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fe04bef-41cb-47c4-8031-141f8809e8cb-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf\" (UID: \"2fe04bef-41cb-47c4-8031-141f8809e8cb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:31:07 crc kubenswrapper[4778]: I0318 09:31:07.113997 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fe04bef-41cb-47c4-8031-141f8809e8cb-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf\" (UID: \"2fe04bef-41cb-47c4-8031-141f8809e8cb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:31:07 crc kubenswrapper[4778]: I0318 09:31:07.129181 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlmz6\" (UniqueName: \"kubernetes.io/projected/2fe04bef-41cb-47c4-8031-141f8809e8cb-kube-api-access-wlmz6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf\" (UID: \"2fe04bef-41cb-47c4-8031-141f8809e8cb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:31:07 crc kubenswrapper[4778]: I0318 09:31:07.186945 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:31:07 crc kubenswrapper[4778]: E0318 09:31:07.187213 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:31:07 crc kubenswrapper[4778]: I0318 09:31:07.240930 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:31:07 crc kubenswrapper[4778]: I0318 09:31:07.946038 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf"] Mar 18 09:31:08 crc kubenswrapper[4778]: I0318 09:31:08.827930 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" event={"ID":"2fe04bef-41cb-47c4-8031-141f8809e8cb","Type":"ContainerStarted","Data":"448fd951e8632e6cb54458ae0feebd671c32acd586b867a21adf3d37b278a94c"} Mar 18 09:31:09 crc kubenswrapper[4778]: I0318 09:31:09.841476 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" event={"ID":"2fe04bef-41cb-47c4-8031-141f8809e8cb","Type":"ContainerStarted","Data":"fd288c3256024cadd2bb212c37d37772aadd4cda1a6ce7e57e524f08cb5c87a0"} Mar 18 09:31:09 crc kubenswrapper[4778]: I0318 09:31:09.872374 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" podStartSLOduration=2.351412167 podStartE2EDuration="3.872353567s" podCreationTimestamp="2026-03-18 09:31:06 +0000 UTC" firstStartedPulling="2026-03-18 09:31:07.951825746 +0000 UTC m=+1734.526570606" lastFinishedPulling="2026-03-18 09:31:09.472767126 +0000 UTC m=+1736.047512006" observedRunningTime="2026-03-18 09:31:09.860782844 +0000 UTC m=+1736.435527684" watchObservedRunningTime="2026-03-18 09:31:09.872353567 +0000 UTC m=+1736.447098407" Mar 18 09:31:20 crc 
kubenswrapper[4778]: I0318 09:31:20.188343 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:31:20 crc kubenswrapper[4778]: E0318 09:31:20.189612 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:31:31 crc kubenswrapper[4778]: I0318 09:31:31.187407 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:31:31 crc kubenswrapper[4778]: E0318 09:31:31.188265 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:31:44 crc kubenswrapper[4778]: I0318 09:31:44.200854 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:31:44 crc kubenswrapper[4778]: E0318 09:31:44.203762 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 
18 09:31:56 crc kubenswrapper[4778]: I0318 09:31:56.189679 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:31:56 crc kubenswrapper[4778]: E0318 09:31:56.190708 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:32:00 crc kubenswrapper[4778]: I0318 09:32:00.163715 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563772-tcj6t"] Mar 18 09:32:00 crc kubenswrapper[4778]: I0318 09:32:00.166812 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563772-tcj6t" Mar 18 09:32:00 crc kubenswrapper[4778]: I0318 09:32:00.169104 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:32:00 crc kubenswrapper[4778]: I0318 09:32:00.169316 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:32:00 crc kubenswrapper[4778]: I0318 09:32:00.169519 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:32:00 crc kubenswrapper[4778]: I0318 09:32:00.183092 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563772-tcj6t"] Mar 18 09:32:00 crc kubenswrapper[4778]: I0318 09:32:00.314729 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrt4f\" (UniqueName: 
\"kubernetes.io/projected/15232b66-3433-4405-9feb-79055e892b3d-kube-api-access-zrt4f\") pod \"auto-csr-approver-29563772-tcj6t\" (UID: \"15232b66-3433-4405-9feb-79055e892b3d\") " pod="openshift-infra/auto-csr-approver-29563772-tcj6t" Mar 18 09:32:00 crc kubenswrapper[4778]: I0318 09:32:00.416469 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrt4f\" (UniqueName: \"kubernetes.io/projected/15232b66-3433-4405-9feb-79055e892b3d-kube-api-access-zrt4f\") pod \"auto-csr-approver-29563772-tcj6t\" (UID: \"15232b66-3433-4405-9feb-79055e892b3d\") " pod="openshift-infra/auto-csr-approver-29563772-tcj6t" Mar 18 09:32:00 crc kubenswrapper[4778]: I0318 09:32:00.438592 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrt4f\" (UniqueName: \"kubernetes.io/projected/15232b66-3433-4405-9feb-79055e892b3d-kube-api-access-zrt4f\") pod \"auto-csr-approver-29563772-tcj6t\" (UID: \"15232b66-3433-4405-9feb-79055e892b3d\") " pod="openshift-infra/auto-csr-approver-29563772-tcj6t" Mar 18 09:32:00 crc kubenswrapper[4778]: I0318 09:32:00.491260 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563772-tcj6t" Mar 18 09:32:01 crc kubenswrapper[4778]: I0318 09:32:01.055301 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563772-tcj6t"] Mar 18 09:32:01 crc kubenswrapper[4778]: I0318 09:32:01.413688 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563772-tcj6t" event={"ID":"15232b66-3433-4405-9feb-79055e892b3d","Type":"ContainerStarted","Data":"db674ce39ebaf74f85133b048779d3fd48c7602d67b049f235224eed5592d6a3"} Mar 18 09:32:03 crc kubenswrapper[4778]: I0318 09:32:03.439063 4778 generic.go:334] "Generic (PLEG): container finished" podID="15232b66-3433-4405-9feb-79055e892b3d" containerID="c8ccd760df68dbd5ce4bef875e9b41962b50e1c9d6413d0f1f66a324748d7c49" exitCode=0 Mar 18 09:32:03 crc kubenswrapper[4778]: I0318 09:32:03.439180 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563772-tcj6t" event={"ID":"15232b66-3433-4405-9feb-79055e892b3d","Type":"ContainerDied","Data":"c8ccd760df68dbd5ce4bef875e9b41962b50e1c9d6413d0f1f66a324748d7c49"} Mar 18 09:32:04 crc kubenswrapper[4778]: I0318 09:32:04.771444 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563772-tcj6t" Mar 18 09:32:04 crc kubenswrapper[4778]: I0318 09:32:04.906384 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrt4f\" (UniqueName: \"kubernetes.io/projected/15232b66-3433-4405-9feb-79055e892b3d-kube-api-access-zrt4f\") pod \"15232b66-3433-4405-9feb-79055e892b3d\" (UID: \"15232b66-3433-4405-9feb-79055e892b3d\") " Mar 18 09:32:04 crc kubenswrapper[4778]: I0318 09:32:04.916575 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15232b66-3433-4405-9feb-79055e892b3d-kube-api-access-zrt4f" (OuterVolumeSpecName: "kube-api-access-zrt4f") pod "15232b66-3433-4405-9feb-79055e892b3d" (UID: "15232b66-3433-4405-9feb-79055e892b3d"). InnerVolumeSpecName "kube-api-access-zrt4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:32:05 crc kubenswrapper[4778]: I0318 09:32:05.009037 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrt4f\" (UniqueName: \"kubernetes.io/projected/15232b66-3433-4405-9feb-79055e892b3d-kube-api-access-zrt4f\") on node \"crc\" DevicePath \"\"" Mar 18 09:32:05 crc kubenswrapper[4778]: I0318 09:32:05.465472 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563772-tcj6t" event={"ID":"15232b66-3433-4405-9feb-79055e892b3d","Type":"ContainerDied","Data":"db674ce39ebaf74f85133b048779d3fd48c7602d67b049f235224eed5592d6a3"} Mar 18 09:32:05 crc kubenswrapper[4778]: I0318 09:32:05.465528 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db674ce39ebaf74f85133b048779d3fd48c7602d67b049f235224eed5592d6a3" Mar 18 09:32:05 crc kubenswrapper[4778]: I0318 09:32:05.465579 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563772-tcj6t" Mar 18 09:32:05 crc kubenswrapper[4778]: I0318 09:32:05.859670 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563766-lqhxm"] Mar 18 09:32:05 crc kubenswrapper[4778]: I0318 09:32:05.868755 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563766-lqhxm"] Mar 18 09:32:06 crc kubenswrapper[4778]: I0318 09:32:06.198978 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa" path="/var/lib/kubelet/pods/ef88b46a-0eac-4885-83f8-1ad9b9c0b9aa/volumes" Mar 18 09:32:07 crc kubenswrapper[4778]: I0318 09:32:07.031470 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-5387-account-create-update-wm5k5"] Mar 18 09:32:07 crc kubenswrapper[4778]: I0318 09:32:07.041717 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-dz4jc"] Mar 18 09:32:07 crc kubenswrapper[4778]: I0318 09:32:07.049731 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-5387-account-create-update-wm5k5"] Mar 18 09:32:07 crc kubenswrapper[4778]: I0318 09:32:07.058526 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-dz4jc"] Mar 18 09:32:08 crc kubenswrapper[4778]: I0318 09:32:08.030987 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-l6vl8"] Mar 18 09:32:08 crc kubenswrapper[4778]: I0318 09:32:08.040055 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-a222-account-create-update-qr82t"] Mar 18 09:32:08 crc kubenswrapper[4778]: I0318 09:32:08.051713 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-a222-account-create-update-qr82t"] Mar 18 09:32:08 crc kubenswrapper[4778]: I0318 09:32:08.061591 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-db-create-l6vl8"] Mar 18 09:32:08 crc kubenswrapper[4778]: I0318 09:32:08.071861 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-0a46-account-create-update-phb5p"] Mar 18 09:32:08 crc kubenswrapper[4778]: I0318 09:32:08.079437 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-0a46-account-create-update-phb5p"] Mar 18 09:32:08 crc kubenswrapper[4778]: I0318 09:32:08.280056 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24412394-390b-461c-9d18-617eba706adc" path="/var/lib/kubelet/pods/24412394-390b-461c-9d18-617eba706adc/volumes" Mar 18 09:32:08 crc kubenswrapper[4778]: I0318 09:32:08.280735 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="445fcacb-d2c9-4892-89b5-4b2b6e54ebc9" path="/var/lib/kubelet/pods/445fcacb-d2c9-4892-89b5-4b2b6e54ebc9/volumes" Mar 18 09:32:08 crc kubenswrapper[4778]: I0318 09:32:08.281281 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51a820a6-6a95-4ab7-a9d8-6649fe45464a" path="/var/lib/kubelet/pods/51a820a6-6a95-4ab7-a9d8-6649fe45464a/volumes" Mar 18 09:32:08 crc kubenswrapper[4778]: I0318 09:32:08.281800 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7ae14ca-efde-42ba-8edf-7cc34dc31036" path="/var/lib/kubelet/pods/c7ae14ca-efde-42ba-8edf-7cc34dc31036/volumes" Mar 18 09:32:08 crc kubenswrapper[4778]: I0318 09:32:08.282839 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9222d9a-6507-4c32-9234-2c1c2b27a11e" path="/var/lib/kubelet/pods/f9222d9a-6507-4c32-9234-2c1c2b27a11e/volumes" Mar 18 09:32:09 crc kubenswrapper[4778]: I0318 09:32:09.046924 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-kj8ww"] Mar 18 09:32:09 crc kubenswrapper[4778]: I0318 09:32:09.062744 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-kj8ww"] Mar 18 09:32:09 crc 
kubenswrapper[4778]: I0318 09:32:09.188959 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:32:09 crc kubenswrapper[4778]: E0318 09:32:09.191475 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:32:10 crc kubenswrapper[4778]: I0318 09:32:10.202146 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be311af4-91f5-417e-971b-c9158576ca97" path="/var/lib/kubelet/pods/be311af4-91f5-417e-971b-c9158576ca97/volumes" Mar 18 09:32:17 crc kubenswrapper[4778]: I0318 09:32:17.582710 4778 generic.go:334] "Generic (PLEG): container finished" podID="2fe04bef-41cb-47c4-8031-141f8809e8cb" containerID="fd288c3256024cadd2bb212c37d37772aadd4cda1a6ce7e57e524f08cb5c87a0" exitCode=0 Mar 18 09:32:17 crc kubenswrapper[4778]: I0318 09:32:17.582811 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" event={"ID":"2fe04bef-41cb-47c4-8031-141f8809e8cb","Type":"ContainerDied","Data":"fd288c3256024cadd2bb212c37d37772aadd4cda1a6ce7e57e524f08cb5c87a0"} Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.062306 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.215697 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fe04bef-41cb-47c4-8031-141f8809e8cb-inventory\") pod \"2fe04bef-41cb-47c4-8031-141f8809e8cb\" (UID: \"2fe04bef-41cb-47c4-8031-141f8809e8cb\") " Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.215972 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fe04bef-41cb-47c4-8031-141f8809e8cb-ssh-key-openstack-edpm-ipam\") pod \"2fe04bef-41cb-47c4-8031-141f8809e8cb\" (UID: \"2fe04bef-41cb-47c4-8031-141f8809e8cb\") " Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.216045 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlmz6\" (UniqueName: \"kubernetes.io/projected/2fe04bef-41cb-47c4-8031-141f8809e8cb-kube-api-access-wlmz6\") pod \"2fe04bef-41cb-47c4-8031-141f8809e8cb\" (UID: \"2fe04bef-41cb-47c4-8031-141f8809e8cb\") " Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.240875 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fe04bef-41cb-47c4-8031-141f8809e8cb-kube-api-access-wlmz6" (OuterVolumeSpecName: "kube-api-access-wlmz6") pod "2fe04bef-41cb-47c4-8031-141f8809e8cb" (UID: "2fe04bef-41cb-47c4-8031-141f8809e8cb"). InnerVolumeSpecName "kube-api-access-wlmz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.247404 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fe04bef-41cb-47c4-8031-141f8809e8cb-inventory" (OuterVolumeSpecName: "inventory") pod "2fe04bef-41cb-47c4-8031-141f8809e8cb" (UID: "2fe04bef-41cb-47c4-8031-141f8809e8cb"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.267345 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fe04bef-41cb-47c4-8031-141f8809e8cb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2fe04bef-41cb-47c4-8031-141f8809e8cb" (UID: "2fe04bef-41cb-47c4-8031-141f8809e8cb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.318950 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fe04bef-41cb-47c4-8031-141f8809e8cb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.319010 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlmz6\" (UniqueName: \"kubernetes.io/projected/2fe04bef-41cb-47c4-8031-141f8809e8cb-kube-api-access-wlmz6\") on node \"crc\" DevicePath \"\"" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.319032 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fe04bef-41cb-47c4-8031-141f8809e8cb-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.608421 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" event={"ID":"2fe04bef-41cb-47c4-8031-141f8809e8cb","Type":"ContainerDied","Data":"448fd951e8632e6cb54458ae0feebd671c32acd586b867a21adf3d37b278a94c"} Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.608483 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.608509 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="448fd951e8632e6cb54458ae0feebd671c32acd586b867a21adf3d37b278a94c" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.692029 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c"] Mar 18 09:32:19 crc kubenswrapper[4778]: E0318 09:32:19.692514 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15232b66-3433-4405-9feb-79055e892b3d" containerName="oc" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.692543 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="15232b66-3433-4405-9feb-79055e892b3d" containerName="oc" Mar 18 09:32:19 crc kubenswrapper[4778]: E0318 09:32:19.692582 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fe04bef-41cb-47c4-8031-141f8809e8cb" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.692597 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fe04bef-41cb-47c4-8031-141f8809e8cb" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.692892 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fe04bef-41cb-47c4-8031-141f8809e8cb" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.692940 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="15232b66-3433-4405-9feb-79055e892b3d" containerName="oc" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.693651 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.695523 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.696021 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.696841 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.700253 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.713477 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c"] Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.837545 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c\" (UID: \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.837595 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdgb8\" (UniqueName: \"kubernetes.io/projected/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-kube-api-access-vdgb8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c\" (UID: \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 
09:32:19.837632 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c\" (UID: \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.939887 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c\" (UID: \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.939983 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdgb8\" (UniqueName: \"kubernetes.io/projected/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-kube-api-access-vdgb8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c\" (UID: \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.940057 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c\" (UID: \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.944054 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c\" (UID: \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.944218 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c\" (UID: \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:19 crc kubenswrapper[4778]: I0318 09:32:19.956850 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdgb8\" (UniqueName: \"kubernetes.io/projected/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-kube-api-access-vdgb8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c\" (UID: \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:20 crc kubenswrapper[4778]: I0318 09:32:20.009250 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:20 crc kubenswrapper[4778]: I0318 09:32:20.557189 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c"] Mar 18 09:32:20 crc kubenswrapper[4778]: I0318 09:32:20.559534 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 09:32:20 crc kubenswrapper[4778]: I0318 09:32:20.618676 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" event={"ID":"8e3f07f1-8381-48a9-8ebb-9cd3a821783f","Type":"ContainerStarted","Data":"38f16aa0f38a4d4d585e71d4489ff45542b89cbc52957dcc955eb4dbf7f944b1"} Mar 18 09:32:21 crc kubenswrapper[4778]: I0318 09:32:21.626671 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" event={"ID":"8e3f07f1-8381-48a9-8ebb-9cd3a821783f","Type":"ContainerStarted","Data":"624102bc1fa0c3850d7e6900b631b277f6bac2b359e2ccf5e40af6d9d87d6742"} Mar 18 09:32:21 crc kubenswrapper[4778]: I0318 09:32:21.657637 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" podStartSLOduration=2.213186226 podStartE2EDuration="2.657612309s" podCreationTimestamp="2026-03-18 09:32:19 +0000 UTC" firstStartedPulling="2026-03-18 09:32:20.559130028 +0000 UTC m=+1807.133874878" lastFinishedPulling="2026-03-18 09:32:21.003556091 +0000 UTC m=+1807.578300961" observedRunningTime="2026-03-18 09:32:21.638654727 +0000 UTC m=+1808.213399617" watchObservedRunningTime="2026-03-18 09:32:21.657612309 +0000 UTC m=+1808.232357159" Mar 18 09:32:23 crc kubenswrapper[4778]: I0318 09:32:23.187322 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:32:23 
crc kubenswrapper[4778]: E0318 09:32:23.187680 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:32:25 crc kubenswrapper[4778]: I0318 09:32:25.676692 4778 generic.go:334] "Generic (PLEG): container finished" podID="8e3f07f1-8381-48a9-8ebb-9cd3a821783f" containerID="624102bc1fa0c3850d7e6900b631b277f6bac2b359e2ccf5e40af6d9d87d6742" exitCode=0 Mar 18 09:32:25 crc kubenswrapper[4778]: I0318 09:32:25.676880 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" event={"ID":"8e3f07f1-8381-48a9-8ebb-9cd3a821783f","Type":"ContainerDied","Data":"624102bc1fa0c3850d7e6900b631b277f6bac2b359e2ccf5e40af6d9d87d6742"} Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.257602 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.381915 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-ssh-key-openstack-edpm-ipam\") pod \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\" (UID: \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\") " Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.382001 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-inventory\") pod \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\" (UID: \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\") " Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.382034 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdgb8\" (UniqueName: \"kubernetes.io/projected/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-kube-api-access-vdgb8\") pod \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\" (UID: \"8e3f07f1-8381-48a9-8ebb-9cd3a821783f\") " Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.389223 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-kube-api-access-vdgb8" (OuterVolumeSpecName: "kube-api-access-vdgb8") pod "8e3f07f1-8381-48a9-8ebb-9cd3a821783f" (UID: "8e3f07f1-8381-48a9-8ebb-9cd3a821783f"). InnerVolumeSpecName "kube-api-access-vdgb8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.418029 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8e3f07f1-8381-48a9-8ebb-9cd3a821783f" (UID: "8e3f07f1-8381-48a9-8ebb-9cd3a821783f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.420614 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-inventory" (OuterVolumeSpecName: "inventory") pod "8e3f07f1-8381-48a9-8ebb-9cd3a821783f" (UID: "8e3f07f1-8381-48a9-8ebb-9cd3a821783f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.484884 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.484921 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.484933 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdgb8\" (UniqueName: \"kubernetes.io/projected/8e3f07f1-8381-48a9-8ebb-9cd3a821783f-kube-api-access-vdgb8\") on node \"crc\" DevicePath \"\"" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.698778 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" 
event={"ID":"8e3f07f1-8381-48a9-8ebb-9cd3a821783f","Type":"ContainerDied","Data":"38f16aa0f38a4d4d585e71d4489ff45542b89cbc52957dcc955eb4dbf7f944b1"} Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.699122 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38f16aa0f38a4d4d585e71d4489ff45542b89cbc52957dcc955eb4dbf7f944b1" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.698938 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.841893 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk"] Mar 18 09:32:27 crc kubenswrapper[4778]: E0318 09:32:27.842346 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e3f07f1-8381-48a9-8ebb-9cd3a821783f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.842369 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e3f07f1-8381-48a9-8ebb-9cd3a821783f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.842586 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e3f07f1-8381-48a9-8ebb-9cd3a821783f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.843324 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.847364 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.847611 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.847972 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.849471 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.862078 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk"] Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.893275 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xhchk\" (UID: \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.893510 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xhchk\" (UID: \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.893588 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsjp6\" (UniqueName: \"kubernetes.io/projected/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-kube-api-access-vsjp6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xhchk\" (UID: \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.994795 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xhchk\" (UID: \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.995075 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xhchk\" (UID: \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.995185 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsjp6\" (UniqueName: \"kubernetes.io/projected/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-kube-api-access-vsjp6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xhchk\" (UID: \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:32:27 crc kubenswrapper[4778]: I0318 09:32:27.998764 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xhchk\" (UID: \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:32:28 crc kubenswrapper[4778]: I0318 09:32:28.000556 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xhchk\" (UID: \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:32:28 crc kubenswrapper[4778]: I0318 09:32:28.023523 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsjp6\" (UniqueName: \"kubernetes.io/projected/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-kube-api-access-vsjp6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xhchk\" (UID: \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:32:28 crc kubenswrapper[4778]: I0318 09:32:28.164053 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:32:28 crc kubenswrapper[4778]: I0318 09:32:28.758701 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk"] Mar 18 09:32:29 crc kubenswrapper[4778]: I0318 09:32:29.252361 4778 scope.go:117] "RemoveContainer" containerID="70fd7ebf08e80da75227c830c21c112cbd85c345b44bad8cd8c81cd3f4b7fd7e" Mar 18 09:32:29 crc kubenswrapper[4778]: I0318 09:32:29.275895 4778 scope.go:117] "RemoveContainer" containerID="c677c641e634b2b60d1bae546bbf8e8d9cc5553eb522dd4d67a26e72bf3f0752" Mar 18 09:32:29 crc kubenswrapper[4778]: I0318 09:32:29.329665 4778 scope.go:117] "RemoveContainer" containerID="2d1134d737bb1ad4d1096a8735192a53e75e70709be8071f894f8def68f8db65" Mar 18 09:32:29 crc kubenswrapper[4778]: I0318 09:32:29.449695 4778 scope.go:117] "RemoveContainer" containerID="9b6fa295a9bfec83f890b1dd7210afd8d93f1d1f4da240bfbc29bf8af750edd0" Mar 18 09:32:29 crc kubenswrapper[4778]: I0318 09:32:29.512569 4778 scope.go:117] "RemoveContainer" containerID="397a643cfe9ac0a1d5786dfe10182c0fd656474f01a6f126523a52404ea544a7" Mar 18 09:32:29 crc kubenswrapper[4778]: I0318 09:32:29.547469 4778 scope.go:117] "RemoveContainer" containerID="c24773f49ad71f17b93d5ac7609065bb82dac185ae78530f1dcf0ecca87ade20" Mar 18 09:32:29 crc kubenswrapper[4778]: I0318 09:32:29.584412 4778 scope.go:117] "RemoveContainer" containerID="39a152a6bb8ed07675c14ece0ad21851da7a9e9a103afe2312ea0254cd99b29c" Mar 18 09:32:29 crc kubenswrapper[4778]: I0318 09:32:29.727172 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" event={"ID":"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803","Type":"ContainerStarted","Data":"9d9ac3d0a2c7513d5a45e8e6d19a1411862166adf069327b3e174ecc2c3c3a28"} Mar 18 09:32:29 crc kubenswrapper[4778]: I0318 09:32:29.727685 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" event={"ID":"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803","Type":"ContainerStarted","Data":"80e8fcc25351430440263b6d5eaa255eb52e520388064f88dde7e70815ffe72a"} Mar 18 09:32:29 crc kubenswrapper[4778]: I0318 09:32:29.755092 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" podStartSLOduration=2.178081733 podStartE2EDuration="2.755070199s" podCreationTimestamp="2026-03-18 09:32:27 +0000 UTC" firstStartedPulling="2026-03-18 09:32:28.766137988 +0000 UTC m=+1815.340882818" lastFinishedPulling="2026-03-18 09:32:29.343126434 +0000 UTC m=+1815.917871284" observedRunningTime="2026-03-18 09:32:29.739888508 +0000 UTC m=+1816.314633358" watchObservedRunningTime="2026-03-18 09:32:29.755070199 +0000 UTC m=+1816.329815049" Mar 18 09:32:32 crc kubenswrapper[4778]: I0318 09:32:32.056335 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-zrjls"] Mar 18 09:32:32 crc kubenswrapper[4778]: I0318 09:32:32.093737 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-zrjls"] Mar 18 09:32:32 crc kubenswrapper[4778]: I0318 09:32:32.199339 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8560ebac-334f-4332-b324-cdb297a94b1a" path="/var/lib/kubelet/pods/8560ebac-334f-4332-b324-cdb297a94b1a/volumes" Mar 18 09:32:37 crc kubenswrapper[4778]: I0318 09:32:37.038055 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-b66ph"] Mar 18 09:32:37 crc kubenswrapper[4778]: I0318 09:32:37.050132 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-b66ph"] Mar 18 09:32:37 crc kubenswrapper[4778]: I0318 09:32:37.188093 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:32:37 crc kubenswrapper[4778]: E0318 
09:32:37.188516 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:32:38 crc kubenswrapper[4778]: I0318 09:32:38.201370 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dadb643-21f7-497a-992f-41ab80c704c5" path="/var/lib/kubelet/pods/5dadb643-21f7-497a-992f-41ab80c704c5/volumes" Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.060547 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-2cxtn"] Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.072036 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-sz5dt"] Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.082783 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b89b-account-create-update-ff8z8"] Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.091155 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-q979b"] Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.098001 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-4e8f-account-create-update-ztvnt"] Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.104605 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6980-account-create-update-8lctt"] Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.111408 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-2cxtn"] Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.117932 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-db-create-sz5dt"] Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.125507 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-4e8f-account-create-update-ztvnt"] Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.132178 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-b89b-account-create-update-ff8z8"] Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.139424 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-6980-account-create-update-8lctt"] Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.147254 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-q979b"] Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.198004 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="320c5adc-a7d8-47a3-893b-7614c755446d" path="/var/lib/kubelet/pods/320c5adc-a7d8-47a3-893b-7614c755446d/volumes" Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.198756 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e" path="/var/lib/kubelet/pods/35ab62b8-3275-4b3b-b7a4-bfeb14f0e74e/volumes" Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.199410 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60fea5d6-a85d-40e3-81ef-1d499ba2ebf7" path="/var/lib/kubelet/pods/60fea5d6-a85d-40e3-81ef-1d499ba2ebf7/volumes" Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.199959 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cf66d17-48b6-4629-ae0c-e270afa0c88a" path="/var/lib/kubelet/pods/7cf66d17-48b6-4629-ae0c-e270afa0c88a/volumes" Mar 18 09:32:48 crc kubenswrapper[4778]: I0318 09:32:48.201032 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9719662a-4248-4c3c-860b-1a9e6547876b" path="/var/lib/kubelet/pods/9719662a-4248-4c3c-860b-1a9e6547876b/volumes" Mar 18 09:32:48 crc 
kubenswrapper[4778]: I0318 09:32:48.201626 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dca6e4b2-4722-4a45-b577-33f3c5090fc3" path="/var/lib/kubelet/pods/dca6e4b2-4722-4a45-b577-33f3c5090fc3/volumes" Mar 18 09:32:50 crc kubenswrapper[4778]: I0318 09:32:50.187505 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:32:50 crc kubenswrapper[4778]: E0318 09:32:50.188063 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:32:53 crc kubenswrapper[4778]: I0318 09:32:53.039661 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-29tr5"] Mar 18 09:32:53 crc kubenswrapper[4778]: I0318 09:32:53.048902 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-29tr5"] Mar 18 09:32:54 crc kubenswrapper[4778]: I0318 09:32:54.198845 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb6efa68-d15c-4d69-bd52-853a7cef8299" path="/var/lib/kubelet/pods/bb6efa68-d15c-4d69-bd52-853a7cef8299/volumes" Mar 18 09:33:03 crc kubenswrapper[4778]: I0318 09:33:03.187694 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:33:03 crc kubenswrapper[4778]: E0318 09:33:03.188513 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:33:08 crc kubenswrapper[4778]: I0318 09:33:08.156900 4778 generic.go:334] "Generic (PLEG): container finished" podID="9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803" containerID="9d9ac3d0a2c7513d5a45e8e6d19a1411862166adf069327b3e174ecc2c3c3a28" exitCode=0 Mar 18 09:33:08 crc kubenswrapper[4778]: I0318 09:33:08.156995 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" event={"ID":"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803","Type":"ContainerDied","Data":"9d9ac3d0a2c7513d5a45e8e6d19a1411862166adf069327b3e174ecc2c3c3a28"} Mar 18 09:33:09 crc kubenswrapper[4778]: I0318 09:33:09.576091 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:33:09 crc kubenswrapper[4778]: I0318 09:33:09.651409 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-ssh-key-openstack-edpm-ipam\") pod \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\" (UID: \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\") " Mar 18 09:33:09 crc kubenswrapper[4778]: I0318 09:33:09.676709 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803" (UID: "9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:33:09 crc kubenswrapper[4778]: I0318 09:33:09.752937 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-inventory\") pod \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\" (UID: \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\") " Mar 18 09:33:09 crc kubenswrapper[4778]: I0318 09:33:09.753120 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsjp6\" (UniqueName: \"kubernetes.io/projected/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-kube-api-access-vsjp6\") pod \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\" (UID: \"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803\") " Mar 18 09:33:09 crc kubenswrapper[4778]: I0318 09:33:09.753523 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:33:09 crc kubenswrapper[4778]: I0318 09:33:09.758060 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-kube-api-access-vsjp6" (OuterVolumeSpecName: "kube-api-access-vsjp6") pod "9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803" (UID: "9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803"). InnerVolumeSpecName "kube-api-access-vsjp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:33:09 crc kubenswrapper[4778]: I0318 09:33:09.778491 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-inventory" (OuterVolumeSpecName: "inventory") pod "9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803" (UID: "9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:33:09 crc kubenswrapper[4778]: I0318 09:33:09.855585 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsjp6\" (UniqueName: \"kubernetes.io/projected/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-kube-api-access-vsjp6\") on node \"crc\" DevicePath \"\"" Mar 18 09:33:09 crc kubenswrapper[4778]: I0318 09:33:09.855618 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.184160 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" event={"ID":"9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803","Type":"ContainerDied","Data":"80e8fcc25351430440263b6d5eaa255eb52e520388064f88dde7e70815ffe72a"} Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.184272 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80e8fcc25351430440263b6d5eaa255eb52e520388064f88dde7e70815ffe72a" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.184311 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.280327 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf"] Mar 18 09:33:10 crc kubenswrapper[4778]: E0318 09:33:10.280835 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.280850 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.281030 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.281564 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.284439 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.284970 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.285458 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.288986 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.300910 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf"] Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.365591 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b33de03b-23ec-40c0-b309-0dd2024caf71-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf\" (UID: \"b33de03b-23ec-40c0-b309-0dd2024caf71\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.365670 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b33de03b-23ec-40c0-b309-0dd2024caf71-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf\" (UID: \"b33de03b-23ec-40c0-b309-0dd2024caf71\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.365871 
4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwvbs\" (UniqueName: \"kubernetes.io/projected/b33de03b-23ec-40c0-b309-0dd2024caf71-kube-api-access-bwvbs\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf\" (UID: \"b33de03b-23ec-40c0-b309-0dd2024caf71\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.467371 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b33de03b-23ec-40c0-b309-0dd2024caf71-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf\" (UID: \"b33de03b-23ec-40c0-b309-0dd2024caf71\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.467598 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwvbs\" (UniqueName: \"kubernetes.io/projected/b33de03b-23ec-40c0-b309-0dd2024caf71-kube-api-access-bwvbs\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf\" (UID: \"b33de03b-23ec-40c0-b309-0dd2024caf71\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.467657 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b33de03b-23ec-40c0-b309-0dd2024caf71-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf\" (UID: \"b33de03b-23ec-40c0-b309-0dd2024caf71\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.473410 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/b33de03b-23ec-40c0-b309-0dd2024caf71-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf\" (UID: \"b33de03b-23ec-40c0-b309-0dd2024caf71\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.474143 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b33de03b-23ec-40c0-b309-0dd2024caf71-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf\" (UID: \"b33de03b-23ec-40c0-b309-0dd2024caf71\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.488636 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwvbs\" (UniqueName: \"kubernetes.io/projected/b33de03b-23ec-40c0-b309-0dd2024caf71-kube-api-access-bwvbs\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf\" (UID: \"b33de03b-23ec-40c0-b309-0dd2024caf71\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:10 crc kubenswrapper[4778]: I0318 09:33:10.648778 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:11 crc kubenswrapper[4778]: I0318 09:33:11.002337 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf"] Mar 18 09:33:11 crc kubenswrapper[4778]: I0318 09:33:11.194191 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" event={"ID":"b33de03b-23ec-40c0-b309-0dd2024caf71","Type":"ContainerStarted","Data":"440d34e0cd124a64005a33484aec00463b1a2e180f1850432f84d55112a37b77"} Mar 18 09:33:12 crc kubenswrapper[4778]: I0318 09:33:12.205976 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" event={"ID":"b33de03b-23ec-40c0-b309-0dd2024caf71","Type":"ContainerStarted","Data":"fff08d52bb8eb70989a438cee10bfa00abf8f2f0d3255a5a38ba7fa52ac6fd7d"} Mar 18 09:33:12 crc kubenswrapper[4778]: I0318 09:33:12.232320 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" podStartSLOduration=1.7412338269999998 podStartE2EDuration="2.232300401s" podCreationTimestamp="2026-03-18 09:33:10 +0000 UTC" firstStartedPulling="2026-03-18 09:33:11.007530707 +0000 UTC m=+1857.582275547" lastFinishedPulling="2026-03-18 09:33:11.498597231 +0000 UTC m=+1858.073342121" observedRunningTime="2026-03-18 09:33:12.231578962 +0000 UTC m=+1858.806323812" watchObservedRunningTime="2026-03-18 09:33:12.232300401 +0000 UTC m=+1858.807045251" Mar 18 09:33:16 crc kubenswrapper[4778]: I0318 09:33:16.187754 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:33:16 crc kubenswrapper[4778]: E0318 09:33:16.188646 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:33:16 crc kubenswrapper[4778]: I0318 09:33:16.260500 4778 generic.go:334] "Generic (PLEG): container finished" podID="b33de03b-23ec-40c0-b309-0dd2024caf71" containerID="fff08d52bb8eb70989a438cee10bfa00abf8f2f0d3255a5a38ba7fa52ac6fd7d" exitCode=0 Mar 18 09:33:16 crc kubenswrapper[4778]: I0318 09:33:16.260560 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" event={"ID":"b33de03b-23ec-40c0-b309-0dd2024caf71","Type":"ContainerDied","Data":"fff08d52bb8eb70989a438cee10bfa00abf8f2f0d3255a5a38ba7fa52ac6fd7d"} Mar 18 09:33:17 crc kubenswrapper[4778]: I0318 09:33:17.781541 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:17 crc kubenswrapper[4778]: I0318 09:33:17.947437 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b33de03b-23ec-40c0-b309-0dd2024caf71-inventory\") pod \"b33de03b-23ec-40c0-b309-0dd2024caf71\" (UID: \"b33de03b-23ec-40c0-b309-0dd2024caf71\") " Mar 18 09:33:17 crc kubenswrapper[4778]: I0318 09:33:17.947611 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwvbs\" (UniqueName: \"kubernetes.io/projected/b33de03b-23ec-40c0-b309-0dd2024caf71-kube-api-access-bwvbs\") pod \"b33de03b-23ec-40c0-b309-0dd2024caf71\" (UID: \"b33de03b-23ec-40c0-b309-0dd2024caf71\") " Mar 18 09:33:17 crc kubenswrapper[4778]: I0318 09:33:17.947674 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b33de03b-23ec-40c0-b309-0dd2024caf71-ssh-key-openstack-edpm-ipam\") pod \"b33de03b-23ec-40c0-b309-0dd2024caf71\" (UID: \"b33de03b-23ec-40c0-b309-0dd2024caf71\") " Mar 18 09:33:17 crc kubenswrapper[4778]: I0318 09:33:17.954617 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b33de03b-23ec-40c0-b309-0dd2024caf71-kube-api-access-bwvbs" (OuterVolumeSpecName: "kube-api-access-bwvbs") pod "b33de03b-23ec-40c0-b309-0dd2024caf71" (UID: "b33de03b-23ec-40c0-b309-0dd2024caf71"). InnerVolumeSpecName "kube-api-access-bwvbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:33:17 crc kubenswrapper[4778]: I0318 09:33:17.988506 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b33de03b-23ec-40c0-b309-0dd2024caf71-inventory" (OuterVolumeSpecName: "inventory") pod "b33de03b-23ec-40c0-b309-0dd2024caf71" (UID: "b33de03b-23ec-40c0-b309-0dd2024caf71"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:33:17 crc kubenswrapper[4778]: I0318 09:33:17.995918 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b33de03b-23ec-40c0-b309-0dd2024caf71-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b33de03b-23ec-40c0-b309-0dd2024caf71" (UID: "b33de03b-23ec-40c0-b309-0dd2024caf71"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.050238 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b33de03b-23ec-40c0-b309-0dd2024caf71-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.050312 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwvbs\" (UniqueName: \"kubernetes.io/projected/b33de03b-23ec-40c0-b309-0dd2024caf71-kube-api-access-bwvbs\") on node \"crc\" DevicePath \"\"" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.050337 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b33de03b-23ec-40c0-b309-0dd2024caf71-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.281289 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" event={"ID":"b33de03b-23ec-40c0-b309-0dd2024caf71","Type":"ContainerDied","Data":"440d34e0cd124a64005a33484aec00463b1a2e180f1850432f84d55112a37b77"} Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.281334 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="440d34e0cd124a64005a33484aec00463b1a2e180f1850432f84d55112a37b77" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 
09:33:18.281384 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.381063 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b"] Mar 18 09:33:18 crc kubenswrapper[4778]: E0318 09:33:18.381616 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b33de03b-23ec-40c0-b309-0dd2024caf71" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.381634 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b33de03b-23ec-40c0-b309-0dd2024caf71" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.381831 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b33de03b-23ec-40c0-b309-0dd2024caf71" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.382510 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.385347 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.385495 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.385577 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.387302 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.398037 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b"] Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.467809 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b\" (UID: \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.467865 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b\" (UID: \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.467910 
4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z7p2\" (UniqueName: \"kubernetes.io/projected/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-kube-api-access-2z7p2\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b\" (UID: \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.568878 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b\" (UID: \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.569149 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b\" (UID: \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.569234 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z7p2\" (UniqueName: \"kubernetes.io/projected/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-kube-api-access-2z7p2\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b\" (UID: \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.573413 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b\" (UID: \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.575996 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b\" (UID: \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.593442 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z7p2\" (UniqueName: \"kubernetes.io/projected/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-kube-api-access-2z7p2\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b\" (UID: \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" Mar 18 09:33:18 crc kubenswrapper[4778]: I0318 09:33:18.708969 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b"
Mar 18 09:33:19 crc kubenswrapper[4778]: I0318 09:33:19.293403 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b"]
Mar 18 09:33:20 crc kubenswrapper[4778]: I0318 09:33:20.303727 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" event={"ID":"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1","Type":"ContainerStarted","Data":"e73cd1a7f2cc415ed8844ab76dbcafbf6d6af8f6453df69ed69f66ec8f314dde"}
Mar 18 09:33:20 crc kubenswrapper[4778]: I0318 09:33:20.304054 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" event={"ID":"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1","Type":"ContainerStarted","Data":"0ecf314c0ee29bbd7c8f5498678c43576aa8cd1ac998b8bfc3d0c895c7d8e42f"}
Mar 18 09:33:20 crc kubenswrapper[4778]: I0318 09:33:20.320861 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" podStartSLOduration=1.842169702 podStartE2EDuration="2.32081191s" podCreationTimestamp="2026-03-18 09:33:18 +0000 UTC" firstStartedPulling="2026-03-18 09:33:19.306971656 +0000 UTC m=+1865.881716496" lastFinishedPulling="2026-03-18 09:33:19.785613824 +0000 UTC m=+1866.360358704" observedRunningTime="2026-03-18 09:33:20.318731253 +0000 UTC m=+1866.893476093" watchObservedRunningTime="2026-03-18 09:33:20.32081191 +0000 UTC m=+1866.895556750"
Mar 18 09:33:23 crc kubenswrapper[4778]: I0318 09:33:23.053818 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-zbghp"]
Mar 18 09:33:23 crc kubenswrapper[4778]: I0318 09:33:23.063755 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-zbghp"]
Mar 18 09:33:24 crc kubenswrapper[4778]: I0318 09:33:24.204096 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d42905f-c189-4021-834d-f2a81dae5a4a" path="/var/lib/kubelet/pods/4d42905f-c189-4021-834d-f2a81dae5a4a/volumes"
Mar 18 09:33:29 crc kubenswrapper[4778]: I0318 09:33:29.187338 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492"
Mar 18 09:33:29 crc kubenswrapper[4778]: E0318 09:33:29.188223 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 09:33:29 crc kubenswrapper[4778]: I0318 09:33:29.730652 4778 scope.go:117] "RemoveContainer" containerID="70d53574867291895895df87d2a68bed084a005ff2e35622dba06f6dac00a1ee"
Mar 18 09:33:29 crc kubenswrapper[4778]: I0318 09:33:29.763809 4778 scope.go:117] "RemoveContainer" containerID="8485de1959de5e473a1a0282e19bec9c8061e5419357cbeb799b7cc895e3b146"
Mar 18 09:33:29 crc kubenswrapper[4778]: I0318 09:33:29.817378 4778 scope.go:117] "RemoveContainer" containerID="76d9700b7eab0fc318bf79deafd901e277918a900858f790ff6f7d08ee5e2133"
Mar 18 09:33:29 crc kubenswrapper[4778]: I0318 09:33:29.863330 4778 scope.go:117] "RemoveContainer" containerID="abc212acc9fea22cd31581d6e2bb923603370c1ccdef0851c69537c07eedf089"
Mar 18 09:33:29 crc kubenswrapper[4778]: I0318 09:33:29.917921 4778 scope.go:117] "RemoveContainer" containerID="aa018cf8a109c6b5750c2118b0eb74eb108759f278376c856e84638cf2d31164"
Mar 18 09:33:29 crc kubenswrapper[4778]: I0318 09:33:29.936474 4778 scope.go:117] "RemoveContainer" containerID="61d521ae036849914a4701bb867f10a55ba71a80cea0eab40620c4a6aa10638d"
Mar 18 09:33:29 crc kubenswrapper[4778]: I0318 09:33:29.974134 4778 scope.go:117] "RemoveContainer" containerID="0cbb671a57344b775f4ff6d3749d585e5e115edb6d8d987453203f44ff882ff0"
Mar 18 09:33:30 crc kubenswrapper[4778]: I0318 09:33:30.012056 4778 scope.go:117] "RemoveContainer" containerID="8c18edac4f9dbd5b2d5ca7397e82de877b46a7b70e08421fff14ad85201dac7d"
Mar 18 09:33:30 crc kubenswrapper[4778]: I0318 09:33:30.046851 4778 scope.go:117] "RemoveContainer" containerID="c1c9a9ac26d842e9f380c2bc10d90467713b713ce0c3ac97f08ea834682773ee"
Mar 18 09:33:30 crc kubenswrapper[4778]: I0318 09:33:30.067665 4778 scope.go:117] "RemoveContainer" containerID="1b70618e3e5fc20f170a550bf06d195893c5a61a58727ad242d6881bbcef4e7a"
Mar 18 09:33:37 crc kubenswrapper[4778]: I0318 09:33:37.044217 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-98prp"]
Mar 18 09:33:37 crc kubenswrapper[4778]: I0318 09:33:37.054429 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-2p9jg"]
Mar 18 09:33:37 crc kubenswrapper[4778]: I0318 09:33:37.065144 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-5drhw"]
Mar 18 09:33:37 crc kubenswrapper[4778]: I0318 09:33:37.075260 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-98prp"]
Mar 18 09:33:37 crc kubenswrapper[4778]: I0318 09:33:37.085682 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-2p9jg"]
Mar 18 09:33:37 crc kubenswrapper[4778]: I0318 09:33:37.097222 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-5drhw"]
Mar 18 09:33:38 crc kubenswrapper[4778]: I0318 09:33:38.197495 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fb26926-fc81-4024-a0fa-2363d8703d72" path="/var/lib/kubelet/pods/0fb26926-fc81-4024-a0fa-2363d8703d72/volumes"
Mar 18 09:33:38 crc kubenswrapper[4778]: I0318 09:33:38.199714 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20eafe8e-c0b9-4463-bc12-8c0cd0359968" path="/var/lib/kubelet/pods/20eafe8e-c0b9-4463-bc12-8c0cd0359968/volumes"
Mar 18 09:33:38 crc kubenswrapper[4778]: I0318 09:33:38.200486 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4135fc20-df28-4f8d-b244-aedd5ed57cc2" path="/var/lib/kubelet/pods/4135fc20-df28-4f8d-b244-aedd5ed57cc2/volumes"
Mar 18 09:33:40 crc kubenswrapper[4778]: I0318 09:33:40.187221 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492"
Mar 18 09:33:40 crc kubenswrapper[4778]: E0318 09:33:40.187958 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 09:33:51 crc kubenswrapper[4778]: I0318 09:33:51.049534 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-jb4ss"]
Mar 18 09:33:51 crc kubenswrapper[4778]: I0318 09:33:51.062163 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-jb4ss"]
Mar 18 09:33:52 crc kubenswrapper[4778]: I0318 09:33:52.205573 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fba399d9-71ac-41c3-912f-32ccc7fc6190" path="/var/lib/kubelet/pods/fba399d9-71ac-41c3-912f-32ccc7fc6190/volumes"
Mar 18 09:33:55 crc kubenswrapper[4778]: I0318 09:33:55.187686 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492"
Mar 18 09:33:55 crc kubenswrapper[4778]: E0318 09:33:55.188718 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 09:34:00 crc kubenswrapper[4778]: I0318 09:34:00.157141 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563774-jn4n5"]
Mar 18 09:34:00 crc kubenswrapper[4778]: I0318 09:34:00.159217 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563774-jn4n5"
Mar 18 09:34:00 crc kubenswrapper[4778]: I0318 09:34:00.162105 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 09:34:00 crc kubenswrapper[4778]: I0318 09:34:00.166107 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6"
Mar 18 09:34:00 crc kubenswrapper[4778]: I0318 09:34:00.166252 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 09:34:00 crc kubenswrapper[4778]: I0318 09:34:00.204344 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563774-jn4n5"]
Mar 18 09:34:00 crc kubenswrapper[4778]: I0318 09:34:00.309701 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-247bb\" (UniqueName: \"kubernetes.io/projected/315606e3-7197-4234-b672-400a86339d27-kube-api-access-247bb\") pod \"auto-csr-approver-29563774-jn4n5\" (UID: \"315606e3-7197-4234-b672-400a86339d27\") " pod="openshift-infra/auto-csr-approver-29563774-jn4n5"
Mar 18 09:34:00 crc kubenswrapper[4778]: I0318 09:34:00.412288 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-247bb\" (UniqueName: \"kubernetes.io/projected/315606e3-7197-4234-b672-400a86339d27-kube-api-access-247bb\") pod \"auto-csr-approver-29563774-jn4n5\" (UID: \"315606e3-7197-4234-b672-400a86339d27\") " pod="openshift-infra/auto-csr-approver-29563774-jn4n5"
Mar 18 09:34:00 crc kubenswrapper[4778]: I0318 09:34:00.433430 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-247bb\" (UniqueName: \"kubernetes.io/projected/315606e3-7197-4234-b672-400a86339d27-kube-api-access-247bb\") pod \"auto-csr-approver-29563774-jn4n5\" (UID: \"315606e3-7197-4234-b672-400a86339d27\") " pod="openshift-infra/auto-csr-approver-29563774-jn4n5"
Mar 18 09:34:00 crc kubenswrapper[4778]: I0318 09:34:00.482799 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563774-jn4n5"
Mar 18 09:34:00 crc kubenswrapper[4778]: I0318 09:34:00.970829 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563774-jn4n5"]
Mar 18 09:34:01 crc kubenswrapper[4778]: I0318 09:34:01.718513 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563774-jn4n5" event={"ID":"315606e3-7197-4234-b672-400a86339d27","Type":"ContainerStarted","Data":"d8212f543f3dcc66d1290434aca3b32ee334c67ba0e61b2406393b9e448aa645"}
Mar 18 09:34:02 crc kubenswrapper[4778]: I0318 09:34:02.735575 4778 generic.go:334] "Generic (PLEG): container finished" podID="315606e3-7197-4234-b672-400a86339d27" containerID="3e7ed49b01f49625749fbe5496f4ace13851a2f40c5fbcd9633d28b842edcbb0" exitCode=0
Mar 18 09:34:02 crc kubenswrapper[4778]: I0318 09:34:02.735697 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563774-jn4n5" event={"ID":"315606e3-7197-4234-b672-400a86339d27","Type":"ContainerDied","Data":"3e7ed49b01f49625749fbe5496f4ace13851a2f40c5fbcd9633d28b842edcbb0"}
Mar 18 09:34:04 crc kubenswrapper[4778]: I0318 09:34:04.129193 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563774-jn4n5"
Mar 18 09:34:04 crc kubenswrapper[4778]: I0318 09:34:04.298784 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-247bb\" (UniqueName: \"kubernetes.io/projected/315606e3-7197-4234-b672-400a86339d27-kube-api-access-247bb\") pod \"315606e3-7197-4234-b672-400a86339d27\" (UID: \"315606e3-7197-4234-b672-400a86339d27\") "
Mar 18 09:34:04 crc kubenswrapper[4778]: I0318 09:34:04.304975 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/315606e3-7197-4234-b672-400a86339d27-kube-api-access-247bb" (OuterVolumeSpecName: "kube-api-access-247bb") pod "315606e3-7197-4234-b672-400a86339d27" (UID: "315606e3-7197-4234-b672-400a86339d27"). InnerVolumeSpecName "kube-api-access-247bb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:34:04 crc kubenswrapper[4778]: I0318 09:34:04.401353 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-247bb\" (UniqueName: \"kubernetes.io/projected/315606e3-7197-4234-b672-400a86339d27-kube-api-access-247bb\") on node \"crc\" DevicePath \"\""
Mar 18 09:34:04 crc kubenswrapper[4778]: I0318 09:34:04.757551 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563774-jn4n5" event={"ID":"315606e3-7197-4234-b672-400a86339d27","Type":"ContainerDied","Data":"d8212f543f3dcc66d1290434aca3b32ee334c67ba0e61b2406393b9e448aa645"}
Mar 18 09:34:04 crc kubenswrapper[4778]: I0318 09:34:04.757607 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8212f543f3dcc66d1290434aca3b32ee334c67ba0e61b2406393b9e448aa645"
Mar 18 09:34:04 crc kubenswrapper[4778]: I0318 09:34:04.757690 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563774-jn4n5"
Mar 18 09:34:05 crc kubenswrapper[4778]: I0318 09:34:05.189370 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563768-4z27c"]
Mar 18 09:34:05 crc kubenswrapper[4778]: I0318 09:34:05.196085 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563768-4z27c"]
Mar 18 09:34:06 crc kubenswrapper[4778]: I0318 09:34:06.202484 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec16e337-91fc-40c5-b3d4-87b5243e5a73" path="/var/lib/kubelet/pods/ec16e337-91fc-40c5-b3d4-87b5243e5a73/volumes"
Mar 18 09:34:06 crc kubenswrapper[4778]: I0318 09:34:06.777578 4778 generic.go:334] "Generic (PLEG): container finished" podID="637a34d6-3284-4d4f-ab9b-70fd0b4d29b1" containerID="e73cd1a7f2cc415ed8844ab76dbcafbf6d6af8f6453df69ed69f66ec8f314dde" exitCode=0
Mar 18 09:34:06 crc kubenswrapper[4778]: I0318 09:34:06.777702 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" event={"ID":"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1","Type":"ContainerDied","Data":"e73cd1a7f2cc415ed8844ab76dbcafbf6d6af8f6453df69ed69f66ec8f314dde"}
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.187325 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492"
Mar 18 09:34:08 crc kubenswrapper[4778]: E0318 09:34:08.188087 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.190610 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b"
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.378935 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-ssh-key-openstack-edpm-ipam\") pod \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\" (UID: \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\") "
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.379327 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-inventory\") pod \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\" (UID: \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\") "
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.379408 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z7p2\" (UniqueName: \"kubernetes.io/projected/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-kube-api-access-2z7p2\") pod \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\" (UID: \"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1\") "
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.386825 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-kube-api-access-2z7p2" (OuterVolumeSpecName: "kube-api-access-2z7p2") pod "637a34d6-3284-4d4f-ab9b-70fd0b4d29b1" (UID: "637a34d6-3284-4d4f-ab9b-70fd0b4d29b1"). InnerVolumeSpecName "kube-api-access-2z7p2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.409828 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-inventory" (OuterVolumeSpecName: "inventory") pod "637a34d6-3284-4d4f-ab9b-70fd0b4d29b1" (UID: "637a34d6-3284-4d4f-ab9b-70fd0b4d29b1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.421468 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "637a34d6-3284-4d4f-ab9b-70fd0b4d29b1" (UID: "637a34d6-3284-4d4f-ab9b-70fd0b4d29b1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.481587 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z7p2\" (UniqueName: \"kubernetes.io/projected/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-kube-api-access-2z7p2\") on node \"crc\" DevicePath \"\""
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.481615 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.481628 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.799707 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b" event={"ID":"637a34d6-3284-4d4f-ab9b-70fd0b4d29b1","Type":"ContainerDied","Data":"0ecf314c0ee29bbd7c8f5498678c43576aa8cd1ac998b8bfc3d0c895c7d8e42f"}
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.799767 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ecf314c0ee29bbd7c8f5498678c43576aa8cd1ac998b8bfc3d0c895c7d8e42f"
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.799787 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b"
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.896465 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-zd2d7"]
Mar 18 09:34:08 crc kubenswrapper[4778]: E0318 09:34:08.897042 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637a34d6-3284-4d4f-ab9b-70fd0b4d29b1" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.897071 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="637a34d6-3284-4d4f-ab9b-70fd0b4d29b1" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 18 09:34:08 crc kubenswrapper[4778]: E0318 09:34:08.897106 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="315606e3-7197-4234-b672-400a86339d27" containerName="oc"
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.897122 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="315606e3-7197-4234-b672-400a86339d27" containerName="oc"
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.897427 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="315606e3-7197-4234-b672-400a86339d27" containerName="oc"
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.897475 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="637a34d6-3284-4d4f-ab9b-70fd0b4d29b1" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.898430 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7"
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.900810 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.901786 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n"
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.902327 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.902809 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.907084 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-zd2d7"]
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.991086 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d85dc0a9-a7b0-4715-bc9d-974ac7657337-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-zd2d7\" (UID: \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\") " pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7"
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.991269 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9662\" (UniqueName: \"kubernetes.io/projected/d85dc0a9-a7b0-4715-bc9d-974ac7657337-kube-api-access-g9662\") pod \"ssh-known-hosts-edpm-deployment-zd2d7\" (UID: \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\") " pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7"
Mar 18 09:34:08 crc kubenswrapper[4778]: I0318 09:34:08.991314 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d85dc0a9-a7b0-4715-bc9d-974ac7657337-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-zd2d7\" (UID: \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\") " pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7"
Mar 18 09:34:09 crc kubenswrapper[4778]: I0318 09:34:09.093033 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9662\" (UniqueName: \"kubernetes.io/projected/d85dc0a9-a7b0-4715-bc9d-974ac7657337-kube-api-access-g9662\") pod \"ssh-known-hosts-edpm-deployment-zd2d7\" (UID: \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\") " pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7"
Mar 18 09:34:09 crc kubenswrapper[4778]: I0318 09:34:09.093089 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d85dc0a9-a7b0-4715-bc9d-974ac7657337-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-zd2d7\" (UID: \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\") " pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7"
Mar 18 09:34:09 crc kubenswrapper[4778]: I0318 09:34:09.093130 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d85dc0a9-a7b0-4715-bc9d-974ac7657337-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-zd2d7\" (UID: \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\") " pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7"
Mar 18 09:34:09 crc kubenswrapper[4778]: I0318 09:34:09.098189 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d85dc0a9-a7b0-4715-bc9d-974ac7657337-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-zd2d7\" (UID: \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\") " pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7"
Mar 18 09:34:09 crc kubenswrapper[4778]: I0318 09:34:09.099290 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d85dc0a9-a7b0-4715-bc9d-974ac7657337-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-zd2d7\" (UID: \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\") " pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7"
Mar 18 09:34:09 crc kubenswrapper[4778]: I0318 09:34:09.120461 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9662\" (UniqueName: \"kubernetes.io/projected/d85dc0a9-a7b0-4715-bc9d-974ac7657337-kube-api-access-g9662\") pod \"ssh-known-hosts-edpm-deployment-zd2d7\" (UID: \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\") " pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7"
Mar 18 09:34:09 crc kubenswrapper[4778]: I0318 09:34:09.225690 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7"
Mar 18 09:34:09 crc kubenswrapper[4778]: I0318 09:34:09.792643 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-zd2d7"]
Mar 18 09:34:09 crc kubenswrapper[4778]: I0318 09:34:09.814146 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7" event={"ID":"d85dc0a9-a7b0-4715-bc9d-974ac7657337","Type":"ContainerStarted","Data":"97cb197e6ba89a6ea3e936d214db02d302c4c2af7d065d843450578396131b2b"}
Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.564695 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-475gr"]
Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.567308 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-475gr"
Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.593550 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-475gr"]
Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.754621 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-utilities\") pod \"certified-operators-475gr\" (UID: \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\") " pod="openshift-marketplace/certified-operators-475gr"
Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.755551 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-catalog-content\") pod \"certified-operators-475gr\" (UID: \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\") " pod="openshift-marketplace/certified-operators-475gr"
Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.755767 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvbfg\" (UniqueName: \"kubernetes.io/projected/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-kube-api-access-nvbfg\") pod \"certified-operators-475gr\" (UID: \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\") " pod="openshift-marketplace/certified-operators-475gr"
Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.828298 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7" event={"ID":"d85dc0a9-a7b0-4715-bc9d-974ac7657337","Type":"ContainerStarted","Data":"40ec8676e1c1ccc56a543cb5a42de5f34fb52e822133f45c8ebee2d755dab39f"}
Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.846270 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7" podStartSLOduration=3.072908139 podStartE2EDuration="3.846250141s" podCreationTimestamp="2026-03-18 09:34:08 +0000 UTC" firstStartedPulling="2026-03-18 09:34:09.794475837 +0000 UTC m=+1916.369220687" lastFinishedPulling="2026-03-18 09:34:10.567817829 +0000 UTC m=+1917.142562689" observedRunningTime="2026-03-18 09:34:11.842659853 +0000 UTC m=+1918.417404693" watchObservedRunningTime="2026-03-18 09:34:11.846250141 +0000 UTC m=+1918.420994991"
Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.857418 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-utilities\") pod \"certified-operators-475gr\" (UID: \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\") " pod="openshift-marketplace/certified-operators-475gr"
Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.857750 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-catalog-content\") pod \"certified-operators-475gr\" (UID: \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\") " pod="openshift-marketplace/certified-operators-475gr"
Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.857961 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvbfg\" (UniqueName: \"kubernetes.io/projected/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-kube-api-access-nvbfg\") pod \"certified-operators-475gr\" (UID: \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\") " pod="openshift-marketplace/certified-operators-475gr"
Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.858694 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-catalog-content\") pod \"certified-operators-475gr\" (UID: \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\") " pod="openshift-marketplace/certified-operators-475gr"
Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.858877 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-utilities\") pod \"certified-operators-475gr\" (UID: \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\") " pod="openshift-marketplace/certified-operators-475gr"
Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.885033 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvbfg\" (UniqueName: \"kubernetes.io/projected/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-kube-api-access-nvbfg\") pod \"certified-operators-475gr\" (UID: \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\") " pod="openshift-marketplace/certified-operators-475gr"
Mar 18 09:34:11 crc kubenswrapper[4778]: I0318 09:34:11.893834 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-475gr"
Mar 18 09:34:12 crc kubenswrapper[4778]: I0318 09:34:12.423491 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-475gr"]
Mar 18 09:34:12 crc kubenswrapper[4778]: W0318 09:34:12.432312 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d3beaac_46a9_4ec0_bfd5_ee225f4bb32e.slice/crio-b4fb4c9857fb657a54cf9e724f2ffd94c53e85dc57ae0bc608d2d9e727aba66f WatchSource:0}: Error finding container b4fb4c9857fb657a54cf9e724f2ffd94c53e85dc57ae0bc608d2d9e727aba66f: Status 404 returned error can't find the container with id b4fb4c9857fb657a54cf9e724f2ffd94c53e85dc57ae0bc608d2d9e727aba66f
Mar 18 09:34:12 crc kubenswrapper[4778]: I0318 09:34:12.841552 4778 generic.go:334] "Generic (PLEG): container finished" podID="6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" containerID="4bac5ae2ef88828781e3d752d0712c190c833bc1c5c36fafcba65e3a7b425e6a" exitCode=0
Mar 18 09:34:12 crc kubenswrapper[4778]: I0318 09:34:12.841674 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-475gr" event={"ID":"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e","Type":"ContainerDied","Data":"4bac5ae2ef88828781e3d752d0712c190c833bc1c5c36fafcba65e3a7b425e6a"}
Mar 18 09:34:12 crc kubenswrapper[4778]: I0318 09:34:12.841904 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-475gr" event={"ID":"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e","Type":"ContainerStarted","Data":"b4fb4c9857fb657a54cf9e724f2ffd94c53e85dc57ae0bc608d2d9e727aba66f"}
Mar 18 09:34:14 crc kubenswrapper[4778]: I0318 09:34:14.868847 4778 generic.go:334] "Generic (PLEG): container finished" podID="6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" containerID="60f6315316fa666e5fb06d88f5b2aedb8a14343e1036e621abc9b1a882b8b80c" exitCode=0
Mar 18 09:34:14 crc kubenswrapper[4778]: I0318 09:34:14.868900 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-475gr" event={"ID":"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e","Type":"ContainerDied","Data":"60f6315316fa666e5fb06d88f5b2aedb8a14343e1036e621abc9b1a882b8b80c"}
Mar 18 09:34:15 crc kubenswrapper[4778]: I0318 09:34:15.884552 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-475gr" event={"ID":"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e","Type":"ContainerStarted","Data":"7ec44f5cc97d7eda5aa101e36380c2c45ff0732b9ec9263200e6e089138517e7"}
Mar 18 09:34:15 crc kubenswrapper[4778]: I0318 09:34:15.908616 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-475gr" podStartSLOduration=2.414798714 podStartE2EDuration="4.908592115s" podCreationTimestamp="2026-03-18 09:34:11 +0000 UTC" firstStartedPulling="2026-03-18 09:34:12.844321922 +0000 UTC m=+1919.419066762" lastFinishedPulling="2026-03-18 09:34:15.338115323 +0000 UTC m=+1921.912860163" observedRunningTime="2026-03-18 09:34:15.900791395 +0000 UTC m=+1922.475536245" watchObservedRunningTime="2026-03-18 09:34:15.908592115 +0000 UTC m=+1922.483336955"
Mar 18 09:34:17 crc kubenswrapper[4778]: I0318 09:34:17.915075 4778 generic.go:334] "Generic (PLEG): container finished" podID="d85dc0a9-a7b0-4715-bc9d-974ac7657337" containerID="40ec8676e1c1ccc56a543cb5a42de5f34fb52e822133f45c8ebee2d755dab39f" exitCode=0
Mar 18 09:34:17 crc kubenswrapper[4778]: I0318 09:34:17.915266 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7" event={"ID":"d85dc0a9-a7b0-4715-bc9d-974ac7657337","Type":"ContainerDied","Data":"40ec8676e1c1ccc56a543cb5a42de5f34fb52e822133f45c8ebee2d755dab39f"}
Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.386205 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7"
Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.418550 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d85dc0a9-a7b0-4715-bc9d-974ac7657337-inventory-0\") pod \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\" (UID: \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\") "
Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.418620 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9662\" (UniqueName: \"kubernetes.io/projected/d85dc0a9-a7b0-4715-bc9d-974ac7657337-kube-api-access-g9662\") pod \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\" (UID: \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\") "
Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.418671 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d85dc0a9-a7b0-4715-bc9d-974ac7657337-ssh-key-openstack-edpm-ipam\") pod \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\" (UID: \"d85dc0a9-a7b0-4715-bc9d-974ac7657337\") "
Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.460664 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d85dc0a9-a7b0-4715-bc9d-974ac7657337-kube-api-access-g9662" (OuterVolumeSpecName: "kube-api-access-g9662") pod "d85dc0a9-a7b0-4715-bc9d-974ac7657337" (UID: "d85dc0a9-a7b0-4715-bc9d-974ac7657337"). InnerVolumeSpecName "kube-api-access-g9662". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.462618 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d85dc0a9-a7b0-4715-bc9d-974ac7657337-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d85dc0a9-a7b0-4715-bc9d-974ac7657337" (UID: "d85dc0a9-a7b0-4715-bc9d-974ac7657337"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.464132 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d85dc0a9-a7b0-4715-bc9d-974ac7657337-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "d85dc0a9-a7b0-4715-bc9d-974ac7657337" (UID: "d85dc0a9-a7b0-4715-bc9d-974ac7657337"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.520190 4778 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d85dc0a9-a7b0-4715-bc9d-974ac7657337-inventory-0\") on node \"crc\" DevicePath \"\""
Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.520243 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9662\" (UniqueName: \"kubernetes.io/projected/d85dc0a9-a7b0-4715-bc9d-974ac7657337-kube-api-access-g9662\") on node \"crc\" DevicePath \"\""
Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.520257 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d85dc0a9-a7b0-4715-bc9d-974ac7657337-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.941023 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7" event={"ID":"d85dc0a9-a7b0-4715-bc9d-974ac7657337","Type":"ContainerDied","Data":"97cb197e6ba89a6ea3e936d214db02d302c4c2af7d065d843450578396131b2b"}
Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.941068 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97cb197e6ba89a6ea3e936d214db02d302c4c2af7d065d843450578396131b2b"
Mar 18 09:34:19 crc kubenswrapper[4778]: I0318 09:34:19.941088 4778 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zd2d7" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.021392 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt"] Mar 18 09:34:20 crc kubenswrapper[4778]: E0318 09:34:20.021815 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d85dc0a9-a7b0-4715-bc9d-974ac7657337" containerName="ssh-known-hosts-edpm-deployment" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.021835 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d85dc0a9-a7b0-4715-bc9d-974ac7657337" containerName="ssh-known-hosts-edpm-deployment" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.022058 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d85dc0a9-a7b0-4715-bc9d-974ac7657337" containerName="ssh-known-hosts-edpm-deployment" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.022768 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.026401 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.026491 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.026868 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.028104 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.039021 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt"] Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.129792 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdcqp\" (UniqueName: \"kubernetes.io/projected/0b7ef620-077d-4a38-90ed-fed05ccba5d2-kube-api-access-vdcqp\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4rmrt\" (UID: \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.129901 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7ef620-077d-4a38-90ed-fed05ccba5d2-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4rmrt\" (UID: \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.129976 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b7ef620-077d-4a38-90ed-fed05ccba5d2-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4rmrt\" (UID: \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.231126 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdcqp\" (UniqueName: \"kubernetes.io/projected/0b7ef620-077d-4a38-90ed-fed05ccba5d2-kube-api-access-vdcqp\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4rmrt\" (UID: \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.231228 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7ef620-077d-4a38-90ed-fed05ccba5d2-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4rmrt\" (UID: \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.231299 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b7ef620-077d-4a38-90ed-fed05ccba5d2-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4rmrt\" (UID: \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.244260 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7ef620-077d-4a38-90ed-fed05ccba5d2-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4rmrt\" (UID: 
\"0b7ef620-077d-4a38-90ed-fed05ccba5d2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.244621 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b7ef620-077d-4a38-90ed-fed05ccba5d2-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4rmrt\" (UID: \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.250228 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdcqp\" (UniqueName: \"kubernetes.io/projected/0b7ef620-077d-4a38-90ed-fed05ccba5d2-kube-api-access-vdcqp\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4rmrt\" (UID: \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.343528 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.924905 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt"] Mar 18 09:34:20 crc kubenswrapper[4778]: I0318 09:34:20.953030 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" event={"ID":"0b7ef620-077d-4a38-90ed-fed05ccba5d2","Type":"ContainerStarted","Data":"8822c7df21f30bb6af9e1f11ae0a03462e98cfd9b7aec393cc5e0a76ed0fa331"} Mar 18 09:34:21 crc kubenswrapper[4778]: I0318 09:34:21.894156 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-475gr" Mar 18 09:34:21 crc kubenswrapper[4778]: I0318 09:34:21.894533 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-475gr" Mar 18 09:34:21 crc kubenswrapper[4778]: I0318 09:34:21.948844 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-475gr" Mar 18 09:34:21 crc kubenswrapper[4778]: I0318 09:34:21.968926 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" event={"ID":"0b7ef620-077d-4a38-90ed-fed05ccba5d2","Type":"ContainerStarted","Data":"16287400dde13e1c43e43b5d82fb77152a697e54c70364b3f589e427eeb0ea66"} Mar 18 09:34:22 crc kubenswrapper[4778]: I0318 09:34:22.015354 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" podStartSLOduration=1.57121584 podStartE2EDuration="2.015336214s" podCreationTimestamp="2026-03-18 09:34:20 +0000 UTC" firstStartedPulling="2026-03-18 09:34:20.937302217 +0000 UTC m=+1927.512047057" lastFinishedPulling="2026-03-18 09:34:21.381422581 +0000 UTC m=+1927.956167431" 
observedRunningTime="2026-03-18 09:34:22.008435138 +0000 UTC m=+1928.583179998" watchObservedRunningTime="2026-03-18 09:34:22.015336214 +0000 UTC m=+1928.590081064" Mar 18 09:34:22 crc kubenswrapper[4778]: I0318 09:34:22.032113 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-475gr" Mar 18 09:34:22 crc kubenswrapper[4778]: I0318 09:34:22.188058 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:34:22 crc kubenswrapper[4778]: E0318 09:34:22.188381 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:34:24 crc kubenswrapper[4778]: I0318 09:34:24.318172 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-475gr"] Mar 18 09:34:24 crc kubenswrapper[4778]: I0318 09:34:24.319108 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-475gr" podUID="6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" containerName="registry-server" containerID="cri-o://7ec44f5cc97d7eda5aa101e36380c2c45ff0732b9ec9263200e6e089138517e7" gracePeriod=2 Mar 18 09:34:25 crc kubenswrapper[4778]: I0318 09:34:25.003594 4778 generic.go:334] "Generic (PLEG): container finished" podID="6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" containerID="7ec44f5cc97d7eda5aa101e36380c2c45ff0732b9ec9263200e6e089138517e7" exitCode=0 Mar 18 09:34:25 crc kubenswrapper[4778]: I0318 09:34:25.003867 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-475gr" 
event={"ID":"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e","Type":"ContainerDied","Data":"7ec44f5cc97d7eda5aa101e36380c2c45ff0732b9ec9263200e6e089138517e7"} Mar 18 09:34:25 crc kubenswrapper[4778]: I0318 09:34:25.271578 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-475gr" Mar 18 09:34:25 crc kubenswrapper[4778]: I0318 09:34:25.340516 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvbfg\" (UniqueName: \"kubernetes.io/projected/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-kube-api-access-nvbfg\") pod \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\" (UID: \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\") " Mar 18 09:34:25 crc kubenswrapper[4778]: I0318 09:34:25.340640 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-utilities\") pod \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\" (UID: \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\") " Mar 18 09:34:25 crc kubenswrapper[4778]: I0318 09:34:25.340669 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-catalog-content\") pod \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\" (UID: \"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e\") " Mar 18 09:34:25 crc kubenswrapper[4778]: I0318 09:34:25.341894 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-utilities" (OuterVolumeSpecName: "utilities") pod "6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" (UID: "6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:34:25 crc kubenswrapper[4778]: I0318 09:34:25.346679 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-kube-api-access-nvbfg" (OuterVolumeSpecName: "kube-api-access-nvbfg") pod "6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" (UID: "6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e"). InnerVolumeSpecName "kube-api-access-nvbfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:34:25 crc kubenswrapper[4778]: I0318 09:34:25.393627 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" (UID: "6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:34:25 crc kubenswrapper[4778]: I0318 09:34:25.441949 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:34:25 crc kubenswrapper[4778]: I0318 09:34:25.441979 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:34:25 crc kubenswrapper[4778]: I0318 09:34:25.441990 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvbfg\" (UniqueName: \"kubernetes.io/projected/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e-kube-api-access-nvbfg\") on node \"crc\" DevicePath \"\"" Mar 18 09:34:26 crc kubenswrapper[4778]: I0318 09:34:26.015970 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-475gr" 
event={"ID":"6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e","Type":"ContainerDied","Data":"b4fb4c9857fb657a54cf9e724f2ffd94c53e85dc57ae0bc608d2d9e727aba66f"} Mar 18 09:34:26 crc kubenswrapper[4778]: I0318 09:34:26.016281 4778 scope.go:117] "RemoveContainer" containerID="7ec44f5cc97d7eda5aa101e36380c2c45ff0732b9ec9263200e6e089138517e7" Mar 18 09:34:26 crc kubenswrapper[4778]: I0318 09:34:26.016040 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-475gr" Mar 18 09:34:26 crc kubenswrapper[4778]: I0318 09:34:26.035160 4778 scope.go:117] "RemoveContainer" containerID="60f6315316fa666e5fb06d88f5b2aedb8a14343e1036e621abc9b1a882b8b80c" Mar 18 09:34:26 crc kubenswrapper[4778]: I0318 09:34:26.055429 4778 scope.go:117] "RemoveContainer" containerID="4bac5ae2ef88828781e3d752d0712c190c833bc1c5c36fafcba65e3a7b425e6a" Mar 18 09:34:26 crc kubenswrapper[4778]: I0318 09:34:26.059161 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-475gr"] Mar 18 09:34:26 crc kubenswrapper[4778]: I0318 09:34:26.071350 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-475gr"] Mar 18 09:34:26 crc kubenswrapper[4778]: I0318 09:34:26.200841 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" path="/var/lib/kubelet/pods/6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e/volumes" Mar 18 09:34:30 crc kubenswrapper[4778]: I0318 09:34:30.057341 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" event={"ID":"0b7ef620-077d-4a38-90ed-fed05ccba5d2","Type":"ContainerDied","Data":"16287400dde13e1c43e43b5d82fb77152a697e54c70364b3f589e427eeb0ea66"} Mar 18 09:34:30 crc kubenswrapper[4778]: I0318 09:34:30.057326 4778 generic.go:334] "Generic (PLEG): container finished" podID="0b7ef620-077d-4a38-90ed-fed05ccba5d2" 
containerID="16287400dde13e1c43e43b5d82fb77152a697e54c70364b3f589e427eeb0ea66" exitCode=0 Mar 18 09:34:30 crc kubenswrapper[4778]: I0318 09:34:30.239285 4778 scope.go:117] "RemoveContainer" containerID="201dd8b3293289bdbf9f29c3749f98499b07694d8d80e9df99ed62c3075ec93f" Mar 18 09:34:30 crc kubenswrapper[4778]: I0318 09:34:30.303490 4778 scope.go:117] "RemoveContainer" containerID="a9c5d9789a0793c932c41c72d2058c14c6dc506dbd6a4bb8ed76c0353ce8bcc2" Mar 18 09:34:30 crc kubenswrapper[4778]: I0318 09:34:30.351839 4778 scope.go:117] "RemoveContainer" containerID="b5325ab7bf5fcc801abe0c67c554c5ab72e440e7503aafe03c45a398c6a12432" Mar 18 09:34:30 crc kubenswrapper[4778]: I0318 09:34:30.441679 4778 scope.go:117] "RemoveContainer" containerID="43cd9dd19c1bcb6dd251052b9f5e2f4fd14b11fc891320649bdbfdb44d9ca171" Mar 18 09:34:30 crc kubenswrapper[4778]: I0318 09:34:30.471598 4778 scope.go:117] "RemoveContainer" containerID="f3f45a8c98da9f26aefaed6f12fcf1ccfeb1c540f04357427bf8f13c5c12ad79" Mar 18 09:34:31 crc kubenswrapper[4778]: I0318 09:34:31.588414 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:31 crc kubenswrapper[4778]: I0318 09:34:31.655972 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdcqp\" (UniqueName: \"kubernetes.io/projected/0b7ef620-077d-4a38-90ed-fed05ccba5d2-kube-api-access-vdcqp\") pod \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\" (UID: \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\") " Mar 18 09:34:31 crc kubenswrapper[4778]: I0318 09:34:31.656127 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7ef620-077d-4a38-90ed-fed05ccba5d2-inventory\") pod \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\" (UID: \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\") " Mar 18 09:34:31 crc kubenswrapper[4778]: I0318 09:34:31.656340 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b7ef620-077d-4a38-90ed-fed05ccba5d2-ssh-key-openstack-edpm-ipam\") pod \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\" (UID: \"0b7ef620-077d-4a38-90ed-fed05ccba5d2\") " Mar 18 09:34:31 crc kubenswrapper[4778]: I0318 09:34:31.677515 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b7ef620-077d-4a38-90ed-fed05ccba5d2-kube-api-access-vdcqp" (OuterVolumeSpecName: "kube-api-access-vdcqp") pod "0b7ef620-077d-4a38-90ed-fed05ccba5d2" (UID: "0b7ef620-077d-4a38-90ed-fed05ccba5d2"). InnerVolumeSpecName "kube-api-access-vdcqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:34:31 crc kubenswrapper[4778]: I0318 09:34:31.688356 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b7ef620-077d-4a38-90ed-fed05ccba5d2-inventory" (OuterVolumeSpecName: "inventory") pod "0b7ef620-077d-4a38-90ed-fed05ccba5d2" (UID: "0b7ef620-077d-4a38-90ed-fed05ccba5d2"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:34:31 crc kubenswrapper[4778]: I0318 09:34:31.695474 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b7ef620-077d-4a38-90ed-fed05ccba5d2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0b7ef620-077d-4a38-90ed-fed05ccba5d2" (UID: "0b7ef620-077d-4a38-90ed-fed05ccba5d2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:34:31 crc kubenswrapper[4778]: I0318 09:34:31.758778 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b7ef620-077d-4a38-90ed-fed05ccba5d2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:34:31 crc kubenswrapper[4778]: I0318 09:34:31.758831 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdcqp\" (UniqueName: \"kubernetes.io/projected/0b7ef620-077d-4a38-90ed-fed05ccba5d2-kube-api-access-vdcqp\") on node \"crc\" DevicePath \"\"" Mar 18 09:34:31 crc kubenswrapper[4778]: I0318 09:34:31.758842 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7ef620-077d-4a38-90ed-fed05ccba5d2-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.086525 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" event={"ID":"0b7ef620-077d-4a38-90ed-fed05ccba5d2","Type":"ContainerDied","Data":"8822c7df21f30bb6af9e1f11ae0a03462e98cfd9b7aec393cc5e0a76ed0fa331"} Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.086650 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8822c7df21f30bb6af9e1f11ae0a03462e98cfd9b7aec393cc5e0a76ed0fa331" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 
09:34:32.086968 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.235638 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2"] Mar 18 09:34:32 crc kubenswrapper[4778]: E0318 09:34:32.236051 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b7ef620-077d-4a38-90ed-fed05ccba5d2" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.236068 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b7ef620-077d-4a38-90ed-fed05ccba5d2" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:34:32 crc kubenswrapper[4778]: E0318 09:34:32.236085 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" containerName="extract-content" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.236091 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" containerName="extract-content" Mar 18 09:34:32 crc kubenswrapper[4778]: E0318 09:34:32.236110 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" containerName="registry-server" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.236116 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" containerName="registry-server" Mar 18 09:34:32 crc kubenswrapper[4778]: E0318 09:34:32.236136 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" containerName="extract-utilities" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.236143 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" containerName="extract-utilities" Mar 18 
09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.236328 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b7ef620-077d-4a38-90ed-fed05ccba5d2" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.236341 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d3beaac-46a9-4ec0-bfd5-ee225f4bb32e" containerName="registry-server" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.236938 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.239013 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.239234 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.239722 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.241680 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.245048 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2"] Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.371917 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79w68\" (UniqueName: \"kubernetes.io/projected/bf3315b0-1ace-422a-8049-3fd13fe46e65-kube-api-access-79w68\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2\" (UID: \"bf3315b0-1ace-422a-8049-3fd13fe46e65\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.372138 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf3315b0-1ace-422a-8049-3fd13fe46e65-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2\" (UID: \"bf3315b0-1ace-422a-8049-3fd13fe46e65\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.372241 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf3315b0-1ace-422a-8049-3fd13fe46e65-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2\" (UID: \"bf3315b0-1ace-422a-8049-3fd13fe46e65\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.474752 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf3315b0-1ace-422a-8049-3fd13fe46e65-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2\" (UID: \"bf3315b0-1ace-422a-8049-3fd13fe46e65\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.474845 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf3315b0-1ace-422a-8049-3fd13fe46e65-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2\" (UID: \"bf3315b0-1ace-422a-8049-3fd13fe46e65\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.474941 4778 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-79w68\" (UniqueName: \"kubernetes.io/projected/bf3315b0-1ace-422a-8049-3fd13fe46e65-kube-api-access-79w68\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2\" (UID: \"bf3315b0-1ace-422a-8049-3fd13fe46e65\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.480090 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf3315b0-1ace-422a-8049-3fd13fe46e65-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2\" (UID: \"bf3315b0-1ace-422a-8049-3fd13fe46e65\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.480666 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf3315b0-1ace-422a-8049-3fd13fe46e65-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2\" (UID: \"bf3315b0-1ace-422a-8049-3fd13fe46e65\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.497461 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79w68\" (UniqueName: \"kubernetes.io/projected/bf3315b0-1ace-422a-8049-3fd13fe46e65-kube-api-access-79w68\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2\" (UID: \"bf3315b0-1ace-422a-8049-3fd13fe46e65\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:34:32 crc kubenswrapper[4778]: I0318 09:34:32.553008 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:34:33 crc kubenswrapper[4778]: I0318 09:34:33.158782 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2"] Mar 18 09:34:34 crc kubenswrapper[4778]: I0318 09:34:34.106473 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" event={"ID":"bf3315b0-1ace-422a-8049-3fd13fe46e65","Type":"ContainerStarted","Data":"ae502ee49eb38287e9aaaa3ab0077cb1ef93b09b02a84b50a76e8fa209d6ad0c"} Mar 18 09:34:34 crc kubenswrapper[4778]: I0318 09:34:34.106595 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" event={"ID":"bf3315b0-1ace-422a-8049-3fd13fe46e65","Type":"ContainerStarted","Data":"b6b8f26a067a2e81c9bf97aaee67a21d858c099d81a194c710b247b4bbc30b19"} Mar 18 09:34:34 crc kubenswrapper[4778]: I0318 09:34:34.136985 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" podStartSLOduration=1.7060892669999999 podStartE2EDuration="2.136958715s" podCreationTimestamp="2026-03-18 09:34:32 +0000 UTC" firstStartedPulling="2026-03-18 09:34:33.157267619 +0000 UTC m=+1939.732012459" lastFinishedPulling="2026-03-18 09:34:33.588137057 +0000 UTC m=+1940.162881907" observedRunningTime="2026-03-18 09:34:34.128798665 +0000 UTC m=+1940.703543535" watchObservedRunningTime="2026-03-18 09:34:34.136958715 +0000 UTC m=+1940.711703595" Mar 18 09:34:35 crc kubenswrapper[4778]: I0318 09:34:35.188804 4778 scope.go:117] "RemoveContainer" containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:34:36 crc kubenswrapper[4778]: I0318 09:34:36.133414 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" 
event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"0190a17d7c9066ee03765200824080dd651faade8ac0a106f1878805db93f225"} Mar 18 09:34:42 crc kubenswrapper[4778]: I0318 09:34:42.062314 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-nl2dg"] Mar 18 09:34:42 crc kubenswrapper[4778]: I0318 09:34:42.074669 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-nl2dg"] Mar 18 09:34:42 crc kubenswrapper[4778]: I0318 09:34:42.209495 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8341ceba-13e0-410f-a7d2-23190a07d914" path="/var/lib/kubelet/pods/8341ceba-13e0-410f-a7d2-23190a07d914/volumes" Mar 18 09:34:43 crc kubenswrapper[4778]: I0318 09:34:43.068610 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-922c-account-create-update-6z2xf"] Mar 18 09:34:43 crc kubenswrapper[4778]: I0318 09:34:43.087719 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-9hlqk"] Mar 18 09:34:43 crc kubenswrapper[4778]: I0318 09:34:43.104214 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-t5x58"] Mar 18 09:34:43 crc kubenswrapper[4778]: I0318 09:34:43.113736 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-9hlqk"] Mar 18 09:34:43 crc kubenswrapper[4778]: I0318 09:34:43.124103 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-722f-account-create-update-slwd5"] Mar 18 09:34:43 crc kubenswrapper[4778]: I0318 09:34:43.133885 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-922c-account-create-update-6z2xf"] Mar 18 09:34:43 crc kubenswrapper[4778]: I0318 09:34:43.141291 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-b9f9-account-create-update-w7q2n"] Mar 18 09:34:43 crc kubenswrapper[4778]: I0318 09:34:43.152894 4778 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-t5x58"] Mar 18 09:34:43 crc kubenswrapper[4778]: I0318 09:34:43.162130 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-722f-account-create-update-slwd5"] Mar 18 09:34:43 crc kubenswrapper[4778]: I0318 09:34:43.171253 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-b9f9-account-create-update-w7q2n"] Mar 18 09:34:43 crc kubenswrapper[4778]: I0318 09:34:43.200108 4778 generic.go:334] "Generic (PLEG): container finished" podID="bf3315b0-1ace-422a-8049-3fd13fe46e65" containerID="ae502ee49eb38287e9aaaa3ab0077cb1ef93b09b02a84b50a76e8fa209d6ad0c" exitCode=0 Mar 18 09:34:43 crc kubenswrapper[4778]: I0318 09:34:43.200138 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" event={"ID":"bf3315b0-1ace-422a-8049-3fd13fe46e65","Type":"ContainerDied","Data":"ae502ee49eb38287e9aaaa3ab0077cb1ef93b09b02a84b50a76e8fa209d6ad0c"} Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.224374 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f06b776-36bc-45ba-88d4-69608f9665e6" path="/var/lib/kubelet/pods/2f06b776-36bc-45ba-88d4-69608f9665e6/volumes" Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.227490 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d" path="/var/lib/kubelet/pods/2f1e5923-f2dc-40e4-a8b0-9450cdb3f68d/volumes" Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.228292 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92444732-2d3e-4065-a336-74b37b711530" path="/var/lib/kubelet/pods/92444732-2d3e-4065-a336-74b37b711530/volumes" Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.228831 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b380dfb3-b55b-4db2-bd8f-a90b4470345d" 
path="/var/lib/kubelet/pods/b380dfb3-b55b-4db2-bd8f-a90b4470345d/volumes" Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.229841 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e" path="/var/lib/kubelet/pods/b6a91589-b8d7-4fd8-acb6-5c605b3b0c5e/volumes" Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.609457 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.687358 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf3315b0-1ace-422a-8049-3fd13fe46e65-ssh-key-openstack-edpm-ipam\") pod \"bf3315b0-1ace-422a-8049-3fd13fe46e65\" (UID: \"bf3315b0-1ace-422a-8049-3fd13fe46e65\") " Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.687555 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79w68\" (UniqueName: \"kubernetes.io/projected/bf3315b0-1ace-422a-8049-3fd13fe46e65-kube-api-access-79w68\") pod \"bf3315b0-1ace-422a-8049-3fd13fe46e65\" (UID: \"bf3315b0-1ace-422a-8049-3fd13fe46e65\") " Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.687614 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf3315b0-1ace-422a-8049-3fd13fe46e65-inventory\") pod \"bf3315b0-1ace-422a-8049-3fd13fe46e65\" (UID: \"bf3315b0-1ace-422a-8049-3fd13fe46e65\") " Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.699953 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf3315b0-1ace-422a-8049-3fd13fe46e65-kube-api-access-79w68" (OuterVolumeSpecName: "kube-api-access-79w68") pod "bf3315b0-1ace-422a-8049-3fd13fe46e65" (UID: "bf3315b0-1ace-422a-8049-3fd13fe46e65"). 
InnerVolumeSpecName "kube-api-access-79w68". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.715181 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf3315b0-1ace-422a-8049-3fd13fe46e65-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bf3315b0-1ace-422a-8049-3fd13fe46e65" (UID: "bf3315b0-1ace-422a-8049-3fd13fe46e65"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.723189 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf3315b0-1ace-422a-8049-3fd13fe46e65-inventory" (OuterVolumeSpecName: "inventory") pod "bf3315b0-1ace-422a-8049-3fd13fe46e65" (UID: "bf3315b0-1ace-422a-8049-3fd13fe46e65"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.789640 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79w68\" (UniqueName: \"kubernetes.io/projected/bf3315b0-1ace-422a-8049-3fd13fe46e65-kube-api-access-79w68\") on node \"crc\" DevicePath \"\"" Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.789677 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf3315b0-1ace-422a-8049-3fd13fe46e65-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:34:44 crc kubenswrapper[4778]: I0318 09:34:44.789690 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf3315b0-1ace-422a-8049-3fd13fe46e65-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:34:45 crc kubenswrapper[4778]: I0318 09:34:45.239629 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" event={"ID":"bf3315b0-1ace-422a-8049-3fd13fe46e65","Type":"ContainerDied","Data":"b6b8f26a067a2e81c9bf97aaee67a21d858c099d81a194c710b247b4bbc30b19"} Mar 18 09:34:45 crc kubenswrapper[4778]: I0318 09:34:45.240135 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6b8f26a067a2e81c9bf97aaee67a21d858c099d81a194c710b247b4bbc30b19" Mar 18 09:34:45 crc kubenswrapper[4778]: I0318 09:34:45.239927 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2" Mar 18 09:35:12 crc kubenswrapper[4778]: I0318 09:35:12.036623 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vhxwz"] Mar 18 09:35:12 crc kubenswrapper[4778]: I0318 09:35:12.047662 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vhxwz"] Mar 18 09:35:12 crc kubenswrapper[4778]: I0318 09:35:12.202366 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8348daa3-112d-49f7-93d8-3649ebf10eee" path="/var/lib/kubelet/pods/8348daa3-112d-49f7-93d8-3649ebf10eee/volumes" Mar 18 09:35:30 crc kubenswrapper[4778]: I0318 09:35:30.657375 4778 scope.go:117] "RemoveContainer" containerID="b0ad59dfbfbe8f98b2a7024fc11350f06ab712f37850bffb7121c440c9344960" Mar 18 09:35:30 crc kubenswrapper[4778]: I0318 09:35:30.699059 4778 scope.go:117] "RemoveContainer" containerID="aed5c2d54c93258cf5753b658cc8a1430cb39fb4faca41384d49dcc12f51df37" Mar 18 09:35:30 crc kubenswrapper[4778]: I0318 09:35:30.764310 4778 scope.go:117] "RemoveContainer" containerID="9866f0cece8384eb6d69125fd4f2648001a15f8207d97598a6f6b380c668253f" Mar 18 09:35:30 crc kubenswrapper[4778]: I0318 09:35:30.783503 4778 scope.go:117] "RemoveContainer" containerID="3cc34e35f2db07df2220b6c334d24c112405b578f89727d873a592536bc78998" Mar 18 09:35:30 crc kubenswrapper[4778]: 
I0318 09:35:30.832227 4778 scope.go:117] "RemoveContainer" containerID="47bfce503465075386d4ab81517eb08824a50d2ca76a4ab55639a7aea5948d36" Mar 18 09:35:30 crc kubenswrapper[4778]: I0318 09:35:30.871983 4778 scope.go:117] "RemoveContainer" containerID="c118c28760c4816bb842a36e485ff938333b6ae9902cf9242267aa191e3d70bf" Mar 18 09:35:30 crc kubenswrapper[4778]: I0318 09:35:30.908611 4778 scope.go:117] "RemoveContainer" containerID="7fb36f99fa48f9c60dbdcb8445fed2d769e9cb712ffc10c71b7ff46632229d69" Mar 18 09:35:34 crc kubenswrapper[4778]: I0318 09:35:34.051556 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-mjs29"] Mar 18 09:35:34 crc kubenswrapper[4778]: I0318 09:35:34.059277 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-mjs29"] Mar 18 09:35:34 crc kubenswrapper[4778]: I0318 09:35:34.198385 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b89439e3-a138-4aa8-98a4-2e23ce3819e0" path="/var/lib/kubelet/pods/b89439e3-a138-4aa8-98a4-2e23ce3819e0/volumes" Mar 18 09:35:35 crc kubenswrapper[4778]: I0318 09:35:35.082810 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jbjb9"] Mar 18 09:35:35 crc kubenswrapper[4778]: I0318 09:35:35.107443 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jbjb9"] Mar 18 09:35:36 crc kubenswrapper[4778]: I0318 09:35:36.198917 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e85d64a6-99af-4b66-9a60-cd6a046af840" path="/var/lib/kubelet/pods/e85d64a6-99af-4b66-9a60-cd6a046af840/volumes" Mar 18 09:36:00 crc kubenswrapper[4778]: I0318 09:36:00.156960 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563776-bxwcm"] Mar 18 09:36:00 crc kubenswrapper[4778]: E0318 09:36:00.157965 4778 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bf3315b0-1ace-422a-8049-3fd13fe46e65" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:36:00 crc kubenswrapper[4778]: I0318 09:36:00.157979 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3315b0-1ace-422a-8049-3fd13fe46e65" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:36:00 crc kubenswrapper[4778]: I0318 09:36:00.158166 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf3315b0-1ace-422a-8049-3fd13fe46e65" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:36:00 crc kubenswrapper[4778]: I0318 09:36:00.158923 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563776-bxwcm" Mar 18 09:36:00 crc kubenswrapper[4778]: I0318 09:36:00.162805 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:36:00 crc kubenswrapper[4778]: I0318 09:36:00.162896 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:36:00 crc kubenswrapper[4778]: I0318 09:36:00.164184 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:36:00 crc kubenswrapper[4778]: I0318 09:36:00.164879 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563776-bxwcm"] Mar 18 09:36:00 crc kubenswrapper[4778]: I0318 09:36:00.344977 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbr95\" (UniqueName: \"kubernetes.io/projected/b14b14c0-2e4e-420d-bdba-234de9130e4a-kube-api-access-bbr95\") pod \"auto-csr-approver-29563776-bxwcm\" (UID: \"b14b14c0-2e4e-420d-bdba-234de9130e4a\") " pod="openshift-infra/auto-csr-approver-29563776-bxwcm" Mar 18 09:36:00 crc kubenswrapper[4778]: I0318 09:36:00.446954 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bbr95\" (UniqueName: \"kubernetes.io/projected/b14b14c0-2e4e-420d-bdba-234de9130e4a-kube-api-access-bbr95\") pod \"auto-csr-approver-29563776-bxwcm\" (UID: \"b14b14c0-2e4e-420d-bdba-234de9130e4a\") " pod="openshift-infra/auto-csr-approver-29563776-bxwcm" Mar 18 09:36:00 crc kubenswrapper[4778]: I0318 09:36:00.482689 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbr95\" (UniqueName: \"kubernetes.io/projected/b14b14c0-2e4e-420d-bdba-234de9130e4a-kube-api-access-bbr95\") pod \"auto-csr-approver-29563776-bxwcm\" (UID: \"b14b14c0-2e4e-420d-bdba-234de9130e4a\") " pod="openshift-infra/auto-csr-approver-29563776-bxwcm" Mar 18 09:36:00 crc kubenswrapper[4778]: I0318 09:36:00.781826 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563776-bxwcm" Mar 18 09:36:01 crc kubenswrapper[4778]: I0318 09:36:01.244998 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563776-bxwcm"] Mar 18 09:36:02 crc kubenswrapper[4778]: I0318 09:36:02.097292 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563776-bxwcm" event={"ID":"b14b14c0-2e4e-420d-bdba-234de9130e4a","Type":"ContainerStarted","Data":"b19d1bc12e03cd0b9cf24c232aaef04916608d6f63edcf7eb54f530ee75b9ccd"} Mar 18 09:36:04 crc kubenswrapper[4778]: I0318 09:36:04.123436 4778 generic.go:334] "Generic (PLEG): container finished" podID="b14b14c0-2e4e-420d-bdba-234de9130e4a" containerID="037bb0f9fdf9935b25af1bbd8db6391c200ce1a888406ad48350f6fbf2f0253c" exitCode=0 Mar 18 09:36:04 crc kubenswrapper[4778]: I0318 09:36:04.123523 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563776-bxwcm" 
event={"ID":"b14b14c0-2e4e-420d-bdba-234de9130e4a","Type":"ContainerDied","Data":"037bb0f9fdf9935b25af1bbd8db6391c200ce1a888406ad48350f6fbf2f0253c"} Mar 18 09:36:05 crc kubenswrapper[4778]: I0318 09:36:05.555871 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563776-bxwcm" Mar 18 09:36:05 crc kubenswrapper[4778]: I0318 09:36:05.751281 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbr95\" (UniqueName: \"kubernetes.io/projected/b14b14c0-2e4e-420d-bdba-234de9130e4a-kube-api-access-bbr95\") pod \"b14b14c0-2e4e-420d-bdba-234de9130e4a\" (UID: \"b14b14c0-2e4e-420d-bdba-234de9130e4a\") " Mar 18 09:36:05 crc kubenswrapper[4778]: I0318 09:36:05.757942 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b14b14c0-2e4e-420d-bdba-234de9130e4a-kube-api-access-bbr95" (OuterVolumeSpecName: "kube-api-access-bbr95") pod "b14b14c0-2e4e-420d-bdba-234de9130e4a" (UID: "b14b14c0-2e4e-420d-bdba-234de9130e4a"). InnerVolumeSpecName "kube-api-access-bbr95". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:36:05 crc kubenswrapper[4778]: I0318 09:36:05.853524 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbr95\" (UniqueName: \"kubernetes.io/projected/b14b14c0-2e4e-420d-bdba-234de9130e4a-kube-api-access-bbr95\") on node \"crc\" DevicePath \"\"" Mar 18 09:36:06 crc kubenswrapper[4778]: I0318 09:36:06.144731 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563776-bxwcm" event={"ID":"b14b14c0-2e4e-420d-bdba-234de9130e4a","Type":"ContainerDied","Data":"b19d1bc12e03cd0b9cf24c232aaef04916608d6f63edcf7eb54f530ee75b9ccd"} Mar 18 09:36:06 crc kubenswrapper[4778]: I0318 09:36:06.145024 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b19d1bc12e03cd0b9cf24c232aaef04916608d6f63edcf7eb54f530ee75b9ccd" Mar 18 09:36:06 crc kubenswrapper[4778]: I0318 09:36:06.144781 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563776-bxwcm" Mar 18 09:36:06 crc kubenswrapper[4778]: I0318 09:36:06.695788 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563770-5f9th"] Mar 18 09:36:06 crc kubenswrapper[4778]: I0318 09:36:06.705254 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563770-5f9th"] Mar 18 09:36:08 crc kubenswrapper[4778]: I0318 09:36:08.201387 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5229065-e84e-4d42-870f-1ee468bff359" path="/var/lib/kubelet/pods/d5229065-e84e-4d42-870f-1ee468bff359/volumes" Mar 18 09:36:20 crc kubenswrapper[4778]: I0318 09:36:20.047534 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-2nml6"] Mar 18 09:36:20 crc kubenswrapper[4778]: I0318 09:36:20.060328 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-cell-mapping-2nml6"] Mar 18 09:36:20 crc kubenswrapper[4778]: I0318 09:36:20.200858 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32eb800e-69e8-4e39-ae5b-74a5eec87b00" path="/var/lib/kubelet/pods/32eb800e-69e8-4e39-ae5b-74a5eec87b00/volumes" Mar 18 09:36:31 crc kubenswrapper[4778]: I0318 09:36:31.045810 4778 scope.go:117] "RemoveContainer" containerID="973e9f8f665d67a226625f9e044e9c18b31cbfecdc6ee8dcf02562081d63ced4" Mar 18 09:36:31 crc kubenswrapper[4778]: I0318 09:36:31.107611 4778 scope.go:117] "RemoveContainer" containerID="8faf9c7a656879007008e10d6b7f5d22a002ddd8fac9065c9f561e0e336487fd" Mar 18 09:36:31 crc kubenswrapper[4778]: I0318 09:36:31.163701 4778 scope.go:117] "RemoveContainer" containerID="3858148ddf213daa44ce8f206664d3360023f6d6f91e24bcfce11a24c0f0213c" Mar 18 09:36:31 crc kubenswrapper[4778]: I0318 09:36:31.219935 4778 scope.go:117] "RemoveContainer" containerID="462bde6149a0c02e0a81e9d8cf7097470bbd3546789cdc6d2d61c3177437187e" Mar 18 09:37:00 crc kubenswrapper[4778]: I0318 09:37:00.147667 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:37:00 crc kubenswrapper[4778]: I0318 09:37:00.149553 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:37:30 crc kubenswrapper[4778]: I0318 09:37:30.148174 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:37:30 crc kubenswrapper[4778]: I0318 09:37:30.149089 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.352652 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k7dkw"] Mar 18 09:37:49 crc kubenswrapper[4778]: E0318 09:37:49.354132 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b14b14c0-2e4e-420d-bdba-234de9130e4a" containerName="oc" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.354269 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b14b14c0-2e4e-420d-bdba-234de9130e4a" containerName="oc" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.354485 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b14b14c0-2e4e-420d-bdba-234de9130e4a" containerName="oc" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.356260 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.367265 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7dkw"] Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.475312 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eea5e508-8702-4085-b99d-43524ffd7dac-utilities\") pod \"redhat-marketplace-k7dkw\" (UID: \"eea5e508-8702-4085-b99d-43524ffd7dac\") " pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.475415 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eea5e508-8702-4085-b99d-43524ffd7dac-catalog-content\") pod \"redhat-marketplace-k7dkw\" (UID: \"eea5e508-8702-4085-b99d-43524ffd7dac\") " pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.475507 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5f78\" (UniqueName: \"kubernetes.io/projected/eea5e508-8702-4085-b99d-43524ffd7dac-kube-api-access-c5f78\") pod \"redhat-marketplace-k7dkw\" (UID: \"eea5e508-8702-4085-b99d-43524ffd7dac\") " pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.577444 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5f78\" (UniqueName: \"kubernetes.io/projected/eea5e508-8702-4085-b99d-43524ffd7dac-kube-api-access-c5f78\") pod \"redhat-marketplace-k7dkw\" (UID: \"eea5e508-8702-4085-b99d-43524ffd7dac\") " pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.577521 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eea5e508-8702-4085-b99d-43524ffd7dac-utilities\") pod \"redhat-marketplace-k7dkw\" (UID: \"eea5e508-8702-4085-b99d-43524ffd7dac\") " pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.577578 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eea5e508-8702-4085-b99d-43524ffd7dac-catalog-content\") pod \"redhat-marketplace-k7dkw\" (UID: \"eea5e508-8702-4085-b99d-43524ffd7dac\") " pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.577995 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eea5e508-8702-4085-b99d-43524ffd7dac-catalog-content\") pod \"redhat-marketplace-k7dkw\" (UID: \"eea5e508-8702-4085-b99d-43524ffd7dac\") " pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.578170 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eea5e508-8702-4085-b99d-43524ffd7dac-utilities\") pod \"redhat-marketplace-k7dkw\" (UID: \"eea5e508-8702-4085-b99d-43524ffd7dac\") " pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.599513 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5f78\" (UniqueName: \"kubernetes.io/projected/eea5e508-8702-4085-b99d-43524ffd7dac-kube-api-access-c5f78\") pod \"redhat-marketplace-k7dkw\" (UID: \"eea5e508-8702-4085-b99d-43524ffd7dac\") " pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:49 crc kubenswrapper[4778]: I0318 09:37:49.703159 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:50 crc kubenswrapper[4778]: I0318 09:37:50.155003 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7dkw"] Mar 18 09:37:50 crc kubenswrapper[4778]: I0318 09:37:50.773857 4778 generic.go:334] "Generic (PLEG): container finished" podID="eea5e508-8702-4085-b99d-43524ffd7dac" containerID="d9e63f9d7c971f9de98562ce22d51d4f909db6e52afb6867484f2c7fedcd8fee" exitCode=0 Mar 18 09:37:50 crc kubenswrapper[4778]: I0318 09:37:50.774027 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7dkw" event={"ID":"eea5e508-8702-4085-b99d-43524ffd7dac","Type":"ContainerDied","Data":"d9e63f9d7c971f9de98562ce22d51d4f909db6e52afb6867484f2c7fedcd8fee"} Mar 18 09:37:50 crc kubenswrapper[4778]: I0318 09:37:50.774245 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7dkw" event={"ID":"eea5e508-8702-4085-b99d-43524ffd7dac","Type":"ContainerStarted","Data":"e10dc855312d99da1bc5c5f66a6b1da1b05bc4c76ebbe2385a7e6ad29264fe63"} Mar 18 09:37:50 crc kubenswrapper[4778]: I0318 09:37:50.776877 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 09:37:51 crc kubenswrapper[4778]: I0318 09:37:51.783584 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7dkw" event={"ID":"eea5e508-8702-4085-b99d-43524ffd7dac","Type":"ContainerStarted","Data":"ad150bd718746610af334360a84247f85ed67efde797271faf7967473a7afe45"} Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.363358 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6v22l"] Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.366281 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.394279 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6v22l"] Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.434641 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d79e7e2-9db4-4307-8411-18b74d60b1b7-utilities\") pod \"community-operators-6v22l\" (UID: \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\") " pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.434852 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d79e7e2-9db4-4307-8411-18b74d60b1b7-catalog-content\") pod \"community-operators-6v22l\" (UID: \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\") " pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.435179 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmv8p\" (UniqueName: \"kubernetes.io/projected/4d79e7e2-9db4-4307-8411-18b74d60b1b7-kube-api-access-zmv8p\") pod \"community-operators-6v22l\" (UID: \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\") " pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.536525 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d79e7e2-9db4-4307-8411-18b74d60b1b7-utilities\") pod \"community-operators-6v22l\" (UID: \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\") " pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.536624 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d79e7e2-9db4-4307-8411-18b74d60b1b7-catalog-content\") pod \"community-operators-6v22l\" (UID: \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\") " pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.536736 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmv8p\" (UniqueName: \"kubernetes.io/projected/4d79e7e2-9db4-4307-8411-18b74d60b1b7-kube-api-access-zmv8p\") pod \"community-operators-6v22l\" (UID: \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\") " pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.537640 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d79e7e2-9db4-4307-8411-18b74d60b1b7-utilities\") pod \"community-operators-6v22l\" (UID: \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\") " pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.537919 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d79e7e2-9db4-4307-8411-18b74d60b1b7-catalog-content\") pod \"community-operators-6v22l\" (UID: \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\") " pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.563589 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmv8p\" (UniqueName: \"kubernetes.io/projected/4d79e7e2-9db4-4307-8411-18b74d60b1b7-kube-api-access-zmv8p\") pod \"community-operators-6v22l\" (UID: \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\") " pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.690054 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.814294 4778 generic.go:334] "Generic (PLEG): container finished" podID="eea5e508-8702-4085-b99d-43524ffd7dac" containerID="ad150bd718746610af334360a84247f85ed67efde797271faf7967473a7afe45" exitCode=0 Mar 18 09:37:52 crc kubenswrapper[4778]: I0318 09:37:52.814341 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7dkw" event={"ID":"eea5e508-8702-4085-b99d-43524ffd7dac","Type":"ContainerDied","Data":"ad150bd718746610af334360a84247f85ed67efde797271faf7967473a7afe45"} Mar 18 09:37:53 crc kubenswrapper[4778]: I0318 09:37:53.249592 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6v22l"] Mar 18 09:37:53 crc kubenswrapper[4778]: W0318 09:37:53.261756 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d79e7e2_9db4_4307_8411_18b74d60b1b7.slice/crio-19c45c0d8cb539fb9cbc49d484d0d217fa556329caaf0c5f55191f47880349c3 WatchSource:0}: Error finding container 19c45c0d8cb539fb9cbc49d484d0d217fa556329caaf0c5f55191f47880349c3: Status 404 returned error can't find the container with id 19c45c0d8cb539fb9cbc49d484d0d217fa556329caaf0c5f55191f47880349c3 Mar 18 09:37:53 crc kubenswrapper[4778]: I0318 09:37:53.833350 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7dkw" event={"ID":"eea5e508-8702-4085-b99d-43524ffd7dac","Type":"ContainerStarted","Data":"a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1"} Mar 18 09:37:53 crc kubenswrapper[4778]: I0318 09:37:53.836562 4778 generic.go:334] "Generic (PLEG): container finished" podID="4d79e7e2-9db4-4307-8411-18b74d60b1b7" containerID="4d8ee638d3fd1a7f6d883064a1a7249230426c185142df8525a13d3d30ce4e1d" exitCode=0 Mar 18 09:37:53 crc kubenswrapper[4778]: I0318 
09:37:53.836595 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v22l" event={"ID":"4d79e7e2-9db4-4307-8411-18b74d60b1b7","Type":"ContainerDied","Data":"4d8ee638d3fd1a7f6d883064a1a7249230426c185142df8525a13d3d30ce4e1d"} Mar 18 09:37:53 crc kubenswrapper[4778]: I0318 09:37:53.836615 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v22l" event={"ID":"4d79e7e2-9db4-4307-8411-18b74d60b1b7","Type":"ContainerStarted","Data":"19c45c0d8cb539fb9cbc49d484d0d217fa556329caaf0c5f55191f47880349c3"} Mar 18 09:37:53 crc kubenswrapper[4778]: I0318 09:37:53.870936 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k7dkw" podStartSLOduration=2.324533106 podStartE2EDuration="4.870909439s" podCreationTimestamp="2026-03-18 09:37:49 +0000 UTC" firstStartedPulling="2026-03-18 09:37:50.776417273 +0000 UTC m=+2137.351162113" lastFinishedPulling="2026-03-18 09:37:53.322793606 +0000 UTC m=+2139.897538446" observedRunningTime="2026-03-18 09:37:53.857743042 +0000 UTC m=+2140.432487932" watchObservedRunningTime="2026-03-18 09:37:53.870909439 +0000 UTC m=+2140.445654299" Mar 18 09:37:54 crc kubenswrapper[4778]: I0318 09:37:54.851167 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v22l" event={"ID":"4d79e7e2-9db4-4307-8411-18b74d60b1b7","Type":"ContainerStarted","Data":"5a9c20e2309024d68e204aa480c5b00da745d10c577c8a5c43f31f3dd1cc1708"} Mar 18 09:37:55 crc kubenswrapper[4778]: I0318 09:37:55.862659 4778 generic.go:334] "Generic (PLEG): container finished" podID="4d79e7e2-9db4-4307-8411-18b74d60b1b7" containerID="5a9c20e2309024d68e204aa480c5b00da745d10c577c8a5c43f31f3dd1cc1708" exitCode=0 Mar 18 09:37:55 crc kubenswrapper[4778]: I0318 09:37:55.862763 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v22l" 
event={"ID":"4d79e7e2-9db4-4307-8411-18b74d60b1b7","Type":"ContainerDied","Data":"5a9c20e2309024d68e204aa480c5b00da745d10c577c8a5c43f31f3dd1cc1708"} Mar 18 09:37:56 crc kubenswrapper[4778]: I0318 09:37:56.878367 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v22l" event={"ID":"4d79e7e2-9db4-4307-8411-18b74d60b1b7","Type":"ContainerStarted","Data":"68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940"} Mar 18 09:37:56 crc kubenswrapper[4778]: I0318 09:37:56.914082 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6v22l" podStartSLOduration=2.463222631 podStartE2EDuration="4.914052181s" podCreationTimestamp="2026-03-18 09:37:52 +0000 UTC" firstStartedPulling="2026-03-18 09:37:53.839064216 +0000 UTC m=+2140.413809066" lastFinishedPulling="2026-03-18 09:37:56.289893766 +0000 UTC m=+2142.864638616" observedRunningTime="2026-03-18 09:37:56.903105834 +0000 UTC m=+2143.477850744" watchObservedRunningTime="2026-03-18 09:37:56.914052181 +0000 UTC m=+2143.488797051" Mar 18 09:37:59 crc kubenswrapper[4778]: I0318 09:37:59.704168 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:59 crc kubenswrapper[4778]: I0318 09:37:59.704263 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:59 crc kubenswrapper[4778]: I0318 09:37:59.783394 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:37:59 crc kubenswrapper[4778]: I0318 09:37:59.971163 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.147970 4778 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29563778-4dm5p"] Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.148420 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.148596 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.149354 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.149740 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563778-4dm5p" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.150082 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0190a17d7c9066ee03765200824080dd651faade8ac0a106f1878805db93f225"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.150144 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://0190a17d7c9066ee03765200824080dd651faade8ac0a106f1878805db93f225" gracePeriod=600 Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.156963 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.157123 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.157389 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.164286 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563778-4dm5p"] Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.284553 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49pq9\" (UniqueName: \"kubernetes.io/projected/0548485b-4f03-47ba-8a13-4e3522451291-kube-api-access-49pq9\") pod \"auto-csr-approver-29563778-4dm5p\" (UID: \"0548485b-4f03-47ba-8a13-4e3522451291\") " 
pod="openshift-infra/auto-csr-approver-29563778-4dm5p" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.389534 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49pq9\" (UniqueName: \"kubernetes.io/projected/0548485b-4f03-47ba-8a13-4e3522451291-kube-api-access-49pq9\") pod \"auto-csr-approver-29563778-4dm5p\" (UID: \"0548485b-4f03-47ba-8a13-4e3522451291\") " pod="openshift-infra/auto-csr-approver-29563778-4dm5p" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.428066 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49pq9\" (UniqueName: \"kubernetes.io/projected/0548485b-4f03-47ba-8a13-4e3522451291-kube-api-access-49pq9\") pod \"auto-csr-approver-29563778-4dm5p\" (UID: \"0548485b-4f03-47ba-8a13-4e3522451291\") " pod="openshift-infra/auto-csr-approver-29563778-4dm5p" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.472814 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563778-4dm5p" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.777231 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563778-4dm5p"] Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.916940 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="0190a17d7c9066ee03765200824080dd651faade8ac0a106f1878805db93f225" exitCode=0 Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.917003 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"0190a17d7c9066ee03765200824080dd651faade8ac0a106f1878805db93f225"} Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.917042 4778 scope.go:117] "RemoveContainer" 
containerID="aa46483b2b20fe2c164915f11a6d30e1e11da9f2f18ea91003cfd49b9e8a5492" Mar 18 09:38:00 crc kubenswrapper[4778]: I0318 09:38:00.919743 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563778-4dm5p" event={"ID":"0548485b-4f03-47ba-8a13-4e3522451291","Type":"ContainerStarted","Data":"8465549329bfd92b52a0d5ceef063d0f55bdc478460f4cbdf9d413424cbc4cc5"} Mar 18 09:38:01 crc kubenswrapper[4778]: I0318 09:38:01.139683 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7dkw"] Mar 18 09:38:01 crc kubenswrapper[4778]: I0318 09:38:01.934586 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9"} Mar 18 09:38:01 crc kubenswrapper[4778]: I0318 09:38:01.934726 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k7dkw" podUID="eea5e508-8702-4085-b99d-43524ffd7dac" containerName="registry-server" containerID="cri-o://a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1" gracePeriod=2 Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.397262 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.531343 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eea5e508-8702-4085-b99d-43524ffd7dac-utilities\") pod \"eea5e508-8702-4085-b99d-43524ffd7dac\" (UID: \"eea5e508-8702-4085-b99d-43524ffd7dac\") " Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.531510 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5f78\" (UniqueName: \"kubernetes.io/projected/eea5e508-8702-4085-b99d-43524ffd7dac-kube-api-access-c5f78\") pod \"eea5e508-8702-4085-b99d-43524ffd7dac\" (UID: \"eea5e508-8702-4085-b99d-43524ffd7dac\") " Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.531566 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eea5e508-8702-4085-b99d-43524ffd7dac-catalog-content\") pod \"eea5e508-8702-4085-b99d-43524ffd7dac\" (UID: \"eea5e508-8702-4085-b99d-43524ffd7dac\") " Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.533594 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eea5e508-8702-4085-b99d-43524ffd7dac-utilities" (OuterVolumeSpecName: "utilities") pod "eea5e508-8702-4085-b99d-43524ffd7dac" (UID: "eea5e508-8702-4085-b99d-43524ffd7dac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.540743 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eea5e508-8702-4085-b99d-43524ffd7dac-kube-api-access-c5f78" (OuterVolumeSpecName: "kube-api-access-c5f78") pod "eea5e508-8702-4085-b99d-43524ffd7dac" (UID: "eea5e508-8702-4085-b99d-43524ffd7dac"). InnerVolumeSpecName "kube-api-access-c5f78". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.584724 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eea5e508-8702-4085-b99d-43524ffd7dac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eea5e508-8702-4085-b99d-43524ffd7dac" (UID: "eea5e508-8702-4085-b99d-43524ffd7dac"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.633740 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eea5e508-8702-4085-b99d-43524ffd7dac-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.633785 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5f78\" (UniqueName: \"kubernetes.io/projected/eea5e508-8702-4085-b99d-43524ffd7dac-kube-api-access-c5f78\") on node \"crc\" DevicePath \"\"" Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.633797 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eea5e508-8702-4085-b99d-43524ffd7dac-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.691136 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.691439 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.749774 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.952272 4778 generic.go:334] "Generic (PLEG): container 
finished" podID="0548485b-4f03-47ba-8a13-4e3522451291" containerID="f6adcd9d5f24124681eed0d00263f7ac4a19be40ad724c067b9849cb1ce141e4" exitCode=0 Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.953288 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563778-4dm5p" event={"ID":"0548485b-4f03-47ba-8a13-4e3522451291","Type":"ContainerDied","Data":"f6adcd9d5f24124681eed0d00263f7ac4a19be40ad724c067b9849cb1ce141e4"} Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.956427 4778 generic.go:334] "Generic (PLEG): container finished" podID="eea5e508-8702-4085-b99d-43524ffd7dac" containerID="a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1" exitCode=0 Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.957595 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7dkw" Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.962570 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7dkw" event={"ID":"eea5e508-8702-4085-b99d-43524ffd7dac","Type":"ContainerDied","Data":"a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1"} Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.962644 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7dkw" event={"ID":"eea5e508-8702-4085-b99d-43524ffd7dac","Type":"ContainerDied","Data":"e10dc855312d99da1bc5c5f66a6b1da1b05bc4c76ebbe2385a7e6ad29264fe63"} Mar 18 09:38:02 crc kubenswrapper[4778]: I0318 09:38:02.962675 4778 scope.go:117] "RemoveContainer" containerID="a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1" Mar 18 09:38:03 crc kubenswrapper[4778]: I0318 09:38:03.000837 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7dkw"] Mar 18 09:38:03 crc kubenswrapper[4778]: I0318 09:38:03.007138 4778 scope.go:117] 
"RemoveContainer" containerID="ad150bd718746610af334360a84247f85ed67efde797271faf7967473a7afe45" Mar 18 09:38:03 crc kubenswrapper[4778]: I0318 09:38:03.011280 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7dkw"] Mar 18 09:38:03 crc kubenswrapper[4778]: I0318 09:38:03.011612 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:38:03 crc kubenswrapper[4778]: I0318 09:38:03.034050 4778 scope.go:117] "RemoveContainer" containerID="d9e63f9d7c971f9de98562ce22d51d4f909db6e52afb6867484f2c7fedcd8fee" Mar 18 09:38:03 crc kubenswrapper[4778]: I0318 09:38:03.067322 4778 scope.go:117] "RemoveContainer" containerID="a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1" Mar 18 09:38:03 crc kubenswrapper[4778]: E0318 09:38:03.067716 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1\": container with ID starting with a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1 not found: ID does not exist" containerID="a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1" Mar 18 09:38:03 crc kubenswrapper[4778]: I0318 09:38:03.067828 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1"} err="failed to get container status \"a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1\": rpc error: code = NotFound desc = could not find container \"a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1\": container with ID starting with a2e2d57e6a15ccca7ae02572ac54c10276744e0d7f2e29914786d79e1d66b7e1 not found: ID does not exist" Mar 18 09:38:03 crc kubenswrapper[4778]: I0318 09:38:03.067909 4778 scope.go:117] "RemoveContainer" 
containerID="ad150bd718746610af334360a84247f85ed67efde797271faf7967473a7afe45" Mar 18 09:38:03 crc kubenswrapper[4778]: E0318 09:38:03.068222 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad150bd718746610af334360a84247f85ed67efde797271faf7967473a7afe45\": container with ID starting with ad150bd718746610af334360a84247f85ed67efde797271faf7967473a7afe45 not found: ID does not exist" containerID="ad150bd718746610af334360a84247f85ed67efde797271faf7967473a7afe45" Mar 18 09:38:03 crc kubenswrapper[4778]: I0318 09:38:03.068412 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad150bd718746610af334360a84247f85ed67efde797271faf7967473a7afe45"} err="failed to get container status \"ad150bd718746610af334360a84247f85ed67efde797271faf7967473a7afe45\": rpc error: code = NotFound desc = could not find container \"ad150bd718746610af334360a84247f85ed67efde797271faf7967473a7afe45\": container with ID starting with ad150bd718746610af334360a84247f85ed67efde797271faf7967473a7afe45 not found: ID does not exist" Mar 18 09:38:03 crc kubenswrapper[4778]: I0318 09:38:03.068489 4778 scope.go:117] "RemoveContainer" containerID="d9e63f9d7c971f9de98562ce22d51d4f909db6e52afb6867484f2c7fedcd8fee" Mar 18 09:38:03 crc kubenswrapper[4778]: E0318 09:38:03.068714 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9e63f9d7c971f9de98562ce22d51d4f909db6e52afb6867484f2c7fedcd8fee\": container with ID starting with d9e63f9d7c971f9de98562ce22d51d4f909db6e52afb6867484f2c7fedcd8fee not found: ID does not exist" containerID="d9e63f9d7c971f9de98562ce22d51d4f909db6e52afb6867484f2c7fedcd8fee" Mar 18 09:38:03 crc kubenswrapper[4778]: I0318 09:38:03.068809 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d9e63f9d7c971f9de98562ce22d51d4f909db6e52afb6867484f2c7fedcd8fee"} err="failed to get container status \"d9e63f9d7c971f9de98562ce22d51d4f909db6e52afb6867484f2c7fedcd8fee\": rpc error: code = NotFound desc = could not find container \"d9e63f9d7c971f9de98562ce22d51d4f909db6e52afb6867484f2c7fedcd8fee\": container with ID starting with d9e63f9d7c971f9de98562ce22d51d4f909db6e52afb6867484f2c7fedcd8fee not found: ID does not exist" Mar 18 09:38:04 crc kubenswrapper[4778]: I0318 09:38:04.219669 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eea5e508-8702-4085-b99d-43524ffd7dac" path="/var/lib/kubelet/pods/eea5e508-8702-4085-b99d-43524ffd7dac/volumes" Mar 18 09:38:04 crc kubenswrapper[4778]: I0318 09:38:04.344749 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563778-4dm5p" Mar 18 09:38:04 crc kubenswrapper[4778]: I0318 09:38:04.466862 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49pq9\" (UniqueName: \"kubernetes.io/projected/0548485b-4f03-47ba-8a13-4e3522451291-kube-api-access-49pq9\") pod \"0548485b-4f03-47ba-8a13-4e3522451291\" (UID: \"0548485b-4f03-47ba-8a13-4e3522451291\") " Mar 18 09:38:04 crc kubenswrapper[4778]: I0318 09:38:04.474730 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0548485b-4f03-47ba-8a13-4e3522451291-kube-api-access-49pq9" (OuterVolumeSpecName: "kube-api-access-49pq9") pod "0548485b-4f03-47ba-8a13-4e3522451291" (UID: "0548485b-4f03-47ba-8a13-4e3522451291"). InnerVolumeSpecName "kube-api-access-49pq9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:38:04 crc kubenswrapper[4778]: I0318 09:38:04.569325 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49pq9\" (UniqueName: \"kubernetes.io/projected/0548485b-4f03-47ba-8a13-4e3522451291-kube-api-access-49pq9\") on node \"crc\" DevicePath \"\"" Mar 18 09:38:04 crc kubenswrapper[4778]: I0318 09:38:04.941472 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6v22l"] Mar 18 09:38:04 crc kubenswrapper[4778]: I0318 09:38:04.980847 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563778-4dm5p" Mar 18 09:38:04 crc kubenswrapper[4778]: I0318 09:38:04.980851 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563778-4dm5p" event={"ID":"0548485b-4f03-47ba-8a13-4e3522451291","Type":"ContainerDied","Data":"8465549329bfd92b52a0d5ceef063d0f55bdc478460f4cbdf9d413424cbc4cc5"} Mar 18 09:38:04 crc kubenswrapper[4778]: I0318 09:38:04.980915 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8465549329bfd92b52a0d5ceef063d0f55bdc478460f4cbdf9d413424cbc4cc5" Mar 18 09:38:05 crc kubenswrapper[4778]: I0318 09:38:05.415547 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563772-tcj6t"] Mar 18 09:38:05 crc kubenswrapper[4778]: I0318 09:38:05.423431 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563772-tcj6t"] Mar 18 09:38:05 crc kubenswrapper[4778]: I0318 09:38:05.991838 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6v22l" podUID="4d79e7e2-9db4-4307-8411-18b74d60b1b7" containerName="registry-server" containerID="cri-o://68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940" gracePeriod=2 Mar 18 09:38:06 crc 
kubenswrapper[4778]: I0318 09:38:06.198003 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15232b66-3433-4405-9feb-79055e892b3d" path="/var/lib/kubelet/pods/15232b66-3433-4405-9feb-79055e892b3d/volumes" Mar 18 09:38:06 crc kubenswrapper[4778]: I0318 09:38:06.435895 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:38:06 crc kubenswrapper[4778]: I0318 09:38:06.516047 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d79e7e2-9db4-4307-8411-18b74d60b1b7-utilities\") pod \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\" (UID: \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\") " Mar 18 09:38:06 crc kubenswrapper[4778]: I0318 09:38:06.516138 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d79e7e2-9db4-4307-8411-18b74d60b1b7-catalog-content\") pod \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\" (UID: \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\") " Mar 18 09:38:06 crc kubenswrapper[4778]: I0318 09:38:06.516261 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmv8p\" (UniqueName: \"kubernetes.io/projected/4d79e7e2-9db4-4307-8411-18b74d60b1b7-kube-api-access-zmv8p\") pod \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\" (UID: \"4d79e7e2-9db4-4307-8411-18b74d60b1b7\") " Mar 18 09:38:06 crc kubenswrapper[4778]: I0318 09:38:06.525835 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d79e7e2-9db4-4307-8411-18b74d60b1b7-kube-api-access-zmv8p" (OuterVolumeSpecName: "kube-api-access-zmv8p") pod "4d79e7e2-9db4-4307-8411-18b74d60b1b7" (UID: "4d79e7e2-9db4-4307-8411-18b74d60b1b7"). InnerVolumeSpecName "kube-api-access-zmv8p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:38:06 crc kubenswrapper[4778]: I0318 09:38:06.530712 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d79e7e2-9db4-4307-8411-18b74d60b1b7-utilities" (OuterVolumeSpecName: "utilities") pod "4d79e7e2-9db4-4307-8411-18b74d60b1b7" (UID: "4d79e7e2-9db4-4307-8411-18b74d60b1b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:38:06 crc kubenswrapper[4778]: I0318 09:38:06.602926 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d79e7e2-9db4-4307-8411-18b74d60b1b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d79e7e2-9db4-4307-8411-18b74d60b1b7" (UID: "4d79e7e2-9db4-4307-8411-18b74d60b1b7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:38:06 crc kubenswrapper[4778]: I0318 09:38:06.619084 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d79e7e2-9db4-4307-8411-18b74d60b1b7-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:38:06 crc kubenswrapper[4778]: I0318 09:38:06.619120 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d79e7e2-9db4-4307-8411-18b74d60b1b7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:38:06 crc kubenswrapper[4778]: I0318 09:38:06.619134 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmv8p\" (UniqueName: \"kubernetes.io/projected/4d79e7e2-9db4-4307-8411-18b74d60b1b7-kube-api-access-zmv8p\") on node \"crc\" DevicePath \"\"" Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.005972 4778 generic.go:334] "Generic (PLEG): container finished" podID="4d79e7e2-9db4-4307-8411-18b74d60b1b7" containerID="68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940" exitCode=0 Mar 
18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.006010 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v22l" event={"ID":"4d79e7e2-9db4-4307-8411-18b74d60b1b7","Type":"ContainerDied","Data":"68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940"} Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.006054 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6v22l" Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.006112 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v22l" event={"ID":"4d79e7e2-9db4-4307-8411-18b74d60b1b7","Type":"ContainerDied","Data":"19c45c0d8cb539fb9cbc49d484d0d217fa556329caaf0c5f55191f47880349c3"} Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.006183 4778 scope.go:117] "RemoveContainer" containerID="68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940" Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.053894 4778 scope.go:117] "RemoveContainer" containerID="5a9c20e2309024d68e204aa480c5b00da745d10c577c8a5c43f31f3dd1cc1708" Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.058706 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6v22l"] Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.066825 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6v22l"] Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.082046 4778 scope.go:117] "RemoveContainer" containerID="4d8ee638d3fd1a7f6d883064a1a7249230426c185142df8525a13d3d30ce4e1d" Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.128976 4778 scope.go:117] "RemoveContainer" containerID="68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940" Mar 18 09:38:07 crc kubenswrapper[4778]: E0318 09:38:07.129416 4778 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940\": container with ID starting with 68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940 not found: ID does not exist" containerID="68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940" Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.129449 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940"} err="failed to get container status \"68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940\": rpc error: code = NotFound desc = could not find container \"68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940\": container with ID starting with 68e82b14d35c084a0d030ac1df8114026e4ac513f03d2f3df3f5c842af6eb940 not found: ID does not exist" Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.129472 4778 scope.go:117] "RemoveContainer" containerID="5a9c20e2309024d68e204aa480c5b00da745d10c577c8a5c43f31f3dd1cc1708" Mar 18 09:38:07 crc kubenswrapper[4778]: E0318 09:38:07.129798 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a9c20e2309024d68e204aa480c5b00da745d10c577c8a5c43f31f3dd1cc1708\": container with ID starting with 5a9c20e2309024d68e204aa480c5b00da745d10c577c8a5c43f31f3dd1cc1708 not found: ID does not exist" containerID="5a9c20e2309024d68e204aa480c5b00da745d10c577c8a5c43f31f3dd1cc1708" Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.129825 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a9c20e2309024d68e204aa480c5b00da745d10c577c8a5c43f31f3dd1cc1708"} err="failed to get container status \"5a9c20e2309024d68e204aa480c5b00da745d10c577c8a5c43f31f3dd1cc1708\": rpc error: code = NotFound desc = could not find container 
\"5a9c20e2309024d68e204aa480c5b00da745d10c577c8a5c43f31f3dd1cc1708\": container with ID starting with 5a9c20e2309024d68e204aa480c5b00da745d10c577c8a5c43f31f3dd1cc1708 not found: ID does not exist" Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.129844 4778 scope.go:117] "RemoveContainer" containerID="4d8ee638d3fd1a7f6d883064a1a7249230426c185142df8525a13d3d30ce4e1d" Mar 18 09:38:07 crc kubenswrapper[4778]: E0318 09:38:07.130373 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d8ee638d3fd1a7f6d883064a1a7249230426c185142df8525a13d3d30ce4e1d\": container with ID starting with 4d8ee638d3fd1a7f6d883064a1a7249230426c185142df8525a13d3d30ce4e1d not found: ID does not exist" containerID="4d8ee638d3fd1a7f6d883064a1a7249230426c185142df8525a13d3d30ce4e1d" Mar 18 09:38:07 crc kubenswrapper[4778]: I0318 09:38:07.130401 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d8ee638d3fd1a7f6d883064a1a7249230426c185142df8525a13d3d30ce4e1d"} err="failed to get container status \"4d8ee638d3fd1a7f6d883064a1a7249230426c185142df8525a13d3d30ce4e1d\": rpc error: code = NotFound desc = could not find container \"4d8ee638d3fd1a7f6d883064a1a7249230426c185142df8525a13d3d30ce4e1d\": container with ID starting with 4d8ee638d3fd1a7f6d883064a1a7249230426c185142df8525a13d3d30ce4e1d not found: ID does not exist" Mar 18 09:38:08 crc kubenswrapper[4778]: I0318 09:38:08.206031 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d79e7e2-9db4-4307-8411-18b74d60b1b7" path="/var/lib/kubelet/pods/4d79e7e2-9db4-4307-8411-18b74d60b1b7/volumes" Mar 18 09:38:19 crc kubenswrapper[4778]: E0318 09:38:19.190243 4778 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.70:51892->38.102.83.70:35463: write tcp 38.102.83.70:51892->38.102.83.70:35463: write: broken pipe Mar 18 09:38:31 crc kubenswrapper[4778]: I0318 
09:38:31.337978 4778 scope.go:117] "RemoveContainer" containerID="c8ccd760df68dbd5ce4bef875e9b41962b50e1c9d6413d0f1f66a324748d7c49" Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.065830 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.076572 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.084608 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.090940 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.097625 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.118647 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.129216 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dxfn2"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.137705 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.145624 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fg64z"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.158230 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-zd2d7"] 
Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.170616 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.180292 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xhchk"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.188220 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8b7b"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.195547 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.201151 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s6gjf"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.209123 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-4rmrt"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.216090 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jgb5c"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.221544 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z5c6p"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.227587 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-h9wgf"] Mar 18 09:38:39 crc kubenswrapper[4778]: I0318 09:38:39.233096 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-zd2d7"] Mar 18 09:38:40 crc kubenswrapper[4778]: I0318 09:38:40.206891 4778 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="0b7ef620-077d-4a38-90ed-fed05ccba5d2" path="/var/lib/kubelet/pods/0b7ef620-077d-4a38-90ed-fed05ccba5d2/volumes" Mar 18 09:38:40 crc kubenswrapper[4778]: I0318 09:38:40.208084 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="154a89df-1c2e-4f86-bbf3-827d6443c04a" path="/var/lib/kubelet/pods/154a89df-1c2e-4f86-bbf3-827d6443c04a/volumes" Mar 18 09:38:40 crc kubenswrapper[4778]: I0318 09:38:40.209434 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fe04bef-41cb-47c4-8031-141f8809e8cb" path="/var/lib/kubelet/pods/2fe04bef-41cb-47c4-8031-141f8809e8cb/volumes" Mar 18 09:38:40 crc kubenswrapper[4778]: I0318 09:38:40.210669 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="637a34d6-3284-4d4f-ab9b-70fd0b4d29b1" path="/var/lib/kubelet/pods/637a34d6-3284-4d4f-ab9b-70fd0b4d29b1/volumes" Mar 18 09:38:40 crc kubenswrapper[4778]: I0318 09:38:40.212983 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e3f07f1-8381-48a9-8ebb-9cd3a821783f" path="/var/lib/kubelet/pods/8e3f07f1-8381-48a9-8ebb-9cd3a821783f/volumes" Mar 18 09:38:40 crc kubenswrapper[4778]: I0318 09:38:40.215065 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803" path="/var/lib/kubelet/pods/9c1d5f8b-1e33-427a-a3dc-f0ed91cf2803/volumes" Mar 18 09:38:40 crc kubenswrapper[4778]: I0318 09:38:40.215781 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b33de03b-23ec-40c0-b309-0dd2024caf71" path="/var/lib/kubelet/pods/b33de03b-23ec-40c0-b309-0dd2024caf71/volumes" Mar 18 09:38:40 crc kubenswrapper[4778]: I0318 09:38:40.216487 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b989f767-d1ba-49fe-aebb-6aef120e0e22" path="/var/lib/kubelet/pods/b989f767-d1ba-49fe-aebb-6aef120e0e22/volumes" Mar 18 09:38:40 crc kubenswrapper[4778]: I0318 09:38:40.217800 4778 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="bf3315b0-1ace-422a-8049-3fd13fe46e65" path="/var/lib/kubelet/pods/bf3315b0-1ace-422a-8049-3fd13fe46e65/volumes" Mar 18 09:38:40 crc kubenswrapper[4778]: I0318 09:38:40.218563 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d85dc0a9-a7b0-4715-bc9d-974ac7657337" path="/var/lib/kubelet/pods/d85dc0a9-a7b0-4715-bc9d-974ac7657337/volumes" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.695349 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx"] Mar 18 09:38:44 crc kubenswrapper[4778]: E0318 09:38:44.696848 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d79e7e2-9db4-4307-8411-18b74d60b1b7" containerName="extract-content" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.696874 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d79e7e2-9db4-4307-8411-18b74d60b1b7" containerName="extract-content" Mar 18 09:38:44 crc kubenswrapper[4778]: E0318 09:38:44.696906 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d79e7e2-9db4-4307-8411-18b74d60b1b7" containerName="extract-utilities" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.696918 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d79e7e2-9db4-4307-8411-18b74d60b1b7" containerName="extract-utilities" Mar 18 09:38:44 crc kubenswrapper[4778]: E0318 09:38:44.696940 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea5e508-8702-4085-b99d-43524ffd7dac" containerName="registry-server" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.696952 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea5e508-8702-4085-b99d-43524ffd7dac" containerName="registry-server" Mar 18 09:38:44 crc kubenswrapper[4778]: E0318 09:38:44.696975 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea5e508-8702-4085-b99d-43524ffd7dac" containerName="extract-content" Mar 18 09:38:44 crc 
kubenswrapper[4778]: I0318 09:38:44.696986 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea5e508-8702-4085-b99d-43524ffd7dac" containerName="extract-content" Mar 18 09:38:44 crc kubenswrapper[4778]: E0318 09:38:44.697004 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d79e7e2-9db4-4307-8411-18b74d60b1b7" containerName="registry-server" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.697014 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d79e7e2-9db4-4307-8411-18b74d60b1b7" containerName="registry-server" Mar 18 09:38:44 crc kubenswrapper[4778]: E0318 09:38:44.697032 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0548485b-4f03-47ba-8a13-4e3522451291" containerName="oc" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.697043 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0548485b-4f03-47ba-8a13-4e3522451291" containerName="oc" Mar 18 09:38:44 crc kubenswrapper[4778]: E0318 09:38:44.697063 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea5e508-8702-4085-b99d-43524ffd7dac" containerName="extract-utilities" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.697073 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea5e508-8702-4085-b99d-43524ffd7dac" containerName="extract-utilities" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.697452 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d79e7e2-9db4-4307-8411-18b74d60b1b7" containerName="registry-server" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.697480 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0548485b-4f03-47ba-8a13-4e3522451291" containerName="oc" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.697511 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea5e508-8702-4085-b99d-43524ffd7dac" containerName="registry-server" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.698534 
4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.710246 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.710365 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.710365 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.710648 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.710688 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx"] Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.710834 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.840689 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.840795 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-repo-setup-combined-ca-bundle\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.840925 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjgc8\" (UniqueName: \"kubernetes.io/projected/136dbfab-32f1-40ee-b685-74411fbc06ba-kube-api-access-zjgc8\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.841027 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.841101 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.943427 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 
crc kubenswrapper[4778]: I0318 09:38:44.943546 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.943723 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjgc8\" (UniqueName: \"kubernetes.io/projected/136dbfab-32f1-40ee-b685-74411fbc06ba-kube-api-access-zjgc8\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.944445 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.944717 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.957896 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-ceph\") 
pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.958060 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.958442 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.959984 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:44 crc kubenswrapper[4778]: I0318 09:38:44.969244 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjgc8\" (UniqueName: \"kubernetes.io/projected/136dbfab-32f1-40ee-b685-74411fbc06ba-kube-api-access-zjgc8\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:45 crc kubenswrapper[4778]: I0318 
09:38:45.036052 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:45 crc kubenswrapper[4778]: I0318 09:38:45.649426 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx"] Mar 18 09:38:46 crc kubenswrapper[4778]: I0318 09:38:46.415076 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" event={"ID":"136dbfab-32f1-40ee-b685-74411fbc06ba","Type":"ContainerStarted","Data":"41c9d610db46c8d317c992030f024e19d0b3f9df9d698ee26c015c45a0a0b2aa"} Mar 18 09:38:46 crc kubenswrapper[4778]: I0318 09:38:46.415470 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" event={"ID":"136dbfab-32f1-40ee-b685-74411fbc06ba","Type":"ContainerStarted","Data":"e7e8db0b2fd629dec04b95406879d289021ad93e6d064630d6901e2c718d4842"} Mar 18 09:38:46 crc kubenswrapper[4778]: I0318 09:38:46.437930 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" podStartSLOduration=1.996853192 podStartE2EDuration="2.437908633s" podCreationTimestamp="2026-03-18 09:38:44 +0000 UTC" firstStartedPulling="2026-03-18 09:38:45.658507737 +0000 UTC m=+2192.233252577" lastFinishedPulling="2026-03-18 09:38:46.099563178 +0000 UTC m=+2192.674308018" observedRunningTime="2026-03-18 09:38:46.432863586 +0000 UTC m=+2193.007608436" watchObservedRunningTime="2026-03-18 09:38:46.437908633 +0000 UTC m=+2193.012653473" Mar 18 09:38:57 crc kubenswrapper[4778]: I0318 09:38:57.519957 4778 generic.go:334] "Generic (PLEG): container finished" podID="136dbfab-32f1-40ee-b685-74411fbc06ba" containerID="41c9d610db46c8d317c992030f024e19d0b3f9df9d698ee26c015c45a0a0b2aa" exitCode=0 Mar 18 09:38:57 crc kubenswrapper[4778]: I0318 09:38:57.520058 
4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" event={"ID":"136dbfab-32f1-40ee-b685-74411fbc06ba","Type":"ContainerDied","Data":"41c9d610db46c8d317c992030f024e19d0b3f9df9d698ee26c015c45a0a0b2aa"} Mar 18 09:38:58 crc kubenswrapper[4778]: I0318 09:38:58.951217 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.034095 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-ceph\") pod \"136dbfab-32f1-40ee-b685-74411fbc06ba\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.034429 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-inventory\") pod \"136dbfab-32f1-40ee-b685-74411fbc06ba\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.034640 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-repo-setup-combined-ca-bundle\") pod \"136dbfab-32f1-40ee-b685-74411fbc06ba\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.034879 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjgc8\" (UniqueName: \"kubernetes.io/projected/136dbfab-32f1-40ee-b685-74411fbc06ba-kube-api-access-zjgc8\") pod \"136dbfab-32f1-40ee-b685-74411fbc06ba\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.034957 4778 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-ssh-key-openstack-edpm-ipam\") pod \"136dbfab-32f1-40ee-b685-74411fbc06ba\" (UID: \"136dbfab-32f1-40ee-b685-74411fbc06ba\") " Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.041173 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/136dbfab-32f1-40ee-b685-74411fbc06ba-kube-api-access-zjgc8" (OuterVolumeSpecName: "kube-api-access-zjgc8") pod "136dbfab-32f1-40ee-b685-74411fbc06ba" (UID: "136dbfab-32f1-40ee-b685-74411fbc06ba"). InnerVolumeSpecName "kube-api-access-zjgc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.041711 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-ceph" (OuterVolumeSpecName: "ceph") pod "136dbfab-32f1-40ee-b685-74411fbc06ba" (UID: "136dbfab-32f1-40ee-b685-74411fbc06ba"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.048427 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "136dbfab-32f1-40ee-b685-74411fbc06ba" (UID: "136dbfab-32f1-40ee-b685-74411fbc06ba"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.071543 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "136dbfab-32f1-40ee-b685-74411fbc06ba" (UID: "136dbfab-32f1-40ee-b685-74411fbc06ba"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.076721 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-inventory" (OuterVolumeSpecName: "inventory") pod "136dbfab-32f1-40ee-b685-74411fbc06ba" (UID: "136dbfab-32f1-40ee-b685-74411fbc06ba"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.138487 4778 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.138534 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjgc8\" (UniqueName: \"kubernetes.io/projected/136dbfab-32f1-40ee-b685-74411fbc06ba-kube-api-access-zjgc8\") on node \"crc\" DevicePath \"\"" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.138548 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.138563 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.138576 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/136dbfab-32f1-40ee-b685-74411fbc06ba-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.546802 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" event={"ID":"136dbfab-32f1-40ee-b685-74411fbc06ba","Type":"ContainerDied","Data":"e7e8db0b2fd629dec04b95406879d289021ad93e6d064630d6901e2c718d4842"} Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.546868 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7e8db0b2fd629dec04b95406879d289021ad93e6d064630d6901e2c718d4842" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.546960 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.663984 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk"] Mar 18 09:38:59 crc kubenswrapper[4778]: E0318 09:38:59.664865 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="136dbfab-32f1-40ee-b685-74411fbc06ba" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.664892 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="136dbfab-32f1-40ee-b685-74411fbc06ba" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.665149 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="136dbfab-32f1-40ee-b685-74411fbc06ba" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.665905 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.669400 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.677426 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.677600 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.680593 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.680851 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.687418 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk"] Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.754108 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.754486 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgsrs\" (UniqueName: \"kubernetes.io/projected/f4bddd5e-314b-49c0-963c-107e6798c40e-kube-api-access-sgsrs\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: 
\"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.754586 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.754869 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.755064 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.857445 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgsrs\" (UniqueName: \"kubernetes.io/projected/f4bddd5e-314b-49c0-963c-107e6798c40e-kube-api-access-sgsrs\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.857516 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.857575 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.857634 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.857688 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.863471 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-ssh-key-openstack-edpm-ipam\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.863601 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.868085 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.870756 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.881041 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgsrs\" (UniqueName: \"kubernetes.io/projected/f4bddd5e-314b-49c0-963c-107e6798c40e-kube-api-access-sgsrs\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:38:59 crc kubenswrapper[4778]: I0318 09:38:59.992231 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" Mar 18 09:39:00 crc kubenswrapper[4778]: I0318 09:39:00.619659 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk"] Mar 18 09:39:00 crc kubenswrapper[4778]: W0318 09:39:00.621646 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4bddd5e_314b_49c0_963c_107e6798c40e.slice/crio-caac60756c71869f718f55d75bf41c099232c88f6ff497cf326f1789ff881da3 WatchSource:0}: Error finding container caac60756c71869f718f55d75bf41c099232c88f6ff497cf326f1789ff881da3: Status 404 returned error can't find the container with id caac60756c71869f718f55d75bf41c099232c88f6ff497cf326f1789ff881da3 Mar 18 09:39:01 crc kubenswrapper[4778]: I0318 09:39:01.579163 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" event={"ID":"f4bddd5e-314b-49c0-963c-107e6798c40e","Type":"ContainerStarted","Data":"f2ea8edc9c4961bcef17cef1e281edb3e1211a2e5c2d2551a850dcaae9c256c3"} Mar 18 09:39:01 crc kubenswrapper[4778]: I0318 09:39:01.579603 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" event={"ID":"f4bddd5e-314b-49c0-963c-107e6798c40e","Type":"ContainerStarted","Data":"caac60756c71869f718f55d75bf41c099232c88f6ff497cf326f1789ff881da3"} Mar 18 09:39:01 crc kubenswrapper[4778]: I0318 09:39:01.608563 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" podStartSLOduration=2.117923818 podStartE2EDuration="2.608531832s" podCreationTimestamp="2026-03-18 09:38:59 +0000 UTC" firstStartedPulling="2026-03-18 09:39:00.624129857 +0000 UTC m=+2207.198874697" lastFinishedPulling="2026-03-18 09:39:01.114737881 +0000 UTC m=+2207.689482711" 
observedRunningTime="2026-03-18 09:39:01.605921611 +0000 UTC m=+2208.180666531" watchObservedRunningTime="2026-03-18 09:39:01.608531832 +0000 UTC m=+2208.183276712" Mar 18 09:39:31 crc kubenswrapper[4778]: I0318 09:39:31.431640 4778 scope.go:117] "RemoveContainer" containerID="fff08d52bb8eb70989a438cee10bfa00abf8f2f0d3255a5a38ba7fa52ac6fd7d" Mar 18 09:39:31 crc kubenswrapper[4778]: I0318 09:39:31.466455 4778 scope.go:117] "RemoveContainer" containerID="a0e4df9a818fec5131c555c760ba72656483292d6411393207bbb36928547cc0" Mar 18 09:39:31 crc kubenswrapper[4778]: I0318 09:39:31.527261 4778 scope.go:117] "RemoveContainer" containerID="e73cd1a7f2cc415ed8844ab76dbcafbf6d6af8f6453df69ed69f66ec8f314dde" Mar 18 09:39:31 crc kubenswrapper[4778]: I0318 09:39:31.596073 4778 scope.go:117] "RemoveContainer" containerID="df748c22cbd9cfd719213cf439a446ed8f2c405ec832bdc5f38d5aacebbce9a9" Mar 18 09:39:31 crc kubenswrapper[4778]: I0318 09:39:31.672560 4778 scope.go:117] "RemoveContainer" containerID="fd288c3256024cadd2bb212c37d37772aadd4cda1a6ce7e57e524f08cb5c87a0" Mar 18 09:39:31 crc kubenswrapper[4778]: I0318 09:39:31.715384 4778 scope.go:117] "RemoveContainer" containerID="624102bc1fa0c3850d7e6900b631b277f6bac2b359e2ccf5e40af6d9d87d6742" Mar 18 09:39:31 crc kubenswrapper[4778]: I0318 09:39:31.751565 4778 scope.go:117] "RemoveContainer" containerID="9d9ac3d0a2c7513d5a45e8e6d19a1411862166adf069327b3e174ecc2c3c3a28" Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.418380 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-khwnk"] Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.421054 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.441836 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-khwnk"] Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.533370 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee28de9-04d9-4210-87f7-b51710f5befc-catalog-content\") pod \"redhat-operators-khwnk\" (UID: \"eee28de9-04d9-4210-87f7-b51710f5befc\") " pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.533474 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee28de9-04d9-4210-87f7-b51710f5befc-utilities\") pod \"redhat-operators-khwnk\" (UID: \"eee28de9-04d9-4210-87f7-b51710f5befc\") " pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.533520 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhchc\" (UniqueName: \"kubernetes.io/projected/eee28de9-04d9-4210-87f7-b51710f5befc-kube-api-access-jhchc\") pod \"redhat-operators-khwnk\" (UID: \"eee28de9-04d9-4210-87f7-b51710f5befc\") " pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.634583 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee28de9-04d9-4210-87f7-b51710f5befc-utilities\") pod \"redhat-operators-khwnk\" (UID: \"eee28de9-04d9-4210-87f7-b51710f5befc\") " pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.634643 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jhchc\" (UniqueName: \"kubernetes.io/projected/eee28de9-04d9-4210-87f7-b51710f5befc-kube-api-access-jhchc\") pod \"redhat-operators-khwnk\" (UID: \"eee28de9-04d9-4210-87f7-b51710f5befc\") " pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.634739 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee28de9-04d9-4210-87f7-b51710f5befc-catalog-content\") pod \"redhat-operators-khwnk\" (UID: \"eee28de9-04d9-4210-87f7-b51710f5befc\") " pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.635247 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee28de9-04d9-4210-87f7-b51710f5befc-catalog-content\") pod \"redhat-operators-khwnk\" (UID: \"eee28de9-04d9-4210-87f7-b51710f5befc\") " pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.635456 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee28de9-04d9-4210-87f7-b51710f5befc-utilities\") pod \"redhat-operators-khwnk\" (UID: \"eee28de9-04d9-4210-87f7-b51710f5befc\") " pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.662413 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhchc\" (UniqueName: \"kubernetes.io/projected/eee28de9-04d9-4210-87f7-b51710f5befc-kube-api-access-jhchc\") pod \"redhat-operators-khwnk\" (UID: \"eee28de9-04d9-4210-87f7-b51710f5befc\") " pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:37 crc kubenswrapper[4778]: I0318 09:39:37.778809 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:38 crc kubenswrapper[4778]: I0318 09:39:38.230430 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-khwnk"] Mar 18 09:39:38 crc kubenswrapper[4778]: I0318 09:39:38.914082 4778 generic.go:334] "Generic (PLEG): container finished" podID="eee28de9-04d9-4210-87f7-b51710f5befc" containerID="12a6e6236e8a16f3aaa735ba2f3c197064b5c0a3684f642388ddc5c555b02a3d" exitCode=0 Mar 18 09:39:38 crc kubenswrapper[4778]: I0318 09:39:38.914189 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khwnk" event={"ID":"eee28de9-04d9-4210-87f7-b51710f5befc","Type":"ContainerDied","Data":"12a6e6236e8a16f3aaa735ba2f3c197064b5c0a3684f642388ddc5c555b02a3d"} Mar 18 09:39:38 crc kubenswrapper[4778]: I0318 09:39:38.914348 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khwnk" event={"ID":"eee28de9-04d9-4210-87f7-b51710f5befc","Type":"ContainerStarted","Data":"52dcfc339804749e6cfb13e84a2f1b2c8e710e66910f70bd95fb38bf89d675f7"} Mar 18 09:39:39 crc kubenswrapper[4778]: I0318 09:39:39.923062 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khwnk" event={"ID":"eee28de9-04d9-4210-87f7-b51710f5befc","Type":"ContainerStarted","Data":"8a4ef21bf52d16105c20ef794ba9555c63dfdcfcfb1ddfa5e956160429d7bb15"} Mar 18 09:39:41 crc kubenswrapper[4778]: I0318 09:39:41.940835 4778 generic.go:334] "Generic (PLEG): container finished" podID="eee28de9-04d9-4210-87f7-b51710f5befc" containerID="8a4ef21bf52d16105c20ef794ba9555c63dfdcfcfb1ddfa5e956160429d7bb15" exitCode=0 Mar 18 09:39:41 crc kubenswrapper[4778]: I0318 09:39:41.940903 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khwnk" 
event={"ID":"eee28de9-04d9-4210-87f7-b51710f5befc","Type":"ContainerDied","Data":"8a4ef21bf52d16105c20ef794ba9555c63dfdcfcfb1ddfa5e956160429d7bb15"} Mar 18 09:39:42 crc kubenswrapper[4778]: I0318 09:39:42.951943 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khwnk" event={"ID":"eee28de9-04d9-4210-87f7-b51710f5befc","Type":"ContainerStarted","Data":"14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd"} Mar 18 09:39:42 crc kubenswrapper[4778]: I0318 09:39:42.971540 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-khwnk" podStartSLOduration=2.523971043 podStartE2EDuration="5.971518961s" podCreationTimestamp="2026-03-18 09:39:37 +0000 UTC" firstStartedPulling="2026-03-18 09:39:38.915472201 +0000 UTC m=+2245.490217041" lastFinishedPulling="2026-03-18 09:39:42.363020119 +0000 UTC m=+2248.937764959" observedRunningTime="2026-03-18 09:39:42.970090572 +0000 UTC m=+2249.544835412" watchObservedRunningTime="2026-03-18 09:39:42.971518961 +0000 UTC m=+2249.546263811" Mar 18 09:39:47 crc kubenswrapper[4778]: I0318 09:39:47.779679 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:47 crc kubenswrapper[4778]: I0318 09:39:47.780269 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:48 crc kubenswrapper[4778]: I0318 09:39:48.829043 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-khwnk" podUID="eee28de9-04d9-4210-87f7-b51710f5befc" containerName="registry-server" probeResult="failure" output=< Mar 18 09:39:48 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 09:39:48 crc kubenswrapper[4778]: > Mar 18 09:39:57 crc kubenswrapper[4778]: I0318 09:39:57.825694 4778 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:57 crc kubenswrapper[4778]: I0318 09:39:57.878856 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:58 crc kubenswrapper[4778]: I0318 09:39:58.065975 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-khwnk"] Mar 18 09:39:59 crc kubenswrapper[4778]: I0318 09:39:59.090455 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-khwnk" podUID="eee28de9-04d9-4210-87f7-b51710f5befc" containerName="registry-server" containerID="cri-o://14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd" gracePeriod=2 Mar 18 09:39:59 crc kubenswrapper[4778]: I0318 09:39:59.628250 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:39:59 crc kubenswrapper[4778]: I0318 09:39:59.702382 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee28de9-04d9-4210-87f7-b51710f5befc-catalog-content\") pod \"eee28de9-04d9-4210-87f7-b51710f5befc\" (UID: \"eee28de9-04d9-4210-87f7-b51710f5befc\") " Mar 18 09:39:59 crc kubenswrapper[4778]: I0318 09:39:59.702443 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee28de9-04d9-4210-87f7-b51710f5befc-utilities\") pod \"eee28de9-04d9-4210-87f7-b51710f5befc\" (UID: \"eee28de9-04d9-4210-87f7-b51710f5befc\") " Mar 18 09:39:59 crc kubenswrapper[4778]: I0318 09:39:59.702489 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhchc\" (UniqueName: \"kubernetes.io/projected/eee28de9-04d9-4210-87f7-b51710f5befc-kube-api-access-jhchc\") pod 
\"eee28de9-04d9-4210-87f7-b51710f5befc\" (UID: \"eee28de9-04d9-4210-87f7-b51710f5befc\") " Mar 18 09:39:59 crc kubenswrapper[4778]: I0318 09:39:59.704155 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eee28de9-04d9-4210-87f7-b51710f5befc-utilities" (OuterVolumeSpecName: "utilities") pod "eee28de9-04d9-4210-87f7-b51710f5befc" (UID: "eee28de9-04d9-4210-87f7-b51710f5befc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:39:59 crc kubenswrapper[4778]: I0318 09:39:59.711103 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eee28de9-04d9-4210-87f7-b51710f5befc-kube-api-access-jhchc" (OuterVolumeSpecName: "kube-api-access-jhchc") pod "eee28de9-04d9-4210-87f7-b51710f5befc" (UID: "eee28de9-04d9-4210-87f7-b51710f5befc"). InnerVolumeSpecName "kube-api-access-jhchc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:39:59 crc kubenswrapper[4778]: I0318 09:39:59.805969 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee28de9-04d9-4210-87f7-b51710f5befc-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:39:59 crc kubenswrapper[4778]: I0318 09:39:59.806507 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhchc\" (UniqueName: \"kubernetes.io/projected/eee28de9-04d9-4210-87f7-b51710f5befc-kube-api-access-jhchc\") on node \"crc\" DevicePath \"\"" Mar 18 09:39:59 crc kubenswrapper[4778]: I0318 09:39:59.826140 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eee28de9-04d9-4210-87f7-b51710f5befc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eee28de9-04d9-4210-87f7-b51710f5befc" (UID: "eee28de9-04d9-4210-87f7-b51710f5befc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:39:59 crc kubenswrapper[4778]: I0318 09:39:59.908515 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee28de9-04d9-4210-87f7-b51710f5befc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.099396 4778 generic.go:334] "Generic (PLEG): container finished" podID="eee28de9-04d9-4210-87f7-b51710f5befc" containerID="14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd" exitCode=0 Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.099500 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-khwnk" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.100454 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khwnk" event={"ID":"eee28de9-04d9-4210-87f7-b51710f5befc","Type":"ContainerDied","Data":"14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd"} Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.100504 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khwnk" event={"ID":"eee28de9-04d9-4210-87f7-b51710f5befc","Type":"ContainerDied","Data":"52dcfc339804749e6cfb13e84a2f1b2c8e710e66910f70bd95fb38bf89d675f7"} Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.100548 4778 scope.go:117] "RemoveContainer" containerID="14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.142982 4778 scope.go:117] "RemoveContainer" containerID="8a4ef21bf52d16105c20ef794ba9555c63dfdcfcfb1ddfa5e956160429d7bb15" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.155549 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-khwnk"] Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 
09:40:00.164110 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563780-vgggq"] Mar 18 09:40:00 crc kubenswrapper[4778]: E0318 09:40:00.164559 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee28de9-04d9-4210-87f7-b51710f5befc" containerName="registry-server" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.164578 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee28de9-04d9-4210-87f7-b51710f5befc" containerName="registry-server" Mar 18 09:40:00 crc kubenswrapper[4778]: E0318 09:40:00.164596 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee28de9-04d9-4210-87f7-b51710f5befc" containerName="extract-content" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.164602 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee28de9-04d9-4210-87f7-b51710f5befc" containerName="extract-content" Mar 18 09:40:00 crc kubenswrapper[4778]: E0318 09:40:00.164627 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee28de9-04d9-4210-87f7-b51710f5befc" containerName="extract-utilities" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.164633 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee28de9-04d9-4210-87f7-b51710f5befc" containerName="extract-utilities" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.164800 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="eee28de9-04d9-4210-87f7-b51710f5befc" containerName="registry-server" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.165478 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563780-vgggq" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.168007 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.168367 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.169506 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.172805 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-khwnk"] Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.173034 4778 scope.go:117] "RemoveContainer" containerID="12a6e6236e8a16f3aaa735ba2f3c197064b5c0a3684f642388ddc5c555b02a3d" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.183056 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563780-vgggq"] Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.207074 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eee28de9-04d9-4210-87f7-b51710f5befc" path="/var/lib/kubelet/pods/eee28de9-04d9-4210-87f7-b51710f5befc/volumes" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.217507 4778 scope.go:117] "RemoveContainer" containerID="14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd" Mar 18 09:40:00 crc kubenswrapper[4778]: E0318 09:40:00.218478 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd\": container with ID starting with 14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd not found: ID does not exist" 
containerID="14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.218518 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd"} err="failed to get container status \"14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd\": rpc error: code = NotFound desc = could not find container \"14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd\": container with ID starting with 14e3bbc7f8cfe4e0e2e7524615f738b6e667491baf8de4424c051d17c80402fd not found: ID does not exist" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.218547 4778 scope.go:117] "RemoveContainer" containerID="8a4ef21bf52d16105c20ef794ba9555c63dfdcfcfb1ddfa5e956160429d7bb15" Mar 18 09:40:00 crc kubenswrapper[4778]: E0318 09:40:00.218955 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a4ef21bf52d16105c20ef794ba9555c63dfdcfcfb1ddfa5e956160429d7bb15\": container with ID starting with 8a4ef21bf52d16105c20ef794ba9555c63dfdcfcfb1ddfa5e956160429d7bb15 not found: ID does not exist" containerID="8a4ef21bf52d16105c20ef794ba9555c63dfdcfcfb1ddfa5e956160429d7bb15" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.218985 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a4ef21bf52d16105c20ef794ba9555c63dfdcfcfb1ddfa5e956160429d7bb15"} err="failed to get container status \"8a4ef21bf52d16105c20ef794ba9555c63dfdcfcfb1ddfa5e956160429d7bb15\": rpc error: code = NotFound desc = could not find container \"8a4ef21bf52d16105c20ef794ba9555c63dfdcfcfb1ddfa5e956160429d7bb15\": container with ID starting with 8a4ef21bf52d16105c20ef794ba9555c63dfdcfcfb1ddfa5e956160429d7bb15 not found: ID does not exist" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.219001 4778 scope.go:117] 
"RemoveContainer" containerID="12a6e6236e8a16f3aaa735ba2f3c197064b5c0a3684f642388ddc5c555b02a3d" Mar 18 09:40:00 crc kubenswrapper[4778]: E0318 09:40:00.219233 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12a6e6236e8a16f3aaa735ba2f3c197064b5c0a3684f642388ddc5c555b02a3d\": container with ID starting with 12a6e6236e8a16f3aaa735ba2f3c197064b5c0a3684f642388ddc5c555b02a3d not found: ID does not exist" containerID="12a6e6236e8a16f3aaa735ba2f3c197064b5c0a3684f642388ddc5c555b02a3d" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.219266 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12a6e6236e8a16f3aaa735ba2f3c197064b5c0a3684f642388ddc5c555b02a3d"} err="failed to get container status \"12a6e6236e8a16f3aaa735ba2f3c197064b5c0a3684f642388ddc5c555b02a3d\": rpc error: code = NotFound desc = could not find container \"12a6e6236e8a16f3aaa735ba2f3c197064b5c0a3684f642388ddc5c555b02a3d\": container with ID starting with 12a6e6236e8a16f3aaa735ba2f3c197064b5c0a3684f642388ddc5c555b02a3d not found: ID does not exist" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.317513 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgrn9\" (UniqueName: \"kubernetes.io/projected/ed393452-0d17-4c60-b37b-544b21c09da1-kube-api-access-sgrn9\") pod \"auto-csr-approver-29563780-vgggq\" (UID: \"ed393452-0d17-4c60-b37b-544b21c09da1\") " pod="openshift-infra/auto-csr-approver-29563780-vgggq" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.419439 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgrn9\" (UniqueName: \"kubernetes.io/projected/ed393452-0d17-4c60-b37b-544b21c09da1-kube-api-access-sgrn9\") pod \"auto-csr-approver-29563780-vgggq\" (UID: \"ed393452-0d17-4c60-b37b-544b21c09da1\") " 
pod="openshift-infra/auto-csr-approver-29563780-vgggq" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.444679 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgrn9\" (UniqueName: \"kubernetes.io/projected/ed393452-0d17-4c60-b37b-544b21c09da1-kube-api-access-sgrn9\") pod \"auto-csr-approver-29563780-vgggq\" (UID: \"ed393452-0d17-4c60-b37b-544b21c09da1\") " pod="openshift-infra/auto-csr-approver-29563780-vgggq" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.571155 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563780-vgggq" Mar 18 09:40:00 crc kubenswrapper[4778]: I0318 09:40:00.999400 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563780-vgggq"] Mar 18 09:40:01 crc kubenswrapper[4778]: I0318 09:40:01.107921 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563780-vgggq" event={"ID":"ed393452-0d17-4c60-b37b-544b21c09da1","Type":"ContainerStarted","Data":"114b3c91b330bc93cc75643b229353a42c117b6ac1efb3eeced8a0d86a2ebe7c"} Mar 18 09:40:03 crc kubenswrapper[4778]: I0318 09:40:03.134739 4778 generic.go:334] "Generic (PLEG): container finished" podID="ed393452-0d17-4c60-b37b-544b21c09da1" containerID="9f45f4032f3621f6cd43ea95d13369122ace0eb37b6189c6643a14332da3a74a" exitCode=0 Mar 18 09:40:03 crc kubenswrapper[4778]: I0318 09:40:03.134781 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563780-vgggq" event={"ID":"ed393452-0d17-4c60-b37b-544b21c09da1","Type":"ContainerDied","Data":"9f45f4032f3621f6cd43ea95d13369122ace0eb37b6189c6643a14332da3a74a"} Mar 18 09:40:04 crc kubenswrapper[4778]: I0318 09:40:04.500426 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563780-vgggq"
Mar 18 09:40:04 crc kubenswrapper[4778]: I0318 09:40:04.599427 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgrn9\" (UniqueName: \"kubernetes.io/projected/ed393452-0d17-4c60-b37b-544b21c09da1-kube-api-access-sgrn9\") pod \"ed393452-0d17-4c60-b37b-544b21c09da1\" (UID: \"ed393452-0d17-4c60-b37b-544b21c09da1\") "
Mar 18 09:40:04 crc kubenswrapper[4778]: I0318 09:40:04.606018 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed393452-0d17-4c60-b37b-544b21c09da1-kube-api-access-sgrn9" (OuterVolumeSpecName: "kube-api-access-sgrn9") pod "ed393452-0d17-4c60-b37b-544b21c09da1" (UID: "ed393452-0d17-4c60-b37b-544b21c09da1"). InnerVolumeSpecName "kube-api-access-sgrn9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:40:04 crc kubenswrapper[4778]: I0318 09:40:04.702022 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgrn9\" (UniqueName: \"kubernetes.io/projected/ed393452-0d17-4c60-b37b-544b21c09da1-kube-api-access-sgrn9\") on node \"crc\" DevicePath \"\""
Mar 18 09:40:05 crc kubenswrapper[4778]: I0318 09:40:05.155882 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563780-vgggq" event={"ID":"ed393452-0d17-4c60-b37b-544b21c09da1","Type":"ContainerDied","Data":"114b3c91b330bc93cc75643b229353a42c117b6ac1efb3eeced8a0d86a2ebe7c"}
Mar 18 09:40:05 crc kubenswrapper[4778]: I0318 09:40:05.155924 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563780-vgggq"
Mar 18 09:40:05 crc kubenswrapper[4778]: I0318 09:40:05.155928 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="114b3c91b330bc93cc75643b229353a42c117b6ac1efb3eeced8a0d86a2ebe7c"
Mar 18 09:40:05 crc kubenswrapper[4778]: I0318 09:40:05.574164 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563774-jn4n5"]
Mar 18 09:40:05 crc kubenswrapper[4778]: I0318 09:40:05.583745 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563774-jn4n5"]
Mar 18 09:40:06 crc kubenswrapper[4778]: I0318 09:40:06.198993 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="315606e3-7197-4234-b672-400a86339d27" path="/var/lib/kubelet/pods/315606e3-7197-4234-b672-400a86339d27/volumes"
Mar 18 09:40:30 crc kubenswrapper[4778]: I0318 09:40:30.147783 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 09:40:30 crc kubenswrapper[4778]: I0318 09:40:30.148258 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 09:40:31 crc kubenswrapper[4778]: I0318 09:40:31.895085 4778 scope.go:117] "RemoveContainer" containerID="3e7ed49b01f49625749fbe5496f4ace13851a2f40c5fbcd9633d28b842edcbb0"
Mar 18 09:40:31 crc kubenswrapper[4778]: I0318 09:40:31.964569 4778 scope.go:117] "RemoveContainer" containerID="16287400dde13e1c43e43b5d82fb77152a697e54c70364b3f589e427eeb0ea66"
Mar 18 09:40:32 crc kubenswrapper[4778]: I0318 09:40:32.014705 4778 scope.go:117] "RemoveContainer" containerID="40ec8676e1c1ccc56a543cb5a42de5f34fb52e822133f45c8ebee2d755dab39f"
Mar 18 09:40:39 crc kubenswrapper[4778]: I0318 09:40:39.467281 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4bddd5e-314b-49c0-963c-107e6798c40e" containerID="f2ea8edc9c4961bcef17cef1e281edb3e1211a2e5c2d2551a850dcaae9c256c3" exitCode=0
Mar 18 09:40:39 crc kubenswrapper[4778]: I0318 09:40:39.467351 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" event={"ID":"f4bddd5e-314b-49c0-963c-107e6798c40e","Type":"ContainerDied","Data":"f2ea8edc9c4961bcef17cef1e281edb3e1211a2e5c2d2551a850dcaae9c256c3"}
Mar 18 09:40:40 crc kubenswrapper[4778]: I0318 09:40:40.946318 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk"
Mar 18 09:40:40 crc kubenswrapper[4778]: I0318 09:40:40.970287 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-bootstrap-combined-ca-bundle\") pod \"f4bddd5e-314b-49c0-963c-107e6798c40e\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") "
Mar 18 09:40:40 crc kubenswrapper[4778]: I0318 09:40:40.970381 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-inventory\") pod \"f4bddd5e-314b-49c0-963c-107e6798c40e\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") "
Mar 18 09:40:40 crc kubenswrapper[4778]: I0318 09:40:40.973254 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgsrs\" (UniqueName: \"kubernetes.io/projected/f4bddd5e-314b-49c0-963c-107e6798c40e-kube-api-access-sgsrs\") pod \"f4bddd5e-314b-49c0-963c-107e6798c40e\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") "
Mar 18 09:40:40 crc kubenswrapper[4778]: I0318 09:40:40.973347 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-ssh-key-openstack-edpm-ipam\") pod \"f4bddd5e-314b-49c0-963c-107e6798c40e\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") "
Mar 18 09:40:40 crc kubenswrapper[4778]: I0318 09:40:40.973373 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-ceph\") pod \"f4bddd5e-314b-49c0-963c-107e6798c40e\" (UID: \"f4bddd5e-314b-49c0-963c-107e6798c40e\") "
Mar 18 09:40:40 crc kubenswrapper[4778]: I0318 09:40:40.985572 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-ceph" (OuterVolumeSpecName: "ceph") pod "f4bddd5e-314b-49c0-963c-107e6798c40e" (UID: "f4bddd5e-314b-49c0-963c-107e6798c40e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:40:40 crc kubenswrapper[4778]: I0318 09:40:40.985593 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "f4bddd5e-314b-49c0-963c-107e6798c40e" (UID: "f4bddd5e-314b-49c0-963c-107e6798c40e"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:40:40 crc kubenswrapper[4778]: I0318 09:40:40.985598 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4bddd5e-314b-49c0-963c-107e6798c40e-kube-api-access-sgsrs" (OuterVolumeSpecName: "kube-api-access-sgsrs") pod "f4bddd5e-314b-49c0-963c-107e6798c40e" (UID: "f4bddd5e-314b-49c0-963c-107e6798c40e"). InnerVolumeSpecName "kube-api-access-sgsrs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.005179 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-inventory" (OuterVolumeSpecName: "inventory") pod "f4bddd5e-314b-49c0-963c-107e6798c40e" (UID: "f4bddd5e-314b-49c0-963c-107e6798c40e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.005532 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f4bddd5e-314b-49c0-963c-107e6798c40e" (UID: "f4bddd5e-314b-49c0-963c-107e6798c40e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.075729 4778 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.075769 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.075780 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgsrs\" (UniqueName: \"kubernetes.io/projected/f4bddd5e-314b-49c0-963c-107e6798c40e-kube-api-access-sgsrs\") on node \"crc\" DevicePath \"\""
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.075788 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.075796 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f4bddd5e-314b-49c0-963c-107e6798c40e-ceph\") on node \"crc\" DevicePath \"\""
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.492944 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk" event={"ID":"f4bddd5e-314b-49c0-963c-107e6798c40e","Type":"ContainerDied","Data":"caac60756c71869f718f55d75bf41c099232c88f6ff497cf326f1789ff881da3"}
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.492985 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk"
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.492991 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="caac60756c71869f718f55d75bf41c099232c88f6ff497cf326f1789ff881da3"
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.599888 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5"]
Mar 18 09:40:41 crc kubenswrapper[4778]: E0318 09:40:41.600338 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4bddd5e-314b-49c0-963c-107e6798c40e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.600361 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4bddd5e-314b-49c0-963c-107e6798c40e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 18 09:40:41 crc kubenswrapper[4778]: E0318 09:40:41.600378 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed393452-0d17-4c60-b37b-544b21c09da1" containerName="oc"
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.600387 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed393452-0d17-4c60-b37b-544b21c09da1" containerName="oc"
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.600780 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4bddd5e-314b-49c0-963c-107e6798c40e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.600819 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed393452-0d17-4c60-b37b-544b21c09da1" containerName="oc"
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.601585 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5"
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.606526 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.606526 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.606614 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.606702 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n"
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.606771 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.624887 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5"]
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.684297 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhwpk\" (UniqueName: \"kubernetes.io/projected/d44d6afe-0030-4d9d-9fa7-f75274eff578-kube-api-access-rhwpk\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5"
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.684432 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5"
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.684595 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5"
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.684744 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5"
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.786872 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhwpk\" (UniqueName: \"kubernetes.io/projected/d44d6afe-0030-4d9d-9fa7-f75274eff578-kube-api-access-rhwpk\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5"
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.787012 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5"
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.787128 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5"
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.787265 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5"
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.792280 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5"
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.793092 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5"
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.794145 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5"
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.806249 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhwpk\" (UniqueName: \"kubernetes.io/projected/d44d6afe-0030-4d9d-9fa7-f75274eff578-kube-api-access-rhwpk\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5"
Mar 18 09:40:41 crc kubenswrapper[4778]: I0318 09:40:41.920262 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5"
Mar 18 09:40:42 crc kubenswrapper[4778]: I0318 09:40:42.544311 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5"]
Mar 18 09:40:43 crc kubenswrapper[4778]: I0318 09:40:43.515027 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" event={"ID":"d44d6afe-0030-4d9d-9fa7-f75274eff578","Type":"ContainerStarted","Data":"101b85884f51ecbe703c99472efa3468c6a22a1f91c36bdb3336321d929be59a"}
Mar 18 09:40:43 crc kubenswrapper[4778]: I0318 09:40:43.515439 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" event={"ID":"d44d6afe-0030-4d9d-9fa7-f75274eff578","Type":"ContainerStarted","Data":"28aece455218dcc9f1c2d64fb9c61409c6f2bbf3a12733a124354a4dba544ba1"}
Mar 18 09:40:43 crc kubenswrapper[4778]: I0318 09:40:43.540964 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" podStartSLOduration=1.9570557979999998 podStartE2EDuration="2.54089267s" podCreationTimestamp="2026-03-18 09:40:41 +0000 UTC" firstStartedPulling="2026-03-18 09:40:42.558728867 +0000 UTC m=+2309.133473717" lastFinishedPulling="2026-03-18 09:40:43.142565759 +0000 UTC m=+2309.717310589" observedRunningTime="2026-03-18 09:40:43.537529879 +0000 UTC m=+2310.112274759" watchObservedRunningTime="2026-03-18 09:40:43.54089267 +0000 UTC m=+2310.115637550"
Mar 18 09:41:00 crc kubenswrapper[4778]: I0318 09:41:00.147246 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 09:41:00 crc kubenswrapper[4778]: I0318 09:41:00.147768 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 09:41:08 crc kubenswrapper[4778]: I0318 09:41:08.735460 4778 generic.go:334] "Generic (PLEG): container finished" podID="d44d6afe-0030-4d9d-9fa7-f75274eff578" containerID="101b85884f51ecbe703c99472efa3468c6a22a1f91c36bdb3336321d929be59a" exitCode=0
Mar 18 09:41:08 crc kubenswrapper[4778]: I0318 09:41:08.735545 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" event={"ID":"d44d6afe-0030-4d9d-9fa7-f75274eff578","Type":"ContainerDied","Data":"101b85884f51ecbe703c99472efa3468c6a22a1f91c36bdb3336321d929be59a"}
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.178567 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5"
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.231661 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-ssh-key-openstack-edpm-ipam\") pod \"d44d6afe-0030-4d9d-9fa7-f75274eff578\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") "
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.231814 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhwpk\" (UniqueName: \"kubernetes.io/projected/d44d6afe-0030-4d9d-9fa7-f75274eff578-kube-api-access-rhwpk\") pod \"d44d6afe-0030-4d9d-9fa7-f75274eff578\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") "
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.231881 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-inventory\") pod \"d44d6afe-0030-4d9d-9fa7-f75274eff578\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") "
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.232003 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-ceph\") pod \"d44d6afe-0030-4d9d-9fa7-f75274eff578\" (UID: \"d44d6afe-0030-4d9d-9fa7-f75274eff578\") "
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.237891 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-ceph" (OuterVolumeSpecName: "ceph") pod "d44d6afe-0030-4d9d-9fa7-f75274eff578" (UID: "d44d6afe-0030-4d9d-9fa7-f75274eff578"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.238585 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d44d6afe-0030-4d9d-9fa7-f75274eff578-kube-api-access-rhwpk" (OuterVolumeSpecName: "kube-api-access-rhwpk") pod "d44d6afe-0030-4d9d-9fa7-f75274eff578" (UID: "d44d6afe-0030-4d9d-9fa7-f75274eff578"). InnerVolumeSpecName "kube-api-access-rhwpk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.276331 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d44d6afe-0030-4d9d-9fa7-f75274eff578" (UID: "d44d6afe-0030-4d9d-9fa7-f75274eff578"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.277985 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-inventory" (OuterVolumeSpecName: "inventory") pod "d44d6afe-0030-4d9d-9fa7-f75274eff578" (UID: "d44d6afe-0030-4d9d-9fa7-f75274eff578"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.334989 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.335033 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhwpk\" (UniqueName: \"kubernetes.io/projected/d44d6afe-0030-4d9d-9fa7-f75274eff578-kube-api-access-rhwpk\") on node \"crc\" DevicePath \"\""
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.335047 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.335058 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d44d6afe-0030-4d9d-9fa7-f75274eff578-ceph\") on node \"crc\" DevicePath \"\""
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.759757 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5" event={"ID":"d44d6afe-0030-4d9d-9fa7-f75274eff578","Type":"ContainerDied","Data":"28aece455218dcc9f1c2d64fb9c61409c6f2bbf3a12733a124354a4dba544ba1"}
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.759799 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28aece455218dcc9f1c2d64fb9c61409c6f2bbf3a12733a124354a4dba544ba1"
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.759849 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5"
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.856559 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9"]
Mar 18 09:41:10 crc kubenswrapper[4778]: E0318 09:41:10.857309 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d44d6afe-0030-4d9d-9fa7-f75274eff578" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.857340 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d44d6afe-0030-4d9d-9fa7-f75274eff578" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.857572 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d44d6afe-0030-4d9d-9fa7-f75274eff578" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.858416 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9"
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.861157 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.861309 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.861354 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n"
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.861440 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.862364 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.868825 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9"]
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.947338 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-44vc9\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9"
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.947415 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms2xz\" (UniqueName: \"kubernetes.io/projected/5e5ecb95-ba90-4f70-ae42-63e71026ffef-kube-api-access-ms2xz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-44vc9\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9"
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.947460 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-44vc9\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9"
Mar 18 09:41:10 crc kubenswrapper[4778]: I0318 09:41:10.947492 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-44vc9\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9"
Mar 18 09:41:11 crc kubenswrapper[4778]: I0318 09:41:11.049689 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-44vc9\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9"
Mar 18 09:41:11 crc kubenswrapper[4778]: I0318 09:41:11.049769 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms2xz\" (UniqueName: \"kubernetes.io/projected/5e5ecb95-ba90-4f70-ae42-63e71026ffef-kube-api-access-ms2xz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-44vc9\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9"
Mar 18 09:41:11 crc kubenswrapper[4778]: I0318 09:41:11.049821 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-44vc9\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9"
Mar 18 09:41:11 crc kubenswrapper[4778]: I0318 09:41:11.049855 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-44vc9\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9"
Mar 18 09:41:11 crc kubenswrapper[4778]: I0318 09:41:11.055399 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-44vc9\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9"
Mar 18 09:41:11 crc kubenswrapper[4778]: I0318 09:41:11.055403 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-44vc9\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9"
Mar 18 09:41:11 crc kubenswrapper[4778]: I0318 09:41:11.056162 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-44vc9\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9"
Mar 18 09:41:11 crc kubenswrapper[4778]: I0318 09:41:11.066717 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms2xz\" (UniqueName: \"kubernetes.io/projected/5e5ecb95-ba90-4f70-ae42-63e71026ffef-kube-api-access-ms2xz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-44vc9\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9"
Mar 18 09:41:11 crc kubenswrapper[4778]: I0318 09:41:11.210421 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9"
Mar 18 09:41:11 crc kubenswrapper[4778]: I0318 09:41:11.734591 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9"]
Mar 18 09:41:11 crc kubenswrapper[4778]: W0318 09:41:11.737529 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e5ecb95_ba90_4f70_ae42_63e71026ffef.slice/crio-5362db78a90a4c66c68f2fb7c300d45fbeafe52062ec9d6f77048c372ab29d53 WatchSource:0}: Error finding container 5362db78a90a4c66c68f2fb7c300d45fbeafe52062ec9d6f77048c372ab29d53: Status 404 returned error can't find the container with id 5362db78a90a4c66c68f2fb7c300d45fbeafe52062ec9d6f77048c372ab29d53
Mar 18 09:41:11 crc kubenswrapper[4778]: I0318 09:41:11.768819 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9" event={"ID":"5e5ecb95-ba90-4f70-ae42-63e71026ffef","Type":"ContainerStarted","Data":"5362db78a90a4c66c68f2fb7c300d45fbeafe52062ec9d6f77048c372ab29d53"}
Mar 18 09:41:12 crc kubenswrapper[4778]: I0318 09:41:12.778249 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9" event={"ID":"5e5ecb95-ba90-4f70-ae42-63e71026ffef","Type":"ContainerStarted","Data":"27c9136b748cf6c2f18636f6fd7d7d19fbcf9b96dbd5072132c2ae8ae1540e5b"}
Mar 18 09:41:17 crc kubenswrapper[4778]: I0318 09:41:17.823006 4778 generic.go:334] "Generic (PLEG): container finished" podID="5e5ecb95-ba90-4f70-ae42-63e71026ffef" containerID="27c9136b748cf6c2f18636f6fd7d7d19fbcf9b96dbd5072132c2ae8ae1540e5b" exitCode=0
Mar 18 09:41:17 crc kubenswrapper[4778]: I0318 09:41:17.823132 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9" event={"ID":"5e5ecb95-ba90-4f70-ae42-63e71026ffef","Type":"ContainerDied","Data":"27c9136b748cf6c2f18636f6fd7d7d19fbcf9b96dbd5072132c2ae8ae1540e5b"}
Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.271293 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9"
Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.334077 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-ceph\") pod \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") "
Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.334140 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-ssh-key-openstack-edpm-ipam\") pod \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") "
Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.334238 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-inventory\") pod \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") "
Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.334453 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms2xz\" (UniqueName: \"kubernetes.io/projected/5e5ecb95-ba90-4f70-ae42-63e71026ffef-kube-api-access-ms2xz\") pod \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\" (UID: \"5e5ecb95-ba90-4f70-ae42-63e71026ffef\") "
Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.340786 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e5ecb95-ba90-4f70-ae42-63e71026ffef-kube-api-access-ms2xz" (OuterVolumeSpecName: "kube-api-access-ms2xz") pod "5e5ecb95-ba90-4f70-ae42-63e71026ffef" (UID: "5e5ecb95-ba90-4f70-ae42-63e71026ffef"). InnerVolumeSpecName "kube-api-access-ms2xz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.343014 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-ceph" (OuterVolumeSpecName: "ceph") pod "5e5ecb95-ba90-4f70-ae42-63e71026ffef" (UID: "5e5ecb95-ba90-4f70-ae42-63e71026ffef"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.364621 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5e5ecb95-ba90-4f70-ae42-63e71026ffef" (UID: "5e5ecb95-ba90-4f70-ae42-63e71026ffef"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.379286 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-inventory" (OuterVolumeSpecName: "inventory") pod "5e5ecb95-ba90-4f70-ae42-63e71026ffef" (UID: "5e5ecb95-ba90-4f70-ae42-63e71026ffef"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.437189 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms2xz\" (UniqueName: \"kubernetes.io/projected/5e5ecb95-ba90-4f70-ae42-63e71026ffef-kube-api-access-ms2xz\") on node \"crc\" DevicePath \"\"" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.437249 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.437264 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.437276 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5ecb95-ba90-4f70-ae42-63e71026ffef-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.848684 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9" event={"ID":"5e5ecb95-ba90-4f70-ae42-63e71026ffef","Type":"ContainerDied","Data":"5362db78a90a4c66c68f2fb7c300d45fbeafe52062ec9d6f77048c372ab29d53"} Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.848767 4778 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="5362db78a90a4c66c68f2fb7c300d45fbeafe52062ec9d6f77048c372ab29d53" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.848707 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-44vc9" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.943251 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82"] Mar 18 09:41:19 crc kubenswrapper[4778]: E0318 09:41:19.943584 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e5ecb95-ba90-4f70-ae42-63e71026ffef" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.943616 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e5ecb95-ba90-4f70-ae42-63e71026ffef" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.943783 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e5ecb95-ba90-4f70-ae42-63e71026ffef" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.944599 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.947754 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.948393 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.948875 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.950974 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.955655 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:41:19 crc kubenswrapper[4778]: I0318 09:41:19.962012 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82"] Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.047653 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9zw82\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.047735 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9zw82\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.047894 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9zw82\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.047939 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdnqj\" (UniqueName: \"kubernetes.io/projected/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-kube-api-access-jdnqj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9zw82\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.150494 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9zw82\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.151585 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdnqj\" (UniqueName: \"kubernetes.io/projected/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-kube-api-access-jdnqj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9zw82\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.151825 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9zw82\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.151923 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9zw82\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.156169 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9zw82\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.156589 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9zw82\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.158859 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9zw82\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.179014 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdnqj\" (UniqueName: \"kubernetes.io/projected/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-kube-api-access-jdnqj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9zw82\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.265918 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.620966 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82"] Mar 18 09:41:20 crc kubenswrapper[4778]: I0318 09:41:20.860785 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" event={"ID":"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50","Type":"ContainerStarted","Data":"b0e26c8012e479cf5cd6cde6333b5f580c7eeb43d328019a02a961863e5192bc"} Mar 18 09:41:22 crc kubenswrapper[4778]: I0318 09:41:22.880765 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" event={"ID":"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50","Type":"ContainerStarted","Data":"478f434742f905b8f2086a34869fb64730d5134c202716140a2d3e6b0f090ffb"} Mar 18 09:41:22 crc kubenswrapper[4778]: I0318 09:41:22.912438 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" podStartSLOduration=2.398803073 podStartE2EDuration="3.912417428s" podCreationTimestamp="2026-03-18 09:41:19 +0000 UTC" firstStartedPulling="2026-03-18 09:41:20.624517976 +0000 UTC m=+2347.199262816" 
lastFinishedPulling="2026-03-18 09:41:22.138132331 +0000 UTC m=+2348.712877171" observedRunningTime="2026-03-18 09:41:22.902615142 +0000 UTC m=+2349.477360022" watchObservedRunningTime="2026-03-18 09:41:22.912417428 +0000 UTC m=+2349.487162288" Mar 18 09:41:30 crc kubenswrapper[4778]: I0318 09:41:30.147763 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:41:30 crc kubenswrapper[4778]: I0318 09:41:30.148618 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:41:30 crc kubenswrapper[4778]: I0318 09:41:30.148685 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:41:30 crc kubenswrapper[4778]: I0318 09:41:30.149833 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:41:30 crc kubenswrapper[4778]: I0318 09:41:30.149940 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" 
containerID="cri-o://0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" gracePeriod=600 Mar 18 09:41:30 crc kubenswrapper[4778]: E0318 09:41:30.275017 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:41:30 crc kubenswrapper[4778]: I0318 09:41:30.954990 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" exitCode=0 Mar 18 09:41:30 crc kubenswrapper[4778]: I0318 09:41:30.955038 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9"} Mar 18 09:41:30 crc kubenswrapper[4778]: I0318 09:41:30.955105 4778 scope.go:117] "RemoveContainer" containerID="0190a17d7c9066ee03765200824080dd651faade8ac0a106f1878805db93f225" Mar 18 09:41:30 crc kubenswrapper[4778]: I0318 09:41:30.955967 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:41:30 crc kubenswrapper[4778]: E0318 09:41:30.956346 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" 
podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:41:32 crc kubenswrapper[4778]: I0318 09:41:32.146318 4778 scope.go:117] "RemoveContainer" containerID="ae502ee49eb38287e9aaaa3ab0077cb1ef93b09b02a84b50a76e8fa209d6ad0c" Mar 18 09:41:41 crc kubenswrapper[4778]: I0318 09:41:41.187081 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:41:41 crc kubenswrapper[4778]: E0318 09:41:41.188121 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:41:54 crc kubenswrapper[4778]: I0318 09:41:54.191723 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:41:54 crc kubenswrapper[4778]: E0318 09:41:54.192649 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:41:57 crc kubenswrapper[4778]: I0318 09:41:57.233057 4778 generic.go:334] "Generic (PLEG): container finished" podID="5e5ffed6-fceb-4d38-aa29-e9836a8d9f50" containerID="478f434742f905b8f2086a34869fb64730d5134c202716140a2d3e6b0f090ffb" exitCode=0 Mar 18 09:41:57 crc kubenswrapper[4778]: I0318 09:41:57.233157 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" event={"ID":"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50","Type":"ContainerDied","Data":"478f434742f905b8f2086a34869fb64730d5134c202716140a2d3e6b0f090ffb"} Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.723628 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.829278 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-ceph\") pod \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.829374 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdnqj\" (UniqueName: \"kubernetes.io/projected/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-kube-api-access-jdnqj\") pod \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.829539 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-ssh-key-openstack-edpm-ipam\") pod \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.829662 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-inventory\") pod \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\" (UID: \"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50\") " Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.886391 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-ceph" (OuterVolumeSpecName: "ceph") pod "5e5ffed6-fceb-4d38-aa29-e9836a8d9f50" (UID: "5e5ffed6-fceb-4d38-aa29-e9836a8d9f50"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.886904 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-kube-api-access-jdnqj" (OuterVolumeSpecName: "kube-api-access-jdnqj") pod "5e5ffed6-fceb-4d38-aa29-e9836a8d9f50" (UID: "5e5ffed6-fceb-4d38-aa29-e9836a8d9f50"). InnerVolumeSpecName "kube-api-access-jdnqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.892880 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5e5ffed6-fceb-4d38-aa29-e9836a8d9f50" (UID: "5e5ffed6-fceb-4d38-aa29-e9836a8d9f50"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.892916 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-inventory" (OuterVolumeSpecName: "inventory") pod "5e5ffed6-fceb-4d38-aa29-e9836a8d9f50" (UID: "5e5ffed6-fceb-4d38-aa29-e9836a8d9f50"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.932600 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.932638 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdnqj\" (UniqueName: \"kubernetes.io/projected/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-kube-api-access-jdnqj\") on node \"crc\" DevicePath \"\"" Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.932651 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:41:58 crc kubenswrapper[4778]: I0318 09:41:58.932662 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5ffed6-fceb-4d38-aa29-e9836a8d9f50-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.255238 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" event={"ID":"5e5ffed6-fceb-4d38-aa29-e9836a8d9f50","Type":"ContainerDied","Data":"b0e26c8012e479cf5cd6cde6333b5f580c7eeb43d328019a02a961863e5192bc"} Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.255273 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0e26c8012e479cf5cd6cde6333b5f580c7eeb43d328019a02a961863e5192bc" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.255329 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9zw82" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.355168 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl"] Mar 18 09:41:59 crc kubenswrapper[4778]: E0318 09:41:59.355708 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e5ffed6-fceb-4d38-aa29-e9836a8d9f50" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.355736 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e5ffed6-fceb-4d38-aa29-e9836a8d9f50" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.355982 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e5ffed6-fceb-4d38-aa29-e9836a8d9f50" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.356928 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.362929 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl"] Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.368010 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.368271 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.368472 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.369824 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.372073 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.444258 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4775b\" (UniqueName: \"kubernetes.io/projected/34acd7f6-6263-4871-892c-02835ebbab27-kube-api-access-4775b\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.444313 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl\" (UID: 
\"34acd7f6-6263-4871-892c-02835ebbab27\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl"
Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.444338 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl"
Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.444374 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl"
Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.547679 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4775b\" (UniqueName: \"kubernetes.io/projected/34acd7f6-6263-4871-892c-02835ebbab27-kube-api-access-4775b\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl"
Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.547793 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl"
Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.547834 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl"
Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.547890 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl"
Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.559857 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl"
Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.559875 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl"
Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.562538 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl"
Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.565124 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4775b\" (UniqueName: \"kubernetes.io/projected/34acd7f6-6263-4871-892c-02835ebbab27-kube-api-access-4775b\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl"
Mar 18 09:41:59 crc kubenswrapper[4778]: I0318 09:41:59.679520 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl"
Mar 18 09:42:00 crc kubenswrapper[4778]: I0318 09:42:00.011747 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl"]
Mar 18 09:42:00 crc kubenswrapper[4778]: I0318 09:42:00.137587 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563782-zn5kg"]
Mar 18 09:42:00 crc kubenswrapper[4778]: I0318 09:42:00.138623 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563782-zn5kg"
Mar 18 09:42:00 crc kubenswrapper[4778]: I0318 09:42:00.140988 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 09:42:00 crc kubenswrapper[4778]: I0318 09:42:00.141721 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6"
Mar 18 09:42:00 crc kubenswrapper[4778]: I0318 09:42:00.142291 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 09:42:00 crc kubenswrapper[4778]: I0318 09:42:00.152007 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563782-zn5kg"]
Mar 18 09:42:00 crc kubenswrapper[4778]: I0318 09:42:00.262830 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbbln\" (UniqueName: \"kubernetes.io/projected/d3895116-2d67-4e3c-9f3e-e04d3cfe0518-kube-api-access-tbbln\") pod \"auto-csr-approver-29563782-zn5kg\" (UID: \"d3895116-2d67-4e3c-9f3e-e04d3cfe0518\") " pod="openshift-infra/auto-csr-approver-29563782-zn5kg"
Mar 18 09:42:00 crc kubenswrapper[4778]: I0318 09:42:00.263688 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" event={"ID":"34acd7f6-6263-4871-892c-02835ebbab27","Type":"ContainerStarted","Data":"64dff72ef9bbd5903ef3f081c89b41c6a089a4023a0fbd63c67b48c1ad47875d"}
Mar 18 09:42:00 crc kubenswrapper[4778]: I0318 09:42:00.365448 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbbln\" (UniqueName: \"kubernetes.io/projected/d3895116-2d67-4e3c-9f3e-e04d3cfe0518-kube-api-access-tbbln\") pod \"auto-csr-approver-29563782-zn5kg\" (UID: \"d3895116-2d67-4e3c-9f3e-e04d3cfe0518\") " pod="openshift-infra/auto-csr-approver-29563782-zn5kg"
Mar 18 09:42:00 crc kubenswrapper[4778]: I0318 09:42:00.391058 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbbln\" (UniqueName: \"kubernetes.io/projected/d3895116-2d67-4e3c-9f3e-e04d3cfe0518-kube-api-access-tbbln\") pod \"auto-csr-approver-29563782-zn5kg\" (UID: \"d3895116-2d67-4e3c-9f3e-e04d3cfe0518\") " pod="openshift-infra/auto-csr-approver-29563782-zn5kg"
Mar 18 09:42:00 crc kubenswrapper[4778]: I0318 09:42:00.460588 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563782-zn5kg"
Mar 18 09:42:00 crc kubenswrapper[4778]: I0318 09:42:00.933507 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563782-zn5kg"]
Mar 18 09:42:00 crc kubenswrapper[4778]: W0318 09:42:00.938786 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3895116_2d67_4e3c_9f3e_e04d3cfe0518.slice/crio-4d07ed75007e2ce8c60dbdf1f56e8a7facf36f0283e4917d0b5f7757b08dd0b6 WatchSource:0}: Error finding container 4d07ed75007e2ce8c60dbdf1f56e8a7facf36f0283e4917d0b5f7757b08dd0b6: Status 404 returned error can't find the container with id 4d07ed75007e2ce8c60dbdf1f56e8a7facf36f0283e4917d0b5f7757b08dd0b6
Mar 18 09:42:01 crc kubenswrapper[4778]: I0318 09:42:01.272906 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563782-zn5kg" event={"ID":"d3895116-2d67-4e3c-9f3e-e04d3cfe0518","Type":"ContainerStarted","Data":"4d07ed75007e2ce8c60dbdf1f56e8a7facf36f0283e4917d0b5f7757b08dd0b6"}
Mar 18 09:42:01 crc kubenswrapper[4778]: I0318 09:42:01.275276 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" event={"ID":"34acd7f6-6263-4871-892c-02835ebbab27","Type":"ContainerStarted","Data":"be013a651fa0255702eeae0ccf13a8dfade5d16e54b7dd5fc488b36567907797"}
Mar 18 09:42:01 crc kubenswrapper[4778]: I0318 09:42:01.316287 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" podStartSLOduration=1.910295154 podStartE2EDuration="2.316262393s" podCreationTimestamp="2026-03-18 09:41:59 +0000 UTC" firstStartedPulling="2026-03-18 09:42:00.018585542 +0000 UTC m=+2386.593330392" lastFinishedPulling="2026-03-18 09:42:00.424552791 +0000 UTC m=+2386.999297631" observedRunningTime="2026-03-18 09:42:01.297276698 +0000 UTC m=+2387.872021568" watchObservedRunningTime="2026-03-18 09:42:01.316262393 +0000 UTC m=+2387.891007263"
Mar 18 09:42:02 crc kubenswrapper[4778]: I0318 09:42:02.282843 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563782-zn5kg" event={"ID":"d3895116-2d67-4e3c-9f3e-e04d3cfe0518","Type":"ContainerStarted","Data":"c1ff920321931ae33985b21e2caf3b4db031e28a61271d5d1f2c36e681d955e6"}
Mar 18 09:42:02 crc kubenswrapper[4778]: I0318 09:42:02.305918 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563782-zn5kg" podStartSLOduration=1.369680112 podStartE2EDuration="2.305892489s" podCreationTimestamp="2026-03-18 09:42:00 +0000 UTC" firstStartedPulling="2026-03-18 09:42:00.94406543 +0000 UTC m=+2387.518810280" lastFinishedPulling="2026-03-18 09:42:01.880277817 +0000 UTC m=+2388.455022657" observedRunningTime="2026-03-18 09:42:02.296643248 +0000 UTC m=+2388.871388098" watchObservedRunningTime="2026-03-18 09:42:02.305892489 +0000 UTC m=+2388.880637339"
Mar 18 09:42:03 crc kubenswrapper[4778]: I0318 09:42:03.291748 4778 generic.go:334] "Generic (PLEG): container finished" podID="d3895116-2d67-4e3c-9f3e-e04d3cfe0518" containerID="c1ff920321931ae33985b21e2caf3b4db031e28a61271d5d1f2c36e681d955e6" exitCode=0
Mar 18 09:42:03 crc kubenswrapper[4778]: I0318 09:42:03.291851 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563782-zn5kg" event={"ID":"d3895116-2d67-4e3c-9f3e-e04d3cfe0518","Type":"ContainerDied","Data":"c1ff920321931ae33985b21e2caf3b4db031e28a61271d5d1f2c36e681d955e6"}
Mar 18 09:42:04 crc kubenswrapper[4778]: I0318 09:42:04.306174 4778 generic.go:334] "Generic (PLEG): container finished" podID="34acd7f6-6263-4871-892c-02835ebbab27" containerID="be013a651fa0255702eeae0ccf13a8dfade5d16e54b7dd5fc488b36567907797" exitCode=0
Mar 18 09:42:04 crc kubenswrapper[4778]: I0318 09:42:04.306291 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" event={"ID":"34acd7f6-6263-4871-892c-02835ebbab27","Type":"ContainerDied","Data":"be013a651fa0255702eeae0ccf13a8dfade5d16e54b7dd5fc488b36567907797"}
Mar 18 09:42:04 crc kubenswrapper[4778]: I0318 09:42:04.651427 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563782-zn5kg"
Mar 18 09:42:04 crc kubenswrapper[4778]: I0318 09:42:04.746470 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbbln\" (UniqueName: \"kubernetes.io/projected/d3895116-2d67-4e3c-9f3e-e04d3cfe0518-kube-api-access-tbbln\") pod \"d3895116-2d67-4e3c-9f3e-e04d3cfe0518\" (UID: \"d3895116-2d67-4e3c-9f3e-e04d3cfe0518\") "
Mar 18 09:42:04 crc kubenswrapper[4778]: I0318 09:42:04.753109 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3895116-2d67-4e3c-9f3e-e04d3cfe0518-kube-api-access-tbbln" (OuterVolumeSpecName: "kube-api-access-tbbln") pod "d3895116-2d67-4e3c-9f3e-e04d3cfe0518" (UID: "d3895116-2d67-4e3c-9f3e-e04d3cfe0518"). InnerVolumeSpecName "kube-api-access-tbbln". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:42:04 crc kubenswrapper[4778]: I0318 09:42:04.848773 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbbln\" (UniqueName: \"kubernetes.io/projected/d3895116-2d67-4e3c-9f3e-e04d3cfe0518-kube-api-access-tbbln\") on node \"crc\" DevicePath \"\""
Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.319835 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563782-zn5kg"
Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.320011 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563782-zn5kg" event={"ID":"d3895116-2d67-4e3c-9f3e-e04d3cfe0518","Type":"ContainerDied","Data":"4d07ed75007e2ce8c60dbdf1f56e8a7facf36f0283e4917d0b5f7757b08dd0b6"}
Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.320401 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d07ed75007e2ce8c60dbdf1f56e8a7facf36f0283e4917d0b5f7757b08dd0b6"
Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.379371 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563776-bxwcm"]
Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.386741 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563776-bxwcm"]
Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.730680 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl"
Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.868211 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-ssh-key-openstack-edpm-ipam\") pod \"34acd7f6-6263-4871-892c-02835ebbab27\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") "
Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.868341 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-ceph\") pod \"34acd7f6-6263-4871-892c-02835ebbab27\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") "
Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.868400 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4775b\" (UniqueName: \"kubernetes.io/projected/34acd7f6-6263-4871-892c-02835ebbab27-kube-api-access-4775b\") pod \"34acd7f6-6263-4871-892c-02835ebbab27\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") "
Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.868539 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-inventory\") pod \"34acd7f6-6263-4871-892c-02835ebbab27\" (UID: \"34acd7f6-6263-4871-892c-02835ebbab27\") "
Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.872848 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-ceph" (OuterVolumeSpecName: "ceph") pod "34acd7f6-6263-4871-892c-02835ebbab27" (UID: "34acd7f6-6263-4871-892c-02835ebbab27"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.873300 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34acd7f6-6263-4871-892c-02835ebbab27-kube-api-access-4775b" (OuterVolumeSpecName: "kube-api-access-4775b") pod "34acd7f6-6263-4871-892c-02835ebbab27" (UID: "34acd7f6-6263-4871-892c-02835ebbab27"). InnerVolumeSpecName "kube-api-access-4775b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.895328 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "34acd7f6-6263-4871-892c-02835ebbab27" (UID: "34acd7f6-6263-4871-892c-02835ebbab27"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.899592 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-inventory" (OuterVolumeSpecName: "inventory") pod "34acd7f6-6263-4871-892c-02835ebbab27" (UID: "34acd7f6-6263-4871-892c-02835ebbab27"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.970868 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-ceph\") on node \"crc\" DevicePath \"\""
Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.970898 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4775b\" (UniqueName: \"kubernetes.io/projected/34acd7f6-6263-4871-892c-02835ebbab27-kube-api-access-4775b\") on node \"crc\" DevicePath \"\""
Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.970910 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 09:42:05 crc kubenswrapper[4778]: I0318 09:42:05.970922 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34acd7f6-6263-4871-892c-02835ebbab27-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.196964 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b14b14c0-2e4e-420d-bdba-234de9130e4a" path="/var/lib/kubelet/pods/b14b14c0-2e4e-420d-bdba-234de9130e4a/volumes"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.330610 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl" event={"ID":"34acd7f6-6263-4871-892c-02835ebbab27","Type":"ContainerDied","Data":"64dff72ef9bbd5903ef3f081c89b41c6a089a4023a0fbd63c67b48c1ad47875d"}
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.330651 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64dff72ef9bbd5903ef3f081c89b41c6a089a4023a0fbd63c67b48c1ad47875d"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.330690 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.399851 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk"]
Mar 18 09:42:06 crc kubenswrapper[4778]: E0318 09:42:06.402437 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3895116-2d67-4e3c-9f3e-e04d3cfe0518" containerName="oc"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.402460 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3895116-2d67-4e3c-9f3e-e04d3cfe0518" containerName="oc"
Mar 18 09:42:06 crc kubenswrapper[4778]: E0318 09:42:06.402474 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34acd7f6-6263-4871-892c-02835ebbab27" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.402505 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="34acd7f6-6263-4871-892c-02835ebbab27" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.402745 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3895116-2d67-4e3c-9f3e-e04d3cfe0518" containerName="oc"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.402770 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="34acd7f6-6263-4871-892c-02835ebbab27" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.403478 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.405673 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.405734 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.406055 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.406766 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.407760 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.412181 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk"]
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.478732 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-r64nk\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.478801 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-r64nk\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.479345 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6pd7\" (UniqueName: \"kubernetes.io/projected/4f5bf2d2-78b2-4358-a582-482ab3020da3-kube-api-access-p6pd7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-r64nk\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.479508 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-r64nk\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.581761 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6pd7\" (UniqueName: \"kubernetes.io/projected/4f5bf2d2-78b2-4358-a582-482ab3020da3-kube-api-access-p6pd7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-r64nk\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.581809 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-r64nk\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.581841 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-r64nk\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.581865 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-r64nk\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.586931 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-r64nk\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.595120 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-r64nk\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.595162 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-r64nk\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.597823 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6pd7\" (UniqueName: \"kubernetes.io/projected/4f5bf2d2-78b2-4358-a582-482ab3020da3-kube-api-access-p6pd7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-r64nk\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk"
Mar 18 09:42:06 crc kubenswrapper[4778]: I0318 09:42:06.760915 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk"
Mar 18 09:42:07 crc kubenswrapper[4778]: I0318 09:42:07.244979 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk"]
Mar 18 09:42:07 crc kubenswrapper[4778]: I0318 09:42:07.339463 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" event={"ID":"4f5bf2d2-78b2-4358-a582-482ab3020da3","Type":"ContainerStarted","Data":"44811a06483085aaef916ab0be67a0c9b5c5146057332aa0b5f277cb983c4586"}
Mar 18 09:42:08 crc kubenswrapper[4778]: I0318 09:42:08.187629 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9"
Mar 18 09:42:08 crc kubenswrapper[4778]: E0318 09:42:08.188181 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 09:42:08 crc kubenswrapper[4778]: I0318 09:42:08.351587 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" event={"ID":"4f5bf2d2-78b2-4358-a582-482ab3020da3","Type":"ContainerStarted","Data":"dbcc106a61d73088c3c96f953c97d1eef401b54afb2440293b042938c4ada77f"}
Mar 18 09:42:08 crc kubenswrapper[4778]: I0318 09:42:08.382937 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" podStartSLOduration=1.9631508709999999 podStartE2EDuration="2.382912517s" podCreationTimestamp="2026-03-18 09:42:06 +0000 UTC" firstStartedPulling="2026-03-18 09:42:07.245491804 +0000 UTC m=+2393.820236654" lastFinishedPulling="2026-03-18 09:42:07.66525344 +0000 UTC m=+2394.239998300" observedRunningTime="2026-03-18 09:42:08.376436752 +0000 UTC m=+2394.951181662" watchObservedRunningTime="2026-03-18 09:42:08.382912517 +0000 UTC m=+2394.957657397"
Mar 18 09:42:19 crc kubenswrapper[4778]: I0318 09:42:19.187708 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9"
Mar 18 09:42:19 crc kubenswrapper[4778]: E0318 09:42:19.188782 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 09:42:30 crc kubenswrapper[4778]: I0318 09:42:30.187575 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9"
Mar 18 09:42:30 crc kubenswrapper[4778]: E0318 09:42:30.189440 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 09:42:32 crc kubenswrapper[4778]: I0318 09:42:32.217792 4778 scope.go:117] "RemoveContainer" containerID="037bb0f9fdf9935b25af1bbd8db6391c200ce1a888406ad48350f6fbf2f0253c"
Mar 18 09:42:45 crc kubenswrapper[4778]: I0318 09:42:45.187911 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9"
Mar 18 09:42:45 crc kubenswrapper[4778]: E0318 09:42:45.189503 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 09:42:45 crc kubenswrapper[4778]: I0318 09:42:45.670544 4778 generic.go:334] "Generic (PLEG): container finished" podID="4f5bf2d2-78b2-4358-a582-482ab3020da3" containerID="dbcc106a61d73088c3c96f953c97d1eef401b54afb2440293b042938c4ada77f" exitCode=0
Mar 18 09:42:45 crc kubenswrapper[4778]: I0318 09:42:45.670603 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" event={"ID":"4f5bf2d2-78b2-4358-a582-482ab3020da3","Type":"ContainerDied","Data":"dbcc106a61d73088c3c96f953c97d1eef401b54afb2440293b042938c4ada77f"}
Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.092918 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk"
Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.212439 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-inventory\") pod \"4f5bf2d2-78b2-4358-a582-482ab3020da3\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") "
Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.212508 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-ssh-key-openstack-edpm-ipam\") pod \"4f5bf2d2-78b2-4358-a582-482ab3020da3\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") "
Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.212595 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-ceph\") pod \"4f5bf2d2-78b2-4358-a582-482ab3020da3\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") "
Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.212645 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6pd7\" (UniqueName: \"kubernetes.io/projected/4f5bf2d2-78b2-4358-a582-482ab3020da3-kube-api-access-p6pd7\") pod \"4f5bf2d2-78b2-4358-a582-482ab3020da3\" (UID: \"4f5bf2d2-78b2-4358-a582-482ab3020da3\") "
Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.219516 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f5bf2d2-78b2-4358-a582-482ab3020da3-kube-api-access-p6pd7" (OuterVolumeSpecName: "kube-api-access-p6pd7") pod "4f5bf2d2-78b2-4358-a582-482ab3020da3" (UID: "4f5bf2d2-78b2-4358-a582-482ab3020da3"). InnerVolumeSpecName "kube-api-access-p6pd7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.219622 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-ceph" (OuterVolumeSpecName: "ceph") pod "4f5bf2d2-78b2-4358-a582-482ab3020da3" (UID: "4f5bf2d2-78b2-4358-a582-482ab3020da3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.239357 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-inventory" (OuterVolumeSpecName: "inventory") pod "4f5bf2d2-78b2-4358-a582-482ab3020da3" (UID: "4f5bf2d2-78b2-4358-a582-482ab3020da3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.240249 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4f5bf2d2-78b2-4358-a582-482ab3020da3" (UID: "4f5bf2d2-78b2-4358-a582-482ab3020da3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.314732 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-ceph\") on node \"crc\" DevicePath \"\""
Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.314769 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6pd7\" (UniqueName: \"kubernetes.io/projected/4f5bf2d2-78b2-4358-a582-482ab3020da3-kube-api-access-p6pd7\") on node \"crc\" DevicePath \"\""
Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.314777 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.314787 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f5bf2d2-78b2-4358-a582-482ab3020da3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.692859 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk" event={"ID":"4f5bf2d2-78b2-4358-a582-482ab3020da3","Type":"ContainerDied","Data":"44811a06483085aaef916ab0be67a0c9b5c5146057332aa0b5f277cb983c4586"}
Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.692907 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44811a06483085aaef916ab0be67a0c9b5c5146057332aa0b5f277cb983c4586"
Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.692922 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-r64nk"
Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.789500 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-j74ts"]
Mar 18 09:42:47 crc kubenswrapper[4778]: E0318 09:42:47.789846 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f5bf2d2-78b2-4358-a582-482ab3020da3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.789863 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f5bf2d2-78b2-4358-a582-482ab3020da3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.790040 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f5bf2d2-78b2-4358-a582-482ab3020da3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.790620 4778 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.793645 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.793929 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.794603 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.799139 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.799655 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.803227 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-j74ts"] Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.929372 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-ceph\") pod \"ssh-known-hosts-edpm-deployment-j74ts\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.929843 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-j74ts\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 
09:42:47.930050 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-j74ts\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:47 crc kubenswrapper[4778]: I0318 09:42:47.930247 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l272r\" (UniqueName: \"kubernetes.io/projected/53b18647-af19-457c-9543-2156c1ace738-kube-api-access-l272r\") pod \"ssh-known-hosts-edpm-deployment-j74ts\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:48 crc kubenswrapper[4778]: I0318 09:42:48.055063 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-j74ts\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:48 crc kubenswrapper[4778]: I0318 09:42:48.055179 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-j74ts\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:48 crc kubenswrapper[4778]: I0318 09:42:48.055343 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l272r\" (UniqueName: \"kubernetes.io/projected/53b18647-af19-457c-9543-2156c1ace738-kube-api-access-l272r\") pod \"ssh-known-hosts-edpm-deployment-j74ts\" (UID: 
\"53b18647-af19-457c-9543-2156c1ace738\") " pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:48 crc kubenswrapper[4778]: I0318 09:42:48.055465 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-ceph\") pod \"ssh-known-hosts-edpm-deployment-j74ts\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:48 crc kubenswrapper[4778]: I0318 09:42:48.061074 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-j74ts\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:48 crc kubenswrapper[4778]: I0318 09:42:48.062024 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-ceph\") pod \"ssh-known-hosts-edpm-deployment-j74ts\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:48 crc kubenswrapper[4778]: I0318 09:42:48.062180 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-j74ts\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:48 crc kubenswrapper[4778]: I0318 09:42:48.073111 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l272r\" (UniqueName: \"kubernetes.io/projected/53b18647-af19-457c-9543-2156c1ace738-kube-api-access-l272r\") pod \"ssh-known-hosts-edpm-deployment-j74ts\" (UID: 
\"53b18647-af19-457c-9543-2156c1ace738\") " pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:48 crc kubenswrapper[4778]: I0318 09:42:48.107750 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:42:48 crc kubenswrapper[4778]: I0318 09:42:48.651436 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-j74ts"] Mar 18 09:42:48 crc kubenswrapper[4778]: I0318 09:42:48.702087 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" event={"ID":"53b18647-af19-457c-9543-2156c1ace738","Type":"ContainerStarted","Data":"0fee9be56247863c641ddf6eb6613d75b7610950defcf4b488a7e3467f580b16"} Mar 18 09:42:49 crc kubenswrapper[4778]: I0318 09:42:49.712718 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" event={"ID":"53b18647-af19-457c-9543-2156c1ace738","Type":"ContainerStarted","Data":"332cb2eccc04e7ff5891be1d6080a18ce1ecf2c442a1afa15fca75f54c50a428"} Mar 18 09:42:58 crc kubenswrapper[4778]: I0318 09:42:58.187820 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:42:58 crc kubenswrapper[4778]: E0318 09:42:58.189153 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:42:58 crc kubenswrapper[4778]: I0318 09:42:58.828550 4778 generic.go:334] "Generic (PLEG): container finished" podID="53b18647-af19-457c-9543-2156c1ace738" 
containerID="332cb2eccc04e7ff5891be1d6080a18ce1ecf2c442a1afa15fca75f54c50a428" exitCode=0 Mar 18 09:42:58 crc kubenswrapper[4778]: I0318 09:42:58.828603 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" event={"ID":"53b18647-af19-457c-9543-2156c1ace738","Type":"ContainerDied","Data":"332cb2eccc04e7ff5891be1d6080a18ce1ecf2c442a1afa15fca75f54c50a428"} Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.259088 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.377494 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-inventory-0\") pod \"53b18647-af19-457c-9543-2156c1ace738\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.377745 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-ceph\") pod \"53b18647-af19-457c-9543-2156c1ace738\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.377790 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-ssh-key-openstack-edpm-ipam\") pod \"53b18647-af19-457c-9543-2156c1ace738\" (UID: \"53b18647-af19-457c-9543-2156c1ace738\") " Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.377831 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l272r\" (UniqueName: \"kubernetes.io/projected/53b18647-af19-457c-9543-2156c1ace738-kube-api-access-l272r\") pod \"53b18647-af19-457c-9543-2156c1ace738\" (UID: 
\"53b18647-af19-457c-9543-2156c1ace738\") " Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.382985 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-ceph" (OuterVolumeSpecName: "ceph") pod "53b18647-af19-457c-9543-2156c1ace738" (UID: "53b18647-af19-457c-9543-2156c1ace738"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.386283 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53b18647-af19-457c-9543-2156c1ace738-kube-api-access-l272r" (OuterVolumeSpecName: "kube-api-access-l272r") pod "53b18647-af19-457c-9543-2156c1ace738" (UID: "53b18647-af19-457c-9543-2156c1ace738"). InnerVolumeSpecName "kube-api-access-l272r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.402824 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "53b18647-af19-457c-9543-2156c1ace738" (UID: "53b18647-af19-457c-9543-2156c1ace738"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.402916 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "53b18647-af19-457c-9543-2156c1ace738" (UID: "53b18647-af19-457c-9543-2156c1ace738"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.481049 4778 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.481111 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.481132 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/53b18647-af19-457c-9543-2156c1ace738-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.481154 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l272r\" (UniqueName: \"kubernetes.io/projected/53b18647-af19-457c-9543-2156c1ace738-kube-api-access-l272r\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.849535 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" event={"ID":"53b18647-af19-457c-9543-2156c1ace738","Type":"ContainerDied","Data":"0fee9be56247863c641ddf6eb6613d75b7610950defcf4b488a7e3467f580b16"} Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.849584 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fee9be56247863c641ddf6eb6613d75b7610950defcf4b488a7e3467f580b16" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.849597 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j74ts" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.946837 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8"] Mar 18 09:43:00 crc kubenswrapper[4778]: E0318 09:43:00.947280 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53b18647-af19-457c-9543-2156c1ace738" containerName="ssh-known-hosts-edpm-deployment" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.947304 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="53b18647-af19-457c-9543-2156c1ace738" containerName="ssh-known-hosts-edpm-deployment" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.947608 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="53b18647-af19-457c-9543-2156c1ace738" containerName="ssh-known-hosts-edpm-deployment" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.948886 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.953653 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.953924 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.954076 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.955362 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.955435 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.963646 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8"] Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.991588 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9bss8\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.992148 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9bss8\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.992435 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9bss8\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:00 crc kubenswrapper[4778]: I0318 09:43:00.992715 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr96f\" (UniqueName: \"kubernetes.io/projected/80a8d263-9bba-4db0-928e-f633b4ad5314-kube-api-access-lr96f\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9bss8\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:01 crc kubenswrapper[4778]: I0318 09:43:01.100532 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9bss8\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:01 crc kubenswrapper[4778]: I0318 09:43:01.100655 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9bss8\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:01 crc kubenswrapper[4778]: I0318 09:43:01.100706 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9bss8\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:01 crc kubenswrapper[4778]: I0318 09:43:01.102618 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr96f\" (UniqueName: \"kubernetes.io/projected/80a8d263-9bba-4db0-928e-f633b4ad5314-kube-api-access-lr96f\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9bss8\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:01 crc kubenswrapper[4778]: I0318 09:43:01.111086 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9bss8\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:01 crc kubenswrapper[4778]: I0318 09:43:01.120793 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9bss8\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:01 crc kubenswrapper[4778]: I0318 09:43:01.125576 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr96f\" (UniqueName: \"kubernetes.io/projected/80a8d263-9bba-4db0-928e-f633b4ad5314-kube-api-access-lr96f\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9bss8\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:01 crc kubenswrapper[4778]: I0318 09:43:01.135854 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9bss8\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:01 crc kubenswrapper[4778]: I0318 09:43:01.275834 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:01 crc kubenswrapper[4778]: I0318 09:43:01.806483 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8"] Mar 18 09:43:01 crc kubenswrapper[4778]: I0318 09:43:01.813364 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 09:43:01 crc kubenswrapper[4778]: I0318 09:43:01.859904 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" event={"ID":"80a8d263-9bba-4db0-928e-f633b4ad5314","Type":"ContainerStarted","Data":"4183fc257c60cba8437e4d2ec61ceb6c06c87ebd92360ab71f51f3813dcba927"} Mar 18 09:43:02 crc kubenswrapper[4778]: I0318 09:43:02.871252 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" event={"ID":"80a8d263-9bba-4db0-928e-f633b4ad5314","Type":"ContainerStarted","Data":"45f5d946b0edf5c89ec37724c8001aa3f17b98318f01aa4199786bc97f369fdf"} Mar 18 09:43:02 crc kubenswrapper[4778]: I0318 09:43:02.902928 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" podStartSLOduration=2.293936868 podStartE2EDuration="2.902900749s" podCreationTimestamp="2026-03-18 09:43:00 +0000 UTC" firstStartedPulling="2026-03-18 09:43:01.813166722 +0000 UTC m=+2448.387911562" 
lastFinishedPulling="2026-03-18 09:43:02.422130573 +0000 UTC m=+2448.996875443" observedRunningTime="2026-03-18 09:43:02.894820189 +0000 UTC m=+2449.469565069" watchObservedRunningTime="2026-03-18 09:43:02.902900749 +0000 UTC m=+2449.477645619" Mar 18 09:43:09 crc kubenswrapper[4778]: I0318 09:43:09.937651 4778 generic.go:334] "Generic (PLEG): container finished" podID="80a8d263-9bba-4db0-928e-f633b4ad5314" containerID="45f5d946b0edf5c89ec37724c8001aa3f17b98318f01aa4199786bc97f369fdf" exitCode=0 Mar 18 09:43:09 crc kubenswrapper[4778]: I0318 09:43:09.937772 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" event={"ID":"80a8d263-9bba-4db0-928e-f633b4ad5314","Type":"ContainerDied","Data":"45f5d946b0edf5c89ec37724c8001aa3f17b98318f01aa4199786bc97f369fdf"} Mar 18 09:43:10 crc kubenswrapper[4778]: I0318 09:43:10.189360 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:43:10 crc kubenswrapper[4778]: E0318 09:43:10.189901 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.386112 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.425987 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr96f\" (UniqueName: \"kubernetes.io/projected/80a8d263-9bba-4db0-928e-f633b4ad5314-kube-api-access-lr96f\") pod \"80a8d263-9bba-4db0-928e-f633b4ad5314\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.426273 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-ssh-key-openstack-edpm-ipam\") pod \"80a8d263-9bba-4db0-928e-f633b4ad5314\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.426365 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-inventory\") pod \"80a8d263-9bba-4db0-928e-f633b4ad5314\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.426419 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-ceph\") pod \"80a8d263-9bba-4db0-928e-f633b4ad5314\" (UID: \"80a8d263-9bba-4db0-928e-f633b4ad5314\") " Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.433305 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80a8d263-9bba-4db0-928e-f633b4ad5314-kube-api-access-lr96f" (OuterVolumeSpecName: "kube-api-access-lr96f") pod "80a8d263-9bba-4db0-928e-f633b4ad5314" (UID: "80a8d263-9bba-4db0-928e-f633b4ad5314"). InnerVolumeSpecName "kube-api-access-lr96f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.433350 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-ceph" (OuterVolumeSpecName: "ceph") pod "80a8d263-9bba-4db0-928e-f633b4ad5314" (UID: "80a8d263-9bba-4db0-928e-f633b4ad5314"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.455110 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-inventory" (OuterVolumeSpecName: "inventory") pod "80a8d263-9bba-4db0-928e-f633b4ad5314" (UID: "80a8d263-9bba-4db0-928e-f633b4ad5314"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.456022 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "80a8d263-9bba-4db0-928e-f633b4ad5314" (UID: "80a8d263-9bba-4db0-928e-f633b4ad5314"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.528372 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr96f\" (UniqueName: \"kubernetes.io/projected/80a8d263-9bba-4db0-928e-f633b4ad5314-kube-api-access-lr96f\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.528419 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.528436 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.528450 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/80a8d263-9bba-4db0-928e-f633b4ad5314-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.965741 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" event={"ID":"80a8d263-9bba-4db0-928e-f633b4ad5314","Type":"ContainerDied","Data":"4183fc257c60cba8437e4d2ec61ceb6c06c87ebd92360ab71f51f3813dcba927"} Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.966320 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4183fc257c60cba8437e4d2ec61ceb6c06c87ebd92360ab71f51f3813dcba927" Mar 18 09:43:11 crc kubenswrapper[4778]: I0318 09:43:11.966050 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9bss8" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.054298 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7"] Mar 18 09:43:12 crc kubenswrapper[4778]: E0318 09:43:12.054879 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80a8d263-9bba-4db0-928e-f633b4ad5314" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.054905 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="80a8d263-9bba-4db0-928e-f633b4ad5314" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.055423 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="80a8d263-9bba-4db0-928e-f633b4ad5314" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.056401 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.059915 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.060057 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.060094 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.060129 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.060311 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.067386 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7"] Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.144795 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.145171 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzwll\" (UniqueName: \"kubernetes.io/projected/613d0a31-a371-4c66-8254-85a7cc864fd0-kube-api-access-gzwll\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7\" (UID: 
\"613d0a31-a371-4c66-8254-85a7cc864fd0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.145294 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.145401 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.248761 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.249128 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzwll\" (UniqueName: \"kubernetes.io/projected/613d0a31-a371-4c66-8254-85a7cc864fd0-kube-api-access-gzwll\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.249189 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.250362 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.255269 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.255994 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.262886 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.267177 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzwll\" (UniqueName: \"kubernetes.io/projected/613d0a31-a371-4c66-8254-85a7cc864fd0-kube-api-access-gzwll\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.380267 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.940755 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7"] Mar 18 09:43:12 crc kubenswrapper[4778]: I0318 09:43:12.975400 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" event={"ID":"613d0a31-a371-4c66-8254-85a7cc864fd0","Type":"ContainerStarted","Data":"5f314f77b9dfdc3bdb927436906d3194034e3084dc373fdaea997b04d7c042ff"} Mar 18 09:43:13 crc kubenswrapper[4778]: I0318 09:43:13.985924 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" event={"ID":"613d0a31-a371-4c66-8254-85a7cc864fd0","Type":"ContainerStarted","Data":"2de490467d479cfa037c02d250eacac3f30e179767b6a79d7df1d0233e63150e"} Mar 18 09:43:14 crc kubenswrapper[4778]: I0318 09:43:14.000959 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" podStartSLOduration=1.4068783790000001 podStartE2EDuration="2.000940136s" podCreationTimestamp="2026-03-18 09:43:12 +0000 UTC" firstStartedPulling="2026-03-18 09:43:12.939451117 +0000 UTC m=+2459.514195957" 
lastFinishedPulling="2026-03-18 09:43:13.533512834 +0000 UTC m=+2460.108257714" observedRunningTime="2026-03-18 09:43:13.999215109 +0000 UTC m=+2460.573959969" watchObservedRunningTime="2026-03-18 09:43:14.000940136 +0000 UTC m=+2460.575684976" Mar 18 09:43:23 crc kubenswrapper[4778]: I0318 09:43:23.075680 4778 generic.go:334] "Generic (PLEG): container finished" podID="613d0a31-a371-4c66-8254-85a7cc864fd0" containerID="2de490467d479cfa037c02d250eacac3f30e179767b6a79d7df1d0233e63150e" exitCode=0 Mar 18 09:43:23 crc kubenswrapper[4778]: I0318 09:43:23.075790 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" event={"ID":"613d0a31-a371-4c66-8254-85a7cc864fd0","Type":"ContainerDied","Data":"2de490467d479cfa037c02d250eacac3f30e179767b6a79d7df1d0233e63150e"} Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.201519 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:43:24 crc kubenswrapper[4778]: E0318 09:43:24.202036 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.537415 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.711245 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-ssh-key-openstack-edpm-ipam\") pod \"613d0a31-a371-4c66-8254-85a7cc864fd0\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.711325 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-ceph\") pod \"613d0a31-a371-4c66-8254-85a7cc864fd0\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.711356 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-inventory\") pod \"613d0a31-a371-4c66-8254-85a7cc864fd0\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.711478 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzwll\" (UniqueName: \"kubernetes.io/projected/613d0a31-a371-4c66-8254-85a7cc864fd0-kube-api-access-gzwll\") pod \"613d0a31-a371-4c66-8254-85a7cc864fd0\" (UID: \"613d0a31-a371-4c66-8254-85a7cc864fd0\") " Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.722458 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-ceph" (OuterVolumeSpecName: "ceph") pod "613d0a31-a371-4c66-8254-85a7cc864fd0" (UID: "613d0a31-a371-4c66-8254-85a7cc864fd0"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.723414 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/613d0a31-a371-4c66-8254-85a7cc864fd0-kube-api-access-gzwll" (OuterVolumeSpecName: "kube-api-access-gzwll") pod "613d0a31-a371-4c66-8254-85a7cc864fd0" (UID: "613d0a31-a371-4c66-8254-85a7cc864fd0"). InnerVolumeSpecName "kube-api-access-gzwll". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.739876 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-inventory" (OuterVolumeSpecName: "inventory") pod "613d0a31-a371-4c66-8254-85a7cc864fd0" (UID: "613d0a31-a371-4c66-8254-85a7cc864fd0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.759799 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "613d0a31-a371-4c66-8254-85a7cc864fd0" (UID: "613d0a31-a371-4c66-8254-85a7cc864fd0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.814641 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.814697 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.814708 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/613d0a31-a371-4c66-8254-85a7cc864fd0-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:24 crc kubenswrapper[4778]: I0318 09:43:24.814719 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzwll\" (UniqueName: \"kubernetes.io/projected/613d0a31-a371-4c66-8254-85a7cc864fd0-kube-api-access-gzwll\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.111704 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" event={"ID":"613d0a31-a371-4c66-8254-85a7cc864fd0","Type":"ContainerDied","Data":"5f314f77b9dfdc3bdb927436906d3194034e3084dc373fdaea997b04d7c042ff"} Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.112253 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f314f77b9dfdc3bdb927436906d3194034e3084dc373fdaea997b04d7c042ff" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.111879 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.241629 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd"] Mar 18 09:43:25 crc kubenswrapper[4778]: E0318 09:43:25.242130 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="613d0a31-a371-4c66-8254-85a7cc864fd0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.242148 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="613d0a31-a371-4c66-8254-85a7cc864fd0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.242400 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="613d0a31-a371-4c66-8254-85a7cc864fd0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.243287 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.247142 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.247818 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.248272 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.253092 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.254103 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.254835 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.255234 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.255363 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.261807 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd"] Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.429785 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.430566 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.430902 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.431166 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66rh6\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-kube-api-access-66rh6\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.431413 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.431628 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.431866 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.432049 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.432259 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ceph\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.432439 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.432637 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.432825 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.432988 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.535413 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.535883 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.536130 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66rh6\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-kube-api-access-66rh6\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.536297 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.536414 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.536538 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.536688 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.536809 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 
09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.536917 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.537049 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.537157 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.537334 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.537496 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.540152 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.541070 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.541690 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.544230 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.544865 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.546535 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.546676 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.547401 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" 
(UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.548547 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.548924 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.550538 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.556110 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.568940 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66rh6\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-kube-api-access-66rh6\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfphd\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:25 crc kubenswrapper[4778]: I0318 09:43:25.863523 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:26 crc kubenswrapper[4778]: I0318 09:43:26.442983 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd"] Mar 18 09:43:27 crc kubenswrapper[4778]: I0318 09:43:27.134467 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" event={"ID":"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5","Type":"ContainerStarted","Data":"c2db1237b9f4739f87209ae21a67408657e301817a1a279b85ac382dd5fae289"} Mar 18 09:43:28 crc kubenswrapper[4778]: I0318 09:43:28.147874 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" event={"ID":"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5","Type":"ContainerStarted","Data":"b744a025e93192ec362ad8429af879e71a000131cc9b2858681fa40afa9f7623"} Mar 18 09:43:28 crc kubenswrapper[4778]: I0318 09:43:28.193473 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" podStartSLOduration=2.722591207 podStartE2EDuration="3.193440902s" podCreationTimestamp="2026-03-18 09:43:25 +0000 UTC" firstStartedPulling="2026-03-18 09:43:26.446518241 +0000 UTC 
m=+2473.021263081" lastFinishedPulling="2026-03-18 09:43:26.917367926 +0000 UTC m=+2473.492112776" observedRunningTime="2026-03-18 09:43:28.181911979 +0000 UTC m=+2474.756656889" watchObservedRunningTime="2026-03-18 09:43:28.193440902 +0000 UTC m=+2474.768185782" Mar 18 09:43:38 crc kubenswrapper[4778]: I0318 09:43:38.187561 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:43:38 crc kubenswrapper[4778]: E0318 09:43:38.188874 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:43:51 crc kubenswrapper[4778]: I0318 09:43:51.187031 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:43:51 crc kubenswrapper[4778]: E0318 09:43:51.187984 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:43:56 crc kubenswrapper[4778]: I0318 09:43:56.422767 4778 generic.go:334] "Generic (PLEG): container finished" podID="7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" containerID="b744a025e93192ec362ad8429af879e71a000131cc9b2858681fa40afa9f7623" exitCode=0 Mar 18 09:43:56 crc kubenswrapper[4778]: I0318 09:43:56.422837 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" event={"ID":"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5","Type":"ContainerDied","Data":"b744a025e93192ec362ad8429af879e71a000131cc9b2858681fa40afa9f7623"} Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.833827 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.891946 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ovn-combined-ca-bundle\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.892118 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.892163 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.892219 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-libvirt-combined-ca-bundle\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: 
\"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.892260 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ceph\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.892286 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-inventory\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.892306 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.892347 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66rh6\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-kube-api-access-66rh6\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.892369 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ssh-key-openstack-edpm-ipam\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.892424 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-repo-setup-combined-ca-bundle\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.892449 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-neutron-metadata-combined-ca-bundle\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.892474 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-bootstrap-combined-ca-bundle\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.892571 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-nova-combined-ca-bundle\") pod \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\" (UID: \"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5\") " Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.898968 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-kube-api-access-66rh6" (OuterVolumeSpecName: "kube-api-access-66rh6") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "kube-api-access-66rh6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.899302 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.899783 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.899782 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.900559 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.901189 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.901394 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.901720 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ceph" (OuterVolumeSpecName: "ceph") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.902304 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.903614 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.907641 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.934403 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.937993 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-inventory" (OuterVolumeSpecName: "inventory") pod "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" (UID: "7c70009e-cfb3-4598-9ae4-f1d90a2a63d5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994785 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994831 4778 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994846 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994860 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994874 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994886 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66rh6\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-kube-api-access-66rh6\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994899 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994912 4778 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994924 4778 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994936 4778 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994948 4778 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994960 4778 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:57 crc kubenswrapper[4778]: I0318 09:43:57.994973 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7c70009e-cfb3-4598-9ae4-f1d90a2a63d5-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.447390 4778 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" event={"ID":"7c70009e-cfb3-4598-9ae4-f1d90a2a63d5","Type":"ContainerDied","Data":"c2db1237b9f4739f87209ae21a67408657e301817a1a279b85ac382dd5fae289"} Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.447450 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2db1237b9f4739f87209ae21a67408657e301817a1a279b85ac382dd5fae289" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.447519 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfphd" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.566229 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv"] Mar 18 09:43:58 crc kubenswrapper[4778]: E0318 09:43:58.566949 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.567094 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.567434 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c70009e-cfb3-4598-9ae4-f1d90a2a63d5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.568249 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.573474 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.573474 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.574174 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.574449 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.574664 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.589483 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv"] Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.710906 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.711126 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " 
pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.711337 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.711421 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzp4p\" (UniqueName: \"kubernetes.io/projected/fed5a515-ed14-40f1-9282-4e87fe319bf6-kube-api-access-bzp4p\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.814258 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.814434 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.814517 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bzp4p\" (UniqueName: \"kubernetes.io/projected/fed5a515-ed14-40f1-9282-4e87fe319bf6-kube-api-access-bzp4p\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.814579 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.820716 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.820754 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.824614 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " 
pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.833181 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzp4p\" (UniqueName: \"kubernetes.io/projected/fed5a515-ed14-40f1-9282-4e87fe319bf6-kube-api-access-bzp4p\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:58 crc kubenswrapper[4778]: I0318 09:43:58.910174 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:43:59 crc kubenswrapper[4778]: I0318 09:43:59.532665 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv"] Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.142152 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563784-pdds9"] Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.143632 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563784-pdds9" Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.145660 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.145784 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.145979 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.157679 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563784-pdds9"] Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.242367 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqkzb\" (UniqueName: \"kubernetes.io/projected/ab4f60ce-be48-4052-9fa7-905b70e65c3a-kube-api-access-nqkzb\") pod \"auto-csr-approver-29563784-pdds9\" (UID: \"ab4f60ce-be48-4052-9fa7-905b70e65c3a\") " pod="openshift-infra/auto-csr-approver-29563784-pdds9" Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.343877 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqkzb\" (UniqueName: \"kubernetes.io/projected/ab4f60ce-be48-4052-9fa7-905b70e65c3a-kube-api-access-nqkzb\") pod \"auto-csr-approver-29563784-pdds9\" (UID: \"ab4f60ce-be48-4052-9fa7-905b70e65c3a\") " pod="openshift-infra/auto-csr-approver-29563784-pdds9" Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.373647 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqkzb\" (UniqueName: \"kubernetes.io/projected/ab4f60ce-be48-4052-9fa7-905b70e65c3a-kube-api-access-nqkzb\") pod \"auto-csr-approver-29563784-pdds9\" (UID: \"ab4f60ce-be48-4052-9fa7-905b70e65c3a\") " 
pod="openshift-infra/auto-csr-approver-29563784-pdds9" Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.464823 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563784-pdds9" Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.470105 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" event={"ID":"fed5a515-ed14-40f1-9282-4e87fe319bf6","Type":"ContainerStarted","Data":"2804ae64e5a590f30623aba762efec2399df5cd3bd84f7d7167e025650c8d12c"} Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.470142 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" event={"ID":"fed5a515-ed14-40f1-9282-4e87fe319bf6","Type":"ContainerStarted","Data":"96de5a1856c48d9ae2a5c28917debb3937af347a9bc8b2632de45457e8977720"} Mar 18 09:44:00 crc kubenswrapper[4778]: I0318 09:44:00.511864 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" podStartSLOduration=2.036255582 podStartE2EDuration="2.511838647s" podCreationTimestamp="2026-03-18 09:43:58 +0000 UTC" firstStartedPulling="2026-03-18 09:43:59.536667705 +0000 UTC m=+2506.111412545" lastFinishedPulling="2026-03-18 09:44:00.01225077 +0000 UTC m=+2506.586995610" observedRunningTime="2026-03-18 09:44:00.497082616 +0000 UTC m=+2507.071827536" watchObservedRunningTime="2026-03-18 09:44:00.511838647 +0000 UTC m=+2507.086583497" Mar 18 09:44:00 crc kubenswrapper[4778]: W0318 09:44:00.999463 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab4f60ce_be48_4052_9fa7_905b70e65c3a.slice/crio-4db45ae02348a6f3c6a84ec8f999c7bd96e6ba2c870cdd5ed8bf4e627f0f149f WatchSource:0}: Error finding container 4db45ae02348a6f3c6a84ec8f999c7bd96e6ba2c870cdd5ed8bf4e627f0f149f: 
Status 404 returned error can't find the container with id 4db45ae02348a6f3c6a84ec8f999c7bd96e6ba2c870cdd5ed8bf4e627f0f149f Mar 18 09:44:01 crc kubenswrapper[4778]: I0318 09:44:01.000620 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563784-pdds9"] Mar 18 09:44:01 crc kubenswrapper[4778]: I0318 09:44:01.480181 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563784-pdds9" event={"ID":"ab4f60ce-be48-4052-9fa7-905b70e65c3a","Type":"ContainerStarted","Data":"4db45ae02348a6f3c6a84ec8f999c7bd96e6ba2c870cdd5ed8bf4e627f0f149f"} Mar 18 09:44:02 crc kubenswrapper[4778]: I0318 09:44:02.488974 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563784-pdds9" event={"ID":"ab4f60ce-be48-4052-9fa7-905b70e65c3a","Type":"ContainerStarted","Data":"8f06326ea2269b974cec86640f654b1a9c686fe2bd62106254f2a194a59e658e"} Mar 18 09:44:02 crc kubenswrapper[4778]: I0318 09:44:02.506937 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563784-pdds9" podStartSLOduration=1.433092322 podStartE2EDuration="2.506920836s" podCreationTimestamp="2026-03-18 09:44:00 +0000 UTC" firstStartedPulling="2026-03-18 09:44:01.003299023 +0000 UTC m=+2507.578043873" lastFinishedPulling="2026-03-18 09:44:02.077127527 +0000 UTC m=+2508.651872387" observedRunningTime="2026-03-18 09:44:02.501806937 +0000 UTC m=+2509.076551787" watchObservedRunningTime="2026-03-18 09:44:02.506920836 +0000 UTC m=+2509.081665676" Mar 18 09:44:03 crc kubenswrapper[4778]: I0318 09:44:03.497951 4778 generic.go:334] "Generic (PLEG): container finished" podID="ab4f60ce-be48-4052-9fa7-905b70e65c3a" containerID="8f06326ea2269b974cec86640f654b1a9c686fe2bd62106254f2a194a59e658e" exitCode=0 Mar 18 09:44:03 crc kubenswrapper[4778]: I0318 09:44:03.498063 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29563784-pdds9" event={"ID":"ab4f60ce-be48-4052-9fa7-905b70e65c3a","Type":"ContainerDied","Data":"8f06326ea2269b974cec86640f654b1a9c686fe2bd62106254f2a194a59e658e"} Mar 18 09:44:04 crc kubenswrapper[4778]: I0318 09:44:04.874049 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563784-pdds9" Mar 18 09:44:04 crc kubenswrapper[4778]: I0318 09:44:04.971685 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqkzb\" (UniqueName: \"kubernetes.io/projected/ab4f60ce-be48-4052-9fa7-905b70e65c3a-kube-api-access-nqkzb\") pod \"ab4f60ce-be48-4052-9fa7-905b70e65c3a\" (UID: \"ab4f60ce-be48-4052-9fa7-905b70e65c3a\") " Mar 18 09:44:04 crc kubenswrapper[4778]: I0318 09:44:04.978642 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab4f60ce-be48-4052-9fa7-905b70e65c3a-kube-api-access-nqkzb" (OuterVolumeSpecName: "kube-api-access-nqkzb") pod "ab4f60ce-be48-4052-9fa7-905b70e65c3a" (UID: "ab4f60ce-be48-4052-9fa7-905b70e65c3a"). InnerVolumeSpecName "kube-api-access-nqkzb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:44:05 crc kubenswrapper[4778]: I0318 09:44:05.074380 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqkzb\" (UniqueName: \"kubernetes.io/projected/ab4f60ce-be48-4052-9fa7-905b70e65c3a-kube-api-access-nqkzb\") on node \"crc\" DevicePath \"\"" Mar 18 09:44:05 crc kubenswrapper[4778]: I0318 09:44:05.528679 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563784-pdds9" event={"ID":"ab4f60ce-be48-4052-9fa7-905b70e65c3a","Type":"ContainerDied","Data":"4db45ae02348a6f3c6a84ec8f999c7bd96e6ba2c870cdd5ed8bf4e627f0f149f"} Mar 18 09:44:05 crc kubenswrapper[4778]: I0318 09:44:05.529178 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4db45ae02348a6f3c6a84ec8f999c7bd96e6ba2c870cdd5ed8bf4e627f0f149f" Mar 18 09:44:05 crc kubenswrapper[4778]: I0318 09:44:05.529344 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563784-pdds9" Mar 18 09:44:05 crc kubenswrapper[4778]: I0318 09:44:05.536375 4778 generic.go:334] "Generic (PLEG): container finished" podID="fed5a515-ed14-40f1-9282-4e87fe319bf6" containerID="2804ae64e5a590f30623aba762efec2399df5cd3bd84f7d7167e025650c8d12c" exitCode=0 Mar 18 09:44:05 crc kubenswrapper[4778]: I0318 09:44:05.536457 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" event={"ID":"fed5a515-ed14-40f1-9282-4e87fe319bf6","Type":"ContainerDied","Data":"2804ae64e5a590f30623aba762efec2399df5cd3bd84f7d7167e025650c8d12c"} Mar 18 09:44:05 crc kubenswrapper[4778]: I0318 09:44:05.605493 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563778-4dm5p"] Mar 18 09:44:05 crc kubenswrapper[4778]: I0318 09:44:05.617966 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-infra/auto-csr-approver-29563778-4dm5p"] Mar 18 09:44:06 crc kubenswrapper[4778]: I0318 09:44:06.187614 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:44:06 crc kubenswrapper[4778]: E0318 09:44:06.188240 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:44:06 crc kubenswrapper[4778]: I0318 09:44:06.198831 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0548485b-4f03-47ba-8a13-4e3522451291" path="/var/lib/kubelet/pods/0548485b-4f03-47ba-8a13-4e3522451291/volumes" Mar 18 09:44:06 crc kubenswrapper[4778]: I0318 09:44:06.904039 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.011784 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-inventory\") pod \"fed5a515-ed14-40f1-9282-4e87fe319bf6\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.011900 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzp4p\" (UniqueName: \"kubernetes.io/projected/fed5a515-ed14-40f1-9282-4e87fe319bf6-kube-api-access-bzp4p\") pod \"fed5a515-ed14-40f1-9282-4e87fe319bf6\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.011930 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-ceph\") pod \"fed5a515-ed14-40f1-9282-4e87fe319bf6\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.012077 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-ssh-key-openstack-edpm-ipam\") pod \"fed5a515-ed14-40f1-9282-4e87fe319bf6\" (UID: \"fed5a515-ed14-40f1-9282-4e87fe319bf6\") " Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.017763 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-ceph" (OuterVolumeSpecName: "ceph") pod "fed5a515-ed14-40f1-9282-4e87fe319bf6" (UID: "fed5a515-ed14-40f1-9282-4e87fe319bf6"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.023595 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fed5a515-ed14-40f1-9282-4e87fe319bf6-kube-api-access-bzp4p" (OuterVolumeSpecName: "kube-api-access-bzp4p") pod "fed5a515-ed14-40f1-9282-4e87fe319bf6" (UID: "fed5a515-ed14-40f1-9282-4e87fe319bf6"). InnerVolumeSpecName "kube-api-access-bzp4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.042863 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fed5a515-ed14-40f1-9282-4e87fe319bf6" (UID: "fed5a515-ed14-40f1-9282-4e87fe319bf6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.056217 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-inventory" (OuterVolumeSpecName: "inventory") pod "fed5a515-ed14-40f1-9282-4e87fe319bf6" (UID: "fed5a515-ed14-40f1-9282-4e87fe319bf6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.114714 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.114755 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzp4p\" (UniqueName: \"kubernetes.io/projected/fed5a515-ed14-40f1-9282-4e87fe319bf6-kube-api-access-bzp4p\") on node \"crc\" DevicePath \"\"" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.114770 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.114782 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fed5a515-ed14-40f1-9282-4e87fe319bf6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.559547 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" event={"ID":"fed5a515-ed14-40f1-9282-4e87fe319bf6","Type":"ContainerDied","Data":"96de5a1856c48d9ae2a5c28917debb3937af347a9bc8b2632de45457e8977720"} Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.559606 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96de5a1856c48d9ae2a5c28917debb3937af347a9bc8b2632de45457e8977720" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.559678 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.662538 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd"] Mar 18 09:44:07 crc kubenswrapper[4778]: E0318 09:44:07.662997 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fed5a515-ed14-40f1-9282-4e87fe319bf6" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.663025 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed5a515-ed14-40f1-9282-4e87fe319bf6" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 18 09:44:07 crc kubenswrapper[4778]: E0318 09:44:07.663040 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab4f60ce-be48-4052-9fa7-905b70e65c3a" containerName="oc" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.663049 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab4f60ce-be48-4052-9fa7-905b70e65c3a" containerName="oc" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.663296 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="fed5a515-ed14-40f1-9282-4e87fe319bf6" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.663330 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab4f60ce-be48-4052-9fa7-905b70e65c3a" containerName="oc" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.665126 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.667596 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.667910 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.670414 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.678661 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.678704 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.678661 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.694329 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd"] Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.826588 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc8df\" (UniqueName: \"kubernetes.io/projected/1f0f4177-ad12-4848-bbd7-39b004344cb3-kube-api-access-fc8df\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.826682 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/1f0f4177-ad12-4848-bbd7-39b004344cb3-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.826778 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.826815 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.826847 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.827025 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.929357 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.929478 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.929567 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.930585 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.930710 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc8df\" (UniqueName: 
\"kubernetes.io/projected/1f0f4177-ad12-4848-bbd7-39b004344cb3-kube-api-access-fc8df\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.930768 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1f0f4177-ad12-4848-bbd7-39b004344cb3-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.932459 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1f0f4177-ad12-4848-bbd7-39b004344cb3-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.935819 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.936324 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: 
I0318 09:44:07.936694 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.938455 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.966454 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc8df\" (UniqueName: \"kubernetes.io/projected/1f0f4177-ad12-4848-bbd7-39b004344cb3-kube-api-access-fc8df\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7jqhd\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:07 crc kubenswrapper[4778]: I0318 09:44:07.994448 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:44:08 crc kubenswrapper[4778]: I0318 09:44:08.564888 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd"] Mar 18 09:44:09 crc kubenswrapper[4778]: I0318 09:44:09.577317 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" event={"ID":"1f0f4177-ad12-4848-bbd7-39b004344cb3","Type":"ContainerStarted","Data":"c96da0fdc9f23d1c8174300e8944755e5546994203de0c9b38e19a45beb705b3"} Mar 18 09:44:09 crc kubenswrapper[4778]: I0318 09:44:09.578719 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" event={"ID":"1f0f4177-ad12-4848-bbd7-39b004344cb3","Type":"ContainerStarted","Data":"54b76851245c233c2784f282b1ca1eb9cfa025c851c32932417d057083ffca1c"} Mar 18 09:44:09 crc kubenswrapper[4778]: I0318 09:44:09.607622 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" podStartSLOduration=2.10382655 podStartE2EDuration="2.607606851s" podCreationTimestamp="2026-03-18 09:44:07 +0000 UTC" firstStartedPulling="2026-03-18 09:44:08.572044948 +0000 UTC m=+2515.146789798" lastFinishedPulling="2026-03-18 09:44:09.075825249 +0000 UTC m=+2515.650570099" observedRunningTime="2026-03-18 09:44:09.605831124 +0000 UTC m=+2516.180576004" watchObservedRunningTime="2026-03-18 09:44:09.607606851 +0000 UTC m=+2516.182351681" Mar 18 09:44:20 crc kubenswrapper[4778]: I0318 09:44:20.188078 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:44:20 crc kubenswrapper[4778]: E0318 09:44:20.189004 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:44:31 crc kubenswrapper[4778]: I0318 09:44:31.187121 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:44:31 crc kubenswrapper[4778]: E0318 09:44:31.188270 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:44:32 crc kubenswrapper[4778]: I0318 09:44:32.328588 4778 scope.go:117] "RemoveContainer" containerID="f6adcd9d5f24124681eed0d00263f7ac4a19be40ad724c067b9849cb1ce141e4" Mar 18 09:44:34 crc kubenswrapper[4778]: I0318 09:44:34.724554 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ftnlp"] Mar 18 09:44:34 crc kubenswrapper[4778]: I0318 09:44:34.729639 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:34 crc kubenswrapper[4778]: I0318 09:44:34.736564 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ftnlp"] Mar 18 09:44:34 crc kubenswrapper[4778]: I0318 09:44:34.826430 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76a702d2-54ab-444c-bf6c-cc815acef4d7-catalog-content\") pod \"certified-operators-ftnlp\" (UID: \"76a702d2-54ab-444c-bf6c-cc815acef4d7\") " pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:34 crc kubenswrapper[4778]: I0318 09:44:34.826495 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76a702d2-54ab-444c-bf6c-cc815acef4d7-utilities\") pod \"certified-operators-ftnlp\" (UID: \"76a702d2-54ab-444c-bf6c-cc815acef4d7\") " pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:34 crc kubenswrapper[4778]: I0318 09:44:34.826665 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxnqx\" (UniqueName: \"kubernetes.io/projected/76a702d2-54ab-444c-bf6c-cc815acef4d7-kube-api-access-vxnqx\") pod \"certified-operators-ftnlp\" (UID: \"76a702d2-54ab-444c-bf6c-cc815acef4d7\") " pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:34 crc kubenswrapper[4778]: I0318 09:44:34.928459 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxnqx\" (UniqueName: \"kubernetes.io/projected/76a702d2-54ab-444c-bf6c-cc815acef4d7-kube-api-access-vxnqx\") pod \"certified-operators-ftnlp\" (UID: \"76a702d2-54ab-444c-bf6c-cc815acef4d7\") " pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:34 crc kubenswrapper[4778]: I0318 09:44:34.928644 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76a702d2-54ab-444c-bf6c-cc815acef4d7-catalog-content\") pod \"certified-operators-ftnlp\" (UID: \"76a702d2-54ab-444c-bf6c-cc815acef4d7\") " pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:34 crc kubenswrapper[4778]: I0318 09:44:34.928668 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76a702d2-54ab-444c-bf6c-cc815acef4d7-utilities\") pod \"certified-operators-ftnlp\" (UID: \"76a702d2-54ab-444c-bf6c-cc815acef4d7\") " pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:34 crc kubenswrapper[4778]: I0318 09:44:34.929374 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76a702d2-54ab-444c-bf6c-cc815acef4d7-catalog-content\") pod \"certified-operators-ftnlp\" (UID: \"76a702d2-54ab-444c-bf6c-cc815acef4d7\") " pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:34 crc kubenswrapper[4778]: I0318 09:44:34.929506 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76a702d2-54ab-444c-bf6c-cc815acef4d7-utilities\") pod \"certified-operators-ftnlp\" (UID: \"76a702d2-54ab-444c-bf6c-cc815acef4d7\") " pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:34 crc kubenswrapper[4778]: I0318 09:44:34.952939 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxnqx\" (UniqueName: \"kubernetes.io/projected/76a702d2-54ab-444c-bf6c-cc815acef4d7-kube-api-access-vxnqx\") pod \"certified-operators-ftnlp\" (UID: \"76a702d2-54ab-444c-bf6c-cc815acef4d7\") " pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:35 crc kubenswrapper[4778]: I0318 09:44:35.049436 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:35 crc kubenswrapper[4778]: I0318 09:44:35.551823 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ftnlp"] Mar 18 09:44:35 crc kubenswrapper[4778]: I0318 09:44:35.830673 4778 generic.go:334] "Generic (PLEG): container finished" podID="76a702d2-54ab-444c-bf6c-cc815acef4d7" containerID="ee2ac484bea0833bf3482d8f2411a74b12238508bfddc92019f577085a5bbb3e" exitCode=0 Mar 18 09:44:35 crc kubenswrapper[4778]: I0318 09:44:35.830789 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftnlp" event={"ID":"76a702d2-54ab-444c-bf6c-cc815acef4d7","Type":"ContainerDied","Data":"ee2ac484bea0833bf3482d8f2411a74b12238508bfddc92019f577085a5bbb3e"} Mar 18 09:44:35 crc kubenswrapper[4778]: I0318 09:44:35.831027 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftnlp" event={"ID":"76a702d2-54ab-444c-bf6c-cc815acef4d7","Type":"ContainerStarted","Data":"3e1b274b36934e80c0a1d5e469bbaee10925f2146c30a4ddf513987aa5e061ef"} Mar 18 09:44:36 crc kubenswrapper[4778]: I0318 09:44:36.844458 4778 generic.go:334] "Generic (PLEG): container finished" podID="76a702d2-54ab-444c-bf6c-cc815acef4d7" containerID="f2069345ce9dc9c72228fc6f3cb0ce32061bfdff257519deb07ce8ca396ae900" exitCode=0 Mar 18 09:44:36 crc kubenswrapper[4778]: I0318 09:44:36.844559 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftnlp" event={"ID":"76a702d2-54ab-444c-bf6c-cc815acef4d7","Type":"ContainerDied","Data":"f2069345ce9dc9c72228fc6f3cb0ce32061bfdff257519deb07ce8ca396ae900"} Mar 18 09:44:37 crc kubenswrapper[4778]: I0318 09:44:37.878620 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftnlp" 
event={"ID":"76a702d2-54ab-444c-bf6c-cc815acef4d7","Type":"ContainerStarted","Data":"79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c"} Mar 18 09:44:37 crc kubenswrapper[4778]: I0318 09:44:37.903673 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ftnlp" podStartSLOduration=2.439522425 podStartE2EDuration="3.903654503s" podCreationTimestamp="2026-03-18 09:44:34 +0000 UTC" firstStartedPulling="2026-03-18 09:44:35.832210688 +0000 UTC m=+2542.406955528" lastFinishedPulling="2026-03-18 09:44:37.296342766 +0000 UTC m=+2543.871087606" observedRunningTime="2026-03-18 09:44:37.899398527 +0000 UTC m=+2544.474143367" watchObservedRunningTime="2026-03-18 09:44:37.903654503 +0000 UTC m=+2544.478399343" Mar 18 09:44:42 crc kubenswrapper[4778]: I0318 09:44:42.189317 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:44:42 crc kubenswrapper[4778]: E0318 09:44:42.190599 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:44:45 crc kubenswrapper[4778]: I0318 09:44:45.050252 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:45 crc kubenswrapper[4778]: I0318 09:44:45.050560 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:45 crc kubenswrapper[4778]: I0318 09:44:45.125364 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:46 crc kubenswrapper[4778]: I0318 09:44:46.029304 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:46 crc kubenswrapper[4778]: I0318 09:44:46.070719 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ftnlp"] Mar 18 09:44:47 crc kubenswrapper[4778]: I0318 09:44:47.998112 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ftnlp" podUID="76a702d2-54ab-444c-bf6c-cc815acef4d7" containerName="registry-server" containerID="cri-o://79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c" gracePeriod=2 Mar 18 09:44:48 crc kubenswrapper[4778]: I0318 09:44:48.465443 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:48 crc kubenswrapper[4778]: I0318 09:44:48.590166 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxnqx\" (UniqueName: \"kubernetes.io/projected/76a702d2-54ab-444c-bf6c-cc815acef4d7-kube-api-access-vxnqx\") pod \"76a702d2-54ab-444c-bf6c-cc815acef4d7\" (UID: \"76a702d2-54ab-444c-bf6c-cc815acef4d7\") " Mar 18 09:44:48 crc kubenswrapper[4778]: I0318 09:44:48.590564 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76a702d2-54ab-444c-bf6c-cc815acef4d7-catalog-content\") pod \"76a702d2-54ab-444c-bf6c-cc815acef4d7\" (UID: \"76a702d2-54ab-444c-bf6c-cc815acef4d7\") " Mar 18 09:44:48 crc kubenswrapper[4778]: I0318 09:44:48.590801 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76a702d2-54ab-444c-bf6c-cc815acef4d7-utilities\") pod 
\"76a702d2-54ab-444c-bf6c-cc815acef4d7\" (UID: \"76a702d2-54ab-444c-bf6c-cc815acef4d7\") " Mar 18 09:44:48 crc kubenswrapper[4778]: I0318 09:44:48.592491 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76a702d2-54ab-444c-bf6c-cc815acef4d7-utilities" (OuterVolumeSpecName: "utilities") pod "76a702d2-54ab-444c-bf6c-cc815acef4d7" (UID: "76a702d2-54ab-444c-bf6c-cc815acef4d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:44:48 crc kubenswrapper[4778]: I0318 09:44:48.595006 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76a702d2-54ab-444c-bf6c-cc815acef4d7-kube-api-access-vxnqx" (OuterVolumeSpecName: "kube-api-access-vxnqx") pod "76a702d2-54ab-444c-bf6c-cc815acef4d7" (UID: "76a702d2-54ab-444c-bf6c-cc815acef4d7"). InnerVolumeSpecName "kube-api-access-vxnqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:44:48 crc kubenswrapper[4778]: I0318 09:44:48.665171 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76a702d2-54ab-444c-bf6c-cc815acef4d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76a702d2-54ab-444c-bf6c-cc815acef4d7" (UID: "76a702d2-54ab-444c-bf6c-cc815acef4d7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:44:48 crc kubenswrapper[4778]: I0318 09:44:48.693533 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76a702d2-54ab-444c-bf6c-cc815acef4d7-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:44:48 crc kubenswrapper[4778]: I0318 09:44:48.693571 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxnqx\" (UniqueName: \"kubernetes.io/projected/76a702d2-54ab-444c-bf6c-cc815acef4d7-kube-api-access-vxnqx\") on node \"crc\" DevicePath \"\"" Mar 18 09:44:48 crc kubenswrapper[4778]: I0318 09:44:48.693588 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76a702d2-54ab-444c-bf6c-cc815acef4d7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.028110 4778 generic.go:334] "Generic (PLEG): container finished" podID="76a702d2-54ab-444c-bf6c-cc815acef4d7" containerID="79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c" exitCode=0 Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.028151 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftnlp" event={"ID":"76a702d2-54ab-444c-bf6c-cc815acef4d7","Type":"ContainerDied","Data":"79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c"} Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.028179 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftnlp" event={"ID":"76a702d2-54ab-444c-bf6c-cc815acef4d7","Type":"ContainerDied","Data":"3e1b274b36934e80c0a1d5e469bbaee10925f2146c30a4ddf513987aa5e061ef"} Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.028179 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ftnlp" Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.028248 4778 scope.go:117] "RemoveContainer" containerID="79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c" Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.061334 4778 scope.go:117] "RemoveContainer" containerID="f2069345ce9dc9c72228fc6f3cb0ce32061bfdff257519deb07ce8ca396ae900" Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.068222 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ftnlp"] Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.077214 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ftnlp"] Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.086067 4778 scope.go:117] "RemoveContainer" containerID="ee2ac484bea0833bf3482d8f2411a74b12238508bfddc92019f577085a5bbb3e" Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.121938 4778 scope.go:117] "RemoveContainer" containerID="79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c" Mar 18 09:44:49 crc kubenswrapper[4778]: E0318 09:44:49.122353 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c\": container with ID starting with 79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c not found: ID does not exist" containerID="79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c" Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.122406 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c"} err="failed to get container status \"79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c\": rpc error: code = NotFound desc = could not find 
container \"79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c\": container with ID starting with 79df76cf9e63540ae84c61593da0771f7cc48d7ebdc6a42eca68c018d736604c not found: ID does not exist" Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.122436 4778 scope.go:117] "RemoveContainer" containerID="f2069345ce9dc9c72228fc6f3cb0ce32061bfdff257519deb07ce8ca396ae900" Mar 18 09:44:49 crc kubenswrapper[4778]: E0318 09:44:49.122767 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2069345ce9dc9c72228fc6f3cb0ce32061bfdff257519deb07ce8ca396ae900\": container with ID starting with f2069345ce9dc9c72228fc6f3cb0ce32061bfdff257519deb07ce8ca396ae900 not found: ID does not exist" containerID="f2069345ce9dc9c72228fc6f3cb0ce32061bfdff257519deb07ce8ca396ae900" Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.122804 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2069345ce9dc9c72228fc6f3cb0ce32061bfdff257519deb07ce8ca396ae900"} err="failed to get container status \"f2069345ce9dc9c72228fc6f3cb0ce32061bfdff257519deb07ce8ca396ae900\": rpc error: code = NotFound desc = could not find container \"f2069345ce9dc9c72228fc6f3cb0ce32061bfdff257519deb07ce8ca396ae900\": container with ID starting with f2069345ce9dc9c72228fc6f3cb0ce32061bfdff257519deb07ce8ca396ae900 not found: ID does not exist" Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.122827 4778 scope.go:117] "RemoveContainer" containerID="ee2ac484bea0833bf3482d8f2411a74b12238508bfddc92019f577085a5bbb3e" Mar 18 09:44:49 crc kubenswrapper[4778]: E0318 09:44:49.123095 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee2ac484bea0833bf3482d8f2411a74b12238508bfddc92019f577085a5bbb3e\": container with ID starting with ee2ac484bea0833bf3482d8f2411a74b12238508bfddc92019f577085a5bbb3e not found: ID does 
not exist" containerID="ee2ac484bea0833bf3482d8f2411a74b12238508bfddc92019f577085a5bbb3e" Mar 18 09:44:49 crc kubenswrapper[4778]: I0318 09:44:49.123124 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee2ac484bea0833bf3482d8f2411a74b12238508bfddc92019f577085a5bbb3e"} err="failed to get container status \"ee2ac484bea0833bf3482d8f2411a74b12238508bfddc92019f577085a5bbb3e\": rpc error: code = NotFound desc = could not find container \"ee2ac484bea0833bf3482d8f2411a74b12238508bfddc92019f577085a5bbb3e\": container with ID starting with ee2ac484bea0833bf3482d8f2411a74b12238508bfddc92019f577085a5bbb3e not found: ID does not exist" Mar 18 09:44:50 crc kubenswrapper[4778]: I0318 09:44:50.207455 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76a702d2-54ab-444c-bf6c-cc815acef4d7" path="/var/lib/kubelet/pods/76a702d2-54ab-444c-bf6c-cc815acef4d7/volumes" Mar 18 09:44:53 crc kubenswrapper[4778]: I0318 09:44:53.188076 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:44:53 crc kubenswrapper[4778]: E0318 09:44:53.189242 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.172532 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq"] Mar 18 09:45:00 crc kubenswrapper[4778]: E0318 09:45:00.173415 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a702d2-54ab-444c-bf6c-cc815acef4d7" 
containerName="extract-utilities" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.173431 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a702d2-54ab-444c-bf6c-cc815acef4d7" containerName="extract-utilities" Mar 18 09:45:00 crc kubenswrapper[4778]: E0318 09:45:00.173455 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a702d2-54ab-444c-bf6c-cc815acef4d7" containerName="registry-server" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.173463 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a702d2-54ab-444c-bf6c-cc815acef4d7" containerName="registry-server" Mar 18 09:45:00 crc kubenswrapper[4778]: E0318 09:45:00.173492 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a702d2-54ab-444c-bf6c-cc815acef4d7" containerName="extract-content" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.173499 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a702d2-54ab-444c-bf6c-cc815acef4d7" containerName="extract-content" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.173717 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="76a702d2-54ab-444c-bf6c-cc815acef4d7" containerName="registry-server" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.174520 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.177250 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.184466 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.204342 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq"] Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.261168 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/956ed194-df94-4b74-919f-9cdcfbdcf5a7-secret-volume\") pod \"collect-profiles-29563785-ptknq\" (UID: \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.261427 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxkdc\" (UniqueName: \"kubernetes.io/projected/956ed194-df94-4b74-919f-9cdcfbdcf5a7-kube-api-access-hxkdc\") pod \"collect-profiles-29563785-ptknq\" (UID: \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.261455 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/956ed194-df94-4b74-919f-9cdcfbdcf5a7-config-volume\") pod \"collect-profiles-29563785-ptknq\" (UID: \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.362734 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxkdc\" (UniqueName: \"kubernetes.io/projected/956ed194-df94-4b74-919f-9cdcfbdcf5a7-kube-api-access-hxkdc\") pod \"collect-profiles-29563785-ptknq\" (UID: \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.362778 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/956ed194-df94-4b74-919f-9cdcfbdcf5a7-config-volume\") pod \"collect-profiles-29563785-ptknq\" (UID: \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.362809 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/956ed194-df94-4b74-919f-9cdcfbdcf5a7-secret-volume\") pod \"collect-profiles-29563785-ptknq\" (UID: \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.364223 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/956ed194-df94-4b74-919f-9cdcfbdcf5a7-config-volume\") pod \"collect-profiles-29563785-ptknq\" (UID: \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.367883 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/956ed194-df94-4b74-919f-9cdcfbdcf5a7-secret-volume\") pod \"collect-profiles-29563785-ptknq\" (UID: \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.381995 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxkdc\" (UniqueName: \"kubernetes.io/projected/956ed194-df94-4b74-919f-9cdcfbdcf5a7-kube-api-access-hxkdc\") pod \"collect-profiles-29563785-ptknq\" (UID: \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.512733 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:00 crc kubenswrapper[4778]: I0318 09:45:00.950878 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq"] Mar 18 09:45:01 crc kubenswrapper[4778]: I0318 09:45:01.183328 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" event={"ID":"956ed194-df94-4b74-919f-9cdcfbdcf5a7","Type":"ContainerStarted","Data":"b6a6fd51a98d9937da03ae4682cc5b4ae715e8495f9ae8fc3459feb811d9d2fc"} Mar 18 09:45:01 crc kubenswrapper[4778]: I0318 09:45:01.183658 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" event={"ID":"956ed194-df94-4b74-919f-9cdcfbdcf5a7","Type":"ContainerStarted","Data":"50d5642833b7d2fab625e9ecbe8af9feca4613d9050cdea2f10325e0597cb421"} Mar 18 09:45:01 crc kubenswrapper[4778]: I0318 09:45:01.205375 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" 
podStartSLOduration=1.205356017 podStartE2EDuration="1.205356017s" podCreationTimestamp="2026-03-18 09:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:45:01.19775446 +0000 UTC m=+2567.772499310" watchObservedRunningTime="2026-03-18 09:45:01.205356017 +0000 UTC m=+2567.780100857" Mar 18 09:45:02 crc kubenswrapper[4778]: I0318 09:45:02.194395 4778 generic.go:334] "Generic (PLEG): container finished" podID="956ed194-df94-4b74-919f-9cdcfbdcf5a7" containerID="b6a6fd51a98d9937da03ae4682cc5b4ae715e8495f9ae8fc3459feb811d9d2fc" exitCode=0 Mar 18 09:45:02 crc kubenswrapper[4778]: I0318 09:45:02.197394 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" event={"ID":"956ed194-df94-4b74-919f-9cdcfbdcf5a7","Type":"ContainerDied","Data":"b6a6fd51a98d9937da03ae4682cc5b4ae715e8495f9ae8fc3459feb811d9d2fc"} Mar 18 09:45:03 crc kubenswrapper[4778]: I0318 09:45:03.586404 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:03 crc kubenswrapper[4778]: I0318 09:45:03.638994 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/956ed194-df94-4b74-919f-9cdcfbdcf5a7-config-volume\") pod \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\" (UID: \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\") " Mar 18 09:45:03 crc kubenswrapper[4778]: I0318 09:45:03.639240 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxkdc\" (UniqueName: \"kubernetes.io/projected/956ed194-df94-4b74-919f-9cdcfbdcf5a7-kube-api-access-hxkdc\") pod \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\" (UID: \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\") " Mar 18 09:45:03 crc kubenswrapper[4778]: I0318 09:45:03.639316 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/956ed194-df94-4b74-919f-9cdcfbdcf5a7-secret-volume\") pod \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\" (UID: \"956ed194-df94-4b74-919f-9cdcfbdcf5a7\") " Mar 18 09:45:03 crc kubenswrapper[4778]: I0318 09:45:03.640346 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/956ed194-df94-4b74-919f-9cdcfbdcf5a7-config-volume" (OuterVolumeSpecName: "config-volume") pod "956ed194-df94-4b74-919f-9cdcfbdcf5a7" (UID: "956ed194-df94-4b74-919f-9cdcfbdcf5a7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:45:03 crc kubenswrapper[4778]: I0318 09:45:03.646756 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/956ed194-df94-4b74-919f-9cdcfbdcf5a7-kube-api-access-hxkdc" (OuterVolumeSpecName: "kube-api-access-hxkdc") pod "956ed194-df94-4b74-919f-9cdcfbdcf5a7" (UID: "956ed194-df94-4b74-919f-9cdcfbdcf5a7"). 
InnerVolumeSpecName "kube-api-access-hxkdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:45:03 crc kubenswrapper[4778]: I0318 09:45:03.648767 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/956ed194-df94-4b74-919f-9cdcfbdcf5a7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "956ed194-df94-4b74-919f-9cdcfbdcf5a7" (UID: "956ed194-df94-4b74-919f-9cdcfbdcf5a7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:45:03 crc kubenswrapper[4778]: I0318 09:45:03.741738 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxkdc\" (UniqueName: \"kubernetes.io/projected/956ed194-df94-4b74-919f-9cdcfbdcf5a7-kube-api-access-hxkdc\") on node \"crc\" DevicePath \"\"" Mar 18 09:45:03 crc kubenswrapper[4778]: I0318 09:45:03.741778 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/956ed194-df94-4b74-919f-9cdcfbdcf5a7-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 09:45:03 crc kubenswrapper[4778]: I0318 09:45:03.741788 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/956ed194-df94-4b74-919f-9cdcfbdcf5a7-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 09:45:04 crc kubenswrapper[4778]: I0318 09:45:04.193825 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:45:04 crc kubenswrapper[4778]: E0318 09:45:04.194273 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" 
podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:45:04 crc kubenswrapper[4778]: I0318 09:45:04.210353 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" event={"ID":"956ed194-df94-4b74-919f-9cdcfbdcf5a7","Type":"ContainerDied","Data":"50d5642833b7d2fab625e9ecbe8af9feca4613d9050cdea2f10325e0597cb421"} Mar 18 09:45:04 crc kubenswrapper[4778]: I0318 09:45:04.210520 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50d5642833b7d2fab625e9ecbe8af9feca4613d9050cdea2f10325e0597cb421" Mar 18 09:45:04 crc kubenswrapper[4778]: I0318 09:45:04.210390 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq" Mar 18 09:45:04 crc kubenswrapper[4778]: I0318 09:45:04.268242 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl"] Mar 18 09:45:04 crc kubenswrapper[4778]: I0318 09:45:04.283266 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563740-wbvxl"] Mar 18 09:45:06 crc kubenswrapper[4778]: I0318 09:45:06.197615 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97ee6937-a1a5-42ea-a460-29d54478e633" path="/var/lib/kubelet/pods/97ee6937-a1a5-42ea-a460-29d54478e633/volumes" Mar 18 09:45:15 crc kubenswrapper[4778]: I0318 09:45:15.189076 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:45:15 crc kubenswrapper[4778]: E0318 09:45:15.190380 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:45:16 crc kubenswrapper[4778]: I0318 09:45:16.338402 4778 generic.go:334] "Generic (PLEG): container finished" podID="1f0f4177-ad12-4848-bbd7-39b004344cb3" containerID="c96da0fdc9f23d1c8174300e8944755e5546994203de0c9b38e19a45beb705b3" exitCode=0 Mar 18 09:45:16 crc kubenswrapper[4778]: I0318 09:45:16.338723 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" event={"ID":"1f0f4177-ad12-4848-bbd7-39b004344cb3","Type":"ContainerDied","Data":"c96da0fdc9f23d1c8174300e8944755e5546994203de0c9b38e19a45beb705b3"} Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.747441 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.824263 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ovn-combined-ca-bundle\") pod \"1f0f4177-ad12-4848-bbd7-39b004344cb3\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.824316 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ssh-key-openstack-edpm-ipam\") pod \"1f0f4177-ad12-4848-bbd7-39b004344cb3\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.824337 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc8df\" (UniqueName: 
\"kubernetes.io/projected/1f0f4177-ad12-4848-bbd7-39b004344cb3-kube-api-access-fc8df\") pod \"1f0f4177-ad12-4848-bbd7-39b004344cb3\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.824401 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-inventory\") pod \"1f0f4177-ad12-4848-bbd7-39b004344cb3\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.824447 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1f0f4177-ad12-4848-bbd7-39b004344cb3-ovncontroller-config-0\") pod \"1f0f4177-ad12-4848-bbd7-39b004344cb3\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.824466 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ceph\") pod \"1f0f4177-ad12-4848-bbd7-39b004344cb3\" (UID: \"1f0f4177-ad12-4848-bbd7-39b004344cb3\") " Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.832493 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "1f0f4177-ad12-4848-bbd7-39b004344cb3" (UID: "1f0f4177-ad12-4848-bbd7-39b004344cb3"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.832595 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f0f4177-ad12-4848-bbd7-39b004344cb3-kube-api-access-fc8df" (OuterVolumeSpecName: "kube-api-access-fc8df") pod "1f0f4177-ad12-4848-bbd7-39b004344cb3" (UID: "1f0f4177-ad12-4848-bbd7-39b004344cb3"). InnerVolumeSpecName "kube-api-access-fc8df". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.832729 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ceph" (OuterVolumeSpecName: "ceph") pod "1f0f4177-ad12-4848-bbd7-39b004344cb3" (UID: "1f0f4177-ad12-4848-bbd7-39b004344cb3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.854141 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f0f4177-ad12-4848-bbd7-39b004344cb3-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "1f0f4177-ad12-4848-bbd7-39b004344cb3" (UID: "1f0f4177-ad12-4848-bbd7-39b004344cb3"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.854291 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1f0f4177-ad12-4848-bbd7-39b004344cb3" (UID: "1f0f4177-ad12-4848-bbd7-39b004344cb3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.863416 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-inventory" (OuterVolumeSpecName: "inventory") pod "1f0f4177-ad12-4848-bbd7-39b004344cb3" (UID: "1f0f4177-ad12-4848-bbd7-39b004344cb3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.926613 4778 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.926663 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.926675 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc8df\" (UniqueName: \"kubernetes.io/projected/1f0f4177-ad12-4848-bbd7-39b004344cb3-kube-api-access-fc8df\") on node \"crc\" DevicePath \"\"" Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.926683 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.926692 4778 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1f0f4177-ad12-4848-bbd7-39b004344cb3-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:45:17 crc kubenswrapper[4778]: I0318 09:45:17.926703 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/1f0f4177-ad12-4848-bbd7-39b004344cb3-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.359516 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" event={"ID":"1f0f4177-ad12-4848-bbd7-39b004344cb3","Type":"ContainerDied","Data":"54b76851245c233c2784f282b1ca1eb9cfa025c851c32932417d057083ffca1c"} Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.359861 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54b76851245c233c2784f282b1ca1eb9cfa025c851c32932417d057083ffca1c" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.359657 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7jqhd" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.527160 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v"] Mar 18 09:45:18 crc kubenswrapper[4778]: E0318 09:45:18.527531 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f0f4177-ad12-4848-bbd7-39b004344cb3" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.527548 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f0f4177-ad12-4848-bbd7-39b004344cb3" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 18 09:45:18 crc kubenswrapper[4778]: E0318 09:45:18.527584 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956ed194-df94-4b74-919f-9cdcfbdcf5a7" containerName="collect-profiles" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.527590 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="956ed194-df94-4b74-919f-9cdcfbdcf5a7" containerName="collect-profiles" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.527740 4778 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="956ed194-df94-4b74-919f-9cdcfbdcf5a7" containerName="collect-profiles" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.527762 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f0f4177-ad12-4848-bbd7-39b004344cb3" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.528434 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.533448 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.533695 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.534347 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.534556 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.534714 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.534859 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.535019 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.538249 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v"] Mar 18 
09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.639797 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.639921 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.640000 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.640028 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc 
kubenswrapper[4778]: I0318 09:45:18.640167 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.640245 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7p82\" (UniqueName: \"kubernetes.io/projected/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-kube-api-access-n7p82\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.640304 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.742383 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.742444 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.742482 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.742505 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.742544 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.742578 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7p82\" (UniqueName: 
\"kubernetes.io/projected/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-kube-api-access-n7p82\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.742625 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.747756 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.748398 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.748405 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: 
\"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.748918 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.749315 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.750331 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.758342 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7p82\" (UniqueName: \"kubernetes.io/projected/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-kube-api-access-n7p82\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:18 crc kubenswrapper[4778]: I0318 09:45:18.845421 
4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:45:19 crc kubenswrapper[4778]: I0318 09:45:19.428280 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v"] Mar 18 09:45:20 crc kubenswrapper[4778]: I0318 09:45:20.381424 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" event={"ID":"52250b90-fbc6-418e-9a5f-4873d5fa5cd0","Type":"ContainerStarted","Data":"ac5f02de690c8a4d5294091531625f1900cc40e366d4cb6654150b4c7eb35d5d"} Mar 18 09:45:20 crc kubenswrapper[4778]: I0318 09:45:20.382171 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" event={"ID":"52250b90-fbc6-418e-9a5f-4873d5fa5cd0","Type":"ContainerStarted","Data":"4209540ec7b376466182558df0c5b4d7f8fd041732dba8b257e8fb43f4388585"} Mar 18 09:45:20 crc kubenswrapper[4778]: I0318 09:45:20.403643 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" podStartSLOduration=1.85240798 podStartE2EDuration="2.403610971s" podCreationTimestamp="2026-03-18 09:45:18 +0000 UTC" firstStartedPulling="2026-03-18 09:45:19.4258564 +0000 UTC m=+2586.000601250" lastFinishedPulling="2026-03-18 09:45:19.977059391 +0000 UTC m=+2586.551804241" observedRunningTime="2026-03-18 09:45:20.398902343 +0000 UTC m=+2586.973647243" watchObservedRunningTime="2026-03-18 09:45:20.403610971 +0000 UTC m=+2586.978355821" Mar 18 09:45:30 crc kubenswrapper[4778]: I0318 09:45:30.187370 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:45:30 crc kubenswrapper[4778]: E0318 09:45:30.188491 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:45:32 crc kubenswrapper[4778]: I0318 09:45:32.445543 4778 scope.go:117] "RemoveContainer" containerID="f1aaa8a2c1f96baaee4b7353f353a9b567635ea9eb73df19ffa50153f00a757d" Mar 18 09:45:41 crc kubenswrapper[4778]: I0318 09:45:41.187706 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:45:41 crc kubenswrapper[4778]: E0318 09:45:41.188469 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:45:54 crc kubenswrapper[4778]: I0318 09:45:54.192538 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:45:54 crc kubenswrapper[4778]: E0318 09:45:54.193294 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:46:00 crc kubenswrapper[4778]: I0318 09:46:00.162798 4778 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29563786-wpjmv"] Mar 18 09:46:00 crc kubenswrapper[4778]: I0318 09:46:00.164534 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563786-wpjmv" Mar 18 09:46:00 crc kubenswrapper[4778]: I0318 09:46:00.168054 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:46:00 crc kubenswrapper[4778]: I0318 09:46:00.168389 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:46:00 crc kubenswrapper[4778]: I0318 09:46:00.168467 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:46:00 crc kubenswrapper[4778]: I0318 09:46:00.176644 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563786-wpjmv"] Mar 18 09:46:00 crc kubenswrapper[4778]: I0318 09:46:00.304839 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v7zf\" (UniqueName: \"kubernetes.io/projected/e81f72c3-90fb-4526-97e3-977f3dbd00b0-kube-api-access-7v7zf\") pod \"auto-csr-approver-29563786-wpjmv\" (UID: \"e81f72c3-90fb-4526-97e3-977f3dbd00b0\") " pod="openshift-infra/auto-csr-approver-29563786-wpjmv" Mar 18 09:46:00 crc kubenswrapper[4778]: I0318 09:46:00.407948 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v7zf\" (UniqueName: \"kubernetes.io/projected/e81f72c3-90fb-4526-97e3-977f3dbd00b0-kube-api-access-7v7zf\") pod \"auto-csr-approver-29563786-wpjmv\" (UID: \"e81f72c3-90fb-4526-97e3-977f3dbd00b0\") " pod="openshift-infra/auto-csr-approver-29563786-wpjmv" Mar 18 09:46:00 crc kubenswrapper[4778]: I0318 09:46:00.437061 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v7zf\" (UniqueName: 
\"kubernetes.io/projected/e81f72c3-90fb-4526-97e3-977f3dbd00b0-kube-api-access-7v7zf\") pod \"auto-csr-approver-29563786-wpjmv\" (UID: \"e81f72c3-90fb-4526-97e3-977f3dbd00b0\") " pod="openshift-infra/auto-csr-approver-29563786-wpjmv" Mar 18 09:46:00 crc kubenswrapper[4778]: I0318 09:46:00.497326 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563786-wpjmv" Mar 18 09:46:01 crc kubenswrapper[4778]: I0318 09:46:01.033655 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563786-wpjmv"] Mar 18 09:46:01 crc kubenswrapper[4778]: I0318 09:46:01.772405 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563786-wpjmv" event={"ID":"e81f72c3-90fb-4526-97e3-977f3dbd00b0","Type":"ContainerStarted","Data":"ed3b9b80a3eaff39754a6a0e4347277ee13e77779ab27abb2a449815d81e2808"} Mar 18 09:46:02 crc kubenswrapper[4778]: I0318 09:46:02.783714 4778 generic.go:334] "Generic (PLEG): container finished" podID="e81f72c3-90fb-4526-97e3-977f3dbd00b0" containerID="f0336ddfd7a0dbb015d37a5f5151d0bd63e8c2d9a92eb6c0cfc48a0cb9420252" exitCode=0 Mar 18 09:46:02 crc kubenswrapper[4778]: I0318 09:46:02.783774 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563786-wpjmv" event={"ID":"e81f72c3-90fb-4526-97e3-977f3dbd00b0","Type":"ContainerDied","Data":"f0336ddfd7a0dbb015d37a5f5151d0bd63e8c2d9a92eb6c0cfc48a0cb9420252"} Mar 18 09:46:04 crc kubenswrapper[4778]: I0318 09:46:04.092929 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563786-wpjmv" Mar 18 09:46:04 crc kubenswrapper[4778]: I0318 09:46:04.278895 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v7zf\" (UniqueName: \"kubernetes.io/projected/e81f72c3-90fb-4526-97e3-977f3dbd00b0-kube-api-access-7v7zf\") pod \"e81f72c3-90fb-4526-97e3-977f3dbd00b0\" (UID: \"e81f72c3-90fb-4526-97e3-977f3dbd00b0\") " Mar 18 09:46:04 crc kubenswrapper[4778]: I0318 09:46:04.287062 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e81f72c3-90fb-4526-97e3-977f3dbd00b0-kube-api-access-7v7zf" (OuterVolumeSpecName: "kube-api-access-7v7zf") pod "e81f72c3-90fb-4526-97e3-977f3dbd00b0" (UID: "e81f72c3-90fb-4526-97e3-977f3dbd00b0"). InnerVolumeSpecName "kube-api-access-7v7zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:46:04 crc kubenswrapper[4778]: I0318 09:46:04.382212 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v7zf\" (UniqueName: \"kubernetes.io/projected/e81f72c3-90fb-4526-97e3-977f3dbd00b0-kube-api-access-7v7zf\") on node \"crc\" DevicePath \"\"" Mar 18 09:46:04 crc kubenswrapper[4778]: I0318 09:46:04.803604 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563786-wpjmv" event={"ID":"e81f72c3-90fb-4526-97e3-977f3dbd00b0","Type":"ContainerDied","Data":"ed3b9b80a3eaff39754a6a0e4347277ee13e77779ab27abb2a449815d81e2808"} Mar 18 09:46:04 crc kubenswrapper[4778]: I0318 09:46:04.804055 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed3b9b80a3eaff39754a6a0e4347277ee13e77779ab27abb2a449815d81e2808" Mar 18 09:46:04 crc kubenswrapper[4778]: I0318 09:46:04.804140 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563786-wpjmv" Mar 18 09:46:05 crc kubenswrapper[4778]: I0318 09:46:05.171740 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563780-vgggq"] Mar 18 09:46:05 crc kubenswrapper[4778]: I0318 09:46:05.181094 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563780-vgggq"] Mar 18 09:46:06 crc kubenswrapper[4778]: I0318 09:46:06.198130 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed393452-0d17-4c60-b37b-544b21c09da1" path="/var/lib/kubelet/pods/ed393452-0d17-4c60-b37b-544b21c09da1/volumes" Mar 18 09:46:08 crc kubenswrapper[4778]: I0318 09:46:08.187822 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:46:08 crc kubenswrapper[4778]: E0318 09:46:08.188259 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:46:14 crc kubenswrapper[4778]: I0318 09:46:14.893874 4778 generic.go:334] "Generic (PLEG): container finished" podID="52250b90-fbc6-418e-9a5f-4873d5fa5cd0" containerID="ac5f02de690c8a4d5294091531625f1900cc40e366d4cb6654150b4c7eb35d5d" exitCode=0 Mar 18 09:46:14 crc kubenswrapper[4778]: I0318 09:46:14.894094 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" event={"ID":"52250b90-fbc6-418e-9a5f-4873d5fa5cd0","Type":"ContainerDied","Data":"ac5f02de690c8a4d5294091531625f1900cc40e366d4cb6654150b4c7eb35d5d"} Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 
09:46:16.378459 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.520592 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-ssh-key-openstack-edpm-ipam\") pod \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.520983 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-nova-metadata-neutron-config-0\") pod \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.521099 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.521141 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-ceph\") pod \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.521248 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7p82\" (UniqueName: \"kubernetes.io/projected/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-kube-api-access-n7p82\") pod \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\" (UID: 
\"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.521315 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-inventory\") pod \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.521391 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-neutron-metadata-combined-ca-bundle\") pod \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\" (UID: \"52250b90-fbc6-418e-9a5f-4873d5fa5cd0\") " Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.526544 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "52250b90-fbc6-418e-9a5f-4873d5fa5cd0" (UID: "52250b90-fbc6-418e-9a5f-4873d5fa5cd0"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.526689 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-kube-api-access-n7p82" (OuterVolumeSpecName: "kube-api-access-n7p82") pod "52250b90-fbc6-418e-9a5f-4873d5fa5cd0" (UID: "52250b90-fbc6-418e-9a5f-4873d5fa5cd0"). InnerVolumeSpecName "kube-api-access-n7p82". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.527991 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-ceph" (OuterVolumeSpecName: "ceph") pod "52250b90-fbc6-418e-9a5f-4873d5fa5cd0" (UID: "52250b90-fbc6-418e-9a5f-4873d5fa5cd0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.554457 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-inventory" (OuterVolumeSpecName: "inventory") pod "52250b90-fbc6-418e-9a5f-4873d5fa5cd0" (UID: "52250b90-fbc6-418e-9a5f-4873d5fa5cd0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.556045 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "52250b90-fbc6-418e-9a5f-4873d5fa5cd0" (UID: "52250b90-fbc6-418e-9a5f-4873d5fa5cd0"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.560636 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "52250b90-fbc6-418e-9a5f-4873d5fa5cd0" (UID: "52250b90-fbc6-418e-9a5f-4873d5fa5cd0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.564908 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "52250b90-fbc6-418e-9a5f-4873d5fa5cd0" (UID: "52250b90-fbc6-418e-9a5f-4873d5fa5cd0"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.624559 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7p82\" (UniqueName: \"kubernetes.io/projected/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-kube-api-access-n7p82\") on node \"crc\" DevicePath \"\"" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.624813 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.624966 4778 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.625095 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.625461 4778 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:46:16 crc 
kubenswrapper[4778]: I0318 09:46:16.625585 4778 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.625751 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/52250b90-fbc6-418e-9a5f-4873d5fa5cd0-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.922055 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.922321 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v" event={"ID":"52250b90-fbc6-418e-9a5f-4873d5fa5cd0","Type":"ContainerDied","Data":"4209540ec7b376466182558df0c5b4d7f8fd041732dba8b257e8fb43f4388585"} Mar 18 09:46:16 crc kubenswrapper[4778]: I0318 09:46:16.922359 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4209540ec7b376466182558df0c5b4d7f8fd041732dba8b257e8fb43f4388585" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.036751 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr"] Mar 18 09:46:17 crc kubenswrapper[4778]: E0318 09:46:17.037109 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81f72c3-90fb-4526-97e3-977f3dbd00b0" containerName="oc" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.037125 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81f72c3-90fb-4526-97e3-977f3dbd00b0" containerName="oc" Mar 18 09:46:17 crc kubenswrapper[4778]: E0318 09:46:17.037140 4778 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="52250b90-fbc6-418e-9a5f-4873d5fa5cd0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.037147 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="52250b90-fbc6-418e-9a5f-4873d5fa5cd0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.037311 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e81f72c3-90fb-4526-97e3-977f3dbd00b0" containerName="oc" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.037338 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="52250b90-fbc6-418e-9a5f-4873d5fa5cd0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.037860 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.040354 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.040502 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.040569 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.040766 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.040768 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.041680 4778 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"openstack-aee-default-env" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.059276 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr"] Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.235382 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqtc4\" (UniqueName: \"kubernetes.io/projected/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-kube-api-access-nqtc4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.235444 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.235514 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.235553 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.235580 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.235602 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.336996 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.337051 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.337356 4778 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-nqtc4\" (UniqueName: \"kubernetes.io/projected/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-kube-api-access-nqtc4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.337415 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.337527 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.337613 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.340797 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.341090 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.342125 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.346744 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.347438 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.362544 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqtc4\" (UniqueName: 
\"kubernetes.io/projected/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-kube-api-access-nqtc4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:17 crc kubenswrapper[4778]: I0318 09:46:17.659840 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:46:18 crc kubenswrapper[4778]: I0318 09:46:18.214056 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr"] Mar 18 09:46:18 crc kubenswrapper[4778]: I0318 09:46:18.940090 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" event={"ID":"d50b5540-c2ca-4889-bbb0-3b5d04bc602f","Type":"ContainerStarted","Data":"2ab6059febfad7a9f9307837f23ed27bc599c3e1e1aefffb4f2d067fd81fc840"} Mar 18 09:46:19 crc kubenswrapper[4778]: I0318 09:46:19.948560 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" event={"ID":"d50b5540-c2ca-4889-bbb0-3b5d04bc602f","Type":"ContainerStarted","Data":"3d07267fc8bce82aa6c1c143fb1b7b931cfc127a5bd399a0e485c50a9cb33804"} Mar 18 09:46:19 crc kubenswrapper[4778]: I0318 09:46:19.965456 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" podStartSLOduration=2.3635334820000002 podStartE2EDuration="2.965435961s" podCreationTimestamp="2026-03-18 09:46:17 +0000 UTC" firstStartedPulling="2026-03-18 09:46:18.216839225 +0000 UTC m=+2644.791584065" lastFinishedPulling="2026-03-18 09:46:18.818741674 +0000 UTC m=+2645.393486544" observedRunningTime="2026-03-18 09:46:19.963652862 +0000 UTC m=+2646.538397722" watchObservedRunningTime="2026-03-18 09:46:19.965435961 +0000 UTC m=+2646.540180811" Mar 18 09:46:20 crc 
kubenswrapper[4778]: I0318 09:46:20.187633 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:46:20 crc kubenswrapper[4778]: E0318 09:46:20.187901 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:46:32 crc kubenswrapper[4778]: I0318 09:46:32.559869 4778 scope.go:117] "RemoveContainer" containerID="9f45f4032f3621f6cd43ea95d13369122ace0eb37b6189c6643a14332da3a74a" Mar 18 09:46:35 crc kubenswrapper[4778]: I0318 09:46:35.188322 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:46:36 crc kubenswrapper[4778]: I0318 09:46:36.140897 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"5df8914ca5e61db32f1727722fb361804c23982bfeee84574751dabf5cd01a2d"} Mar 18 09:48:00 crc kubenswrapper[4778]: I0318 09:48:00.149064 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563788-pctk8"] Mar 18 09:48:00 crc kubenswrapper[4778]: I0318 09:48:00.150806 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563788-pctk8" Mar 18 09:48:00 crc kubenswrapper[4778]: I0318 09:48:00.153058 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:48:00 crc kubenswrapper[4778]: I0318 09:48:00.153478 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:48:00 crc kubenswrapper[4778]: I0318 09:48:00.153769 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:48:00 crc kubenswrapper[4778]: I0318 09:48:00.168355 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563788-pctk8"] Mar 18 09:48:00 crc kubenswrapper[4778]: I0318 09:48:00.196418 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xglt8\" (UniqueName: \"kubernetes.io/projected/dc64d6e3-ed19-4365-ab83-8c1af026054b-kube-api-access-xglt8\") pod \"auto-csr-approver-29563788-pctk8\" (UID: \"dc64d6e3-ed19-4365-ab83-8c1af026054b\") " pod="openshift-infra/auto-csr-approver-29563788-pctk8" Mar 18 09:48:00 crc kubenswrapper[4778]: I0318 09:48:00.299325 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xglt8\" (UniqueName: \"kubernetes.io/projected/dc64d6e3-ed19-4365-ab83-8c1af026054b-kube-api-access-xglt8\") pod \"auto-csr-approver-29563788-pctk8\" (UID: \"dc64d6e3-ed19-4365-ab83-8c1af026054b\") " pod="openshift-infra/auto-csr-approver-29563788-pctk8" Mar 18 09:48:00 crc kubenswrapper[4778]: I0318 09:48:00.324212 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xglt8\" (UniqueName: \"kubernetes.io/projected/dc64d6e3-ed19-4365-ab83-8c1af026054b-kube-api-access-xglt8\") pod \"auto-csr-approver-29563788-pctk8\" (UID: \"dc64d6e3-ed19-4365-ab83-8c1af026054b\") " 
pod="openshift-infra/auto-csr-approver-29563788-pctk8" Mar 18 09:48:00 crc kubenswrapper[4778]: I0318 09:48:00.470772 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563788-pctk8" Mar 18 09:48:00 crc kubenswrapper[4778]: I0318 09:48:00.913656 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563788-pctk8"] Mar 18 09:48:01 crc kubenswrapper[4778]: I0318 09:48:01.033616 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563788-pctk8" event={"ID":"dc64d6e3-ed19-4365-ab83-8c1af026054b","Type":"ContainerStarted","Data":"e7b1e361c7d1374a9d9a7e89faa1a51a3b4b938d4325f0bf8d9fa2f63d5656a6"} Mar 18 09:48:03 crc kubenswrapper[4778]: I0318 09:48:03.061744 4778 generic.go:334] "Generic (PLEG): container finished" podID="dc64d6e3-ed19-4365-ab83-8c1af026054b" containerID="12a11cfc29f7b65306c1684f9c90110c5f5f19bee2195c78cf1dbf6c7f4120dd" exitCode=0 Mar 18 09:48:03 crc kubenswrapper[4778]: I0318 09:48:03.061855 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563788-pctk8" event={"ID":"dc64d6e3-ed19-4365-ab83-8c1af026054b","Type":"ContainerDied","Data":"12a11cfc29f7b65306c1684f9c90110c5f5f19bee2195c78cf1dbf6c7f4120dd"} Mar 18 09:48:04 crc kubenswrapper[4778]: I0318 09:48:04.395313 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563788-pctk8" Mar 18 09:48:04 crc kubenswrapper[4778]: I0318 09:48:04.491733 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xglt8\" (UniqueName: \"kubernetes.io/projected/dc64d6e3-ed19-4365-ab83-8c1af026054b-kube-api-access-xglt8\") pod \"dc64d6e3-ed19-4365-ab83-8c1af026054b\" (UID: \"dc64d6e3-ed19-4365-ab83-8c1af026054b\") " Mar 18 09:48:04 crc kubenswrapper[4778]: I0318 09:48:04.501345 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc64d6e3-ed19-4365-ab83-8c1af026054b-kube-api-access-xglt8" (OuterVolumeSpecName: "kube-api-access-xglt8") pod "dc64d6e3-ed19-4365-ab83-8c1af026054b" (UID: "dc64d6e3-ed19-4365-ab83-8c1af026054b"). InnerVolumeSpecName "kube-api-access-xglt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:48:04 crc kubenswrapper[4778]: I0318 09:48:04.595782 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xglt8\" (UniqueName: \"kubernetes.io/projected/dc64d6e3-ed19-4365-ab83-8c1af026054b-kube-api-access-xglt8\") on node \"crc\" DevicePath \"\"" Mar 18 09:48:05 crc kubenswrapper[4778]: I0318 09:48:05.086638 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563788-pctk8" event={"ID":"dc64d6e3-ed19-4365-ab83-8c1af026054b","Type":"ContainerDied","Data":"e7b1e361c7d1374a9d9a7e89faa1a51a3b4b938d4325f0bf8d9fa2f63d5656a6"} Mar 18 09:48:05 crc kubenswrapper[4778]: I0318 09:48:05.086705 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7b1e361c7d1374a9d9a7e89faa1a51a3b4b938d4325f0bf8d9fa2f63d5656a6" Mar 18 09:48:05 crc kubenswrapper[4778]: I0318 09:48:05.086787 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563788-pctk8" Mar 18 09:48:05 crc kubenswrapper[4778]: I0318 09:48:05.486722 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563782-zn5kg"] Mar 18 09:48:05 crc kubenswrapper[4778]: I0318 09:48:05.493884 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563782-zn5kg"] Mar 18 09:48:06 crc kubenswrapper[4778]: I0318 09:48:06.209720 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3895116-2d67-4e3c-9f3e-e04d3cfe0518" path="/var/lib/kubelet/pods/d3895116-2d67-4e3c-9f3e-e04d3cfe0518/volumes" Mar 18 09:48:32 crc kubenswrapper[4778]: I0318 09:48:32.646717 4778 scope.go:117] "RemoveContainer" containerID="c1ff920321931ae33985b21e2caf3b4db031e28a61271d5d1f2c36e681d955e6" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.089564 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jztl6"] Mar 18 09:48:55 crc kubenswrapper[4778]: E0318 09:48:55.091395 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc64d6e3-ed19-4365-ab83-8c1af026054b" containerName="oc" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.091434 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc64d6e3-ed19-4365-ab83-8c1af026054b" containerName="oc" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.091970 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc64d6e3-ed19-4365-ab83-8c1af026054b" containerName="oc" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.095111 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.117324 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jztl6"] Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.199274 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65mrl\" (UniqueName: \"kubernetes.io/projected/0c0dfa2e-b334-4eed-9e2f-3097f2b5102a-kube-api-access-65mrl\") pod \"community-operators-jztl6\" (UID: \"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a\") " pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.199318 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c0dfa2e-b334-4eed-9e2f-3097f2b5102a-utilities\") pod \"community-operators-jztl6\" (UID: \"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a\") " pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.199446 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c0dfa2e-b334-4eed-9e2f-3097f2b5102a-catalog-content\") pod \"community-operators-jztl6\" (UID: \"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a\") " pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.301159 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c0dfa2e-b334-4eed-9e2f-3097f2b5102a-catalog-content\") pod \"community-operators-jztl6\" (UID: \"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a\") " pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.301362 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-65mrl\" (UniqueName: \"kubernetes.io/projected/0c0dfa2e-b334-4eed-9e2f-3097f2b5102a-kube-api-access-65mrl\") pod \"community-operators-jztl6\" (UID: \"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a\") " pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.301408 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c0dfa2e-b334-4eed-9e2f-3097f2b5102a-utilities\") pod \"community-operators-jztl6\" (UID: \"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a\") " pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.302128 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c0dfa2e-b334-4eed-9e2f-3097f2b5102a-utilities\") pod \"community-operators-jztl6\" (UID: \"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a\") " pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.302314 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c0dfa2e-b334-4eed-9e2f-3097f2b5102a-catalog-content\") pod \"community-operators-jztl6\" (UID: \"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a\") " pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.321181 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65mrl\" (UniqueName: \"kubernetes.io/projected/0c0dfa2e-b334-4eed-9e2f-3097f2b5102a-kube-api-access-65mrl\") pod \"community-operators-jztl6\" (UID: \"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a\") " pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.420177 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:48:55 crc kubenswrapper[4778]: I0318 09:48:55.973311 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jztl6"] Mar 18 09:48:56 crc kubenswrapper[4778]: I0318 09:48:56.576602 4778 generic.go:334] "Generic (PLEG): container finished" podID="0c0dfa2e-b334-4eed-9e2f-3097f2b5102a" containerID="406d268147ccbd6200b43ebee4807fc73359494cf62045cc84dfad45af9131fc" exitCode=0 Mar 18 09:48:56 crc kubenswrapper[4778]: I0318 09:48:56.576692 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jztl6" event={"ID":"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a","Type":"ContainerDied","Data":"406d268147ccbd6200b43ebee4807fc73359494cf62045cc84dfad45af9131fc"} Mar 18 09:48:56 crc kubenswrapper[4778]: I0318 09:48:56.578297 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jztl6" event={"ID":"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a","Type":"ContainerStarted","Data":"635a623f677257671dc346299ad868d6d659f6256d11fb27418a60853c67216b"} Mar 18 09:48:56 crc kubenswrapper[4778]: I0318 09:48:56.579917 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 09:49:00 crc kubenswrapper[4778]: I0318 09:49:00.147284 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:49:00 crc kubenswrapper[4778]: I0318 09:49:00.147913 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:49:00 crc kubenswrapper[4778]: I0318 09:49:00.625530 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jztl6" event={"ID":"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a","Type":"ContainerStarted","Data":"f540e4e75bb9d7a85bc9cc86ce188873a4bcab0bee0b8f301e74cc3af623bb8b"} Mar 18 09:49:01 crc kubenswrapper[4778]: I0318 09:49:01.637923 4778 generic.go:334] "Generic (PLEG): container finished" podID="0c0dfa2e-b334-4eed-9e2f-3097f2b5102a" containerID="f540e4e75bb9d7a85bc9cc86ce188873a4bcab0bee0b8f301e74cc3af623bb8b" exitCode=0 Mar 18 09:49:01 crc kubenswrapper[4778]: I0318 09:49:01.637992 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jztl6" event={"ID":"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a","Type":"ContainerDied","Data":"f540e4e75bb9d7a85bc9cc86ce188873a4bcab0bee0b8f301e74cc3af623bb8b"} Mar 18 09:49:02 crc kubenswrapper[4778]: I0318 09:49:02.650189 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jztl6" event={"ID":"0c0dfa2e-b334-4eed-9e2f-3097f2b5102a","Type":"ContainerStarted","Data":"3fa2561048c07e212e89fb360838163147894a07830056afc658fa4ddede2620"} Mar 18 09:49:02 crc kubenswrapper[4778]: I0318 09:49:02.669181 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jztl6" podStartSLOduration=2.111380907 podStartE2EDuration="7.669161489s" podCreationTimestamp="2026-03-18 09:48:55 +0000 UTC" firstStartedPulling="2026-03-18 09:48:56.579542559 +0000 UTC m=+2803.154287429" lastFinishedPulling="2026-03-18 09:49:02.137323171 +0000 UTC m=+2808.712068011" observedRunningTime="2026-03-18 09:49:02.667916625 +0000 UTC m=+2809.242661535" watchObservedRunningTime="2026-03-18 09:49:02.669161489 +0000 UTC m=+2809.243906329" Mar 18 09:49:05 crc kubenswrapper[4778]: I0318 
09:49:05.422386 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:49:05 crc kubenswrapper[4778]: I0318 09:49:05.422838 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:49:05 crc kubenswrapper[4778]: I0318 09:49:05.465998 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:49:15 crc kubenswrapper[4778]: I0318 09:49:15.469448 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jztl6" Mar 18 09:49:15 crc kubenswrapper[4778]: I0318 09:49:15.547777 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jztl6"] Mar 18 09:49:15 crc kubenswrapper[4778]: I0318 09:49:15.619806 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ktcxn"] Mar 18 09:49:15 crc kubenswrapper[4778]: I0318 09:49:15.620117 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ktcxn" podUID="fee87709-f8ed-4eb4-829e-1fdb6534bb35" containerName="registry-server" containerID="cri-o://0d5497da92a3a6b067e66da6b34d1f0c05c8fc0ce853a92ab83f4966e6f9359e" gracePeriod=2 Mar 18 09:49:15 crc kubenswrapper[4778]: I0318 09:49:15.781658 4778 generic.go:334] "Generic (PLEG): container finished" podID="fee87709-f8ed-4eb4-829e-1fdb6534bb35" containerID="0d5497da92a3a6b067e66da6b34d1f0c05c8fc0ce853a92ab83f4966e6f9359e" exitCode=0 Mar 18 09:49:15 crc kubenswrapper[4778]: I0318 09:49:15.781751 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktcxn" 
event={"ID":"fee87709-f8ed-4eb4-829e-1fdb6534bb35","Type":"ContainerDied","Data":"0d5497da92a3a6b067e66da6b34d1f0c05c8fc0ce853a92ab83f4966e6f9359e"} Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.101422 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.191862 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fee87709-f8ed-4eb4-829e-1fdb6534bb35-utilities\") pod \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\" (UID: \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\") " Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.192012 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s9q2\" (UniqueName: \"kubernetes.io/projected/fee87709-f8ed-4eb4-829e-1fdb6534bb35-kube-api-access-5s9q2\") pod \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\" (UID: \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\") " Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.192069 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fee87709-f8ed-4eb4-829e-1fdb6534bb35-catalog-content\") pod \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\" (UID: \"fee87709-f8ed-4eb4-829e-1fdb6534bb35\") " Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.192541 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fee87709-f8ed-4eb4-829e-1fdb6534bb35-utilities" (OuterVolumeSpecName: "utilities") pod "fee87709-f8ed-4eb4-829e-1fdb6534bb35" (UID: "fee87709-f8ed-4eb4-829e-1fdb6534bb35"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.199372 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fee87709-f8ed-4eb4-829e-1fdb6534bb35-kube-api-access-5s9q2" (OuterVolumeSpecName: "kube-api-access-5s9q2") pod "fee87709-f8ed-4eb4-829e-1fdb6534bb35" (UID: "fee87709-f8ed-4eb4-829e-1fdb6534bb35"). InnerVolumeSpecName "kube-api-access-5s9q2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.245854 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fee87709-f8ed-4eb4-829e-1fdb6534bb35-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fee87709-f8ed-4eb4-829e-1fdb6534bb35" (UID: "fee87709-f8ed-4eb4-829e-1fdb6534bb35"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.294510 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s9q2\" (UniqueName: \"kubernetes.io/projected/fee87709-f8ed-4eb4-829e-1fdb6534bb35-kube-api-access-5s9q2\") on node \"crc\" DevicePath \"\"" Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.294539 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fee87709-f8ed-4eb4-829e-1fdb6534bb35-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.294924 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fee87709-f8ed-4eb4-829e-1fdb6534bb35-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.791208 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktcxn" 
event={"ID":"fee87709-f8ed-4eb4-829e-1fdb6534bb35","Type":"ContainerDied","Data":"1a86626e5d4576d55c9f62a59074b0761782b73c69b053e7829507e68652f471"} Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.791267 4778 scope.go:117] "RemoveContainer" containerID="0d5497da92a3a6b067e66da6b34d1f0c05c8fc0ce853a92ab83f4966e6f9359e" Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.791359 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ktcxn" Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.821768 4778 scope.go:117] "RemoveContainer" containerID="d37142aca8df005734457524dffa32c4483716edffbcfb2d1b92b3701d6e7e1c" Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.825313 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ktcxn"] Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.833475 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ktcxn"] Mar 18 09:49:16 crc kubenswrapper[4778]: I0318 09:49:16.855327 4778 scope.go:117] "RemoveContainer" containerID="6ecbe80389c09da7c5dfaf24f572df1adb64cba289f74a3e8339845f8cebe749" Mar 18 09:49:18 crc kubenswrapper[4778]: I0318 09:49:18.195968 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fee87709-f8ed-4eb4-829e-1fdb6534bb35" path="/var/lib/kubelet/pods/fee87709-f8ed-4eb4-829e-1fdb6534bb35/volumes" Mar 18 09:49:30 crc kubenswrapper[4778]: I0318 09:49:30.147294 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:49:30 crc kubenswrapper[4778]: I0318 09:49:30.147911 4778 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.148306 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.149153 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.149548 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.150409 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5df8914ca5e61db32f1727722fb361804c23982bfeee84574751dabf5cd01a2d"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.150486 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" 
containerID="cri-o://5df8914ca5e61db32f1727722fb361804c23982bfeee84574751dabf5cd01a2d" gracePeriod=600 Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.159816 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563790-nsf4n"] Mar 18 09:50:00 crc kubenswrapper[4778]: E0318 09:50:00.160385 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee87709-f8ed-4eb4-829e-1fdb6534bb35" containerName="extract-utilities" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.160454 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee87709-f8ed-4eb4-829e-1fdb6534bb35" containerName="extract-utilities" Mar 18 09:50:00 crc kubenswrapper[4778]: E0318 09:50:00.160492 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee87709-f8ed-4eb4-829e-1fdb6534bb35" containerName="extract-content" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.160504 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee87709-f8ed-4eb4-829e-1fdb6534bb35" containerName="extract-content" Mar 18 09:50:00 crc kubenswrapper[4778]: E0318 09:50:00.160544 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee87709-f8ed-4eb4-829e-1fdb6534bb35" containerName="registry-server" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.160556 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee87709-f8ed-4eb4-829e-1fdb6534bb35" containerName="registry-server" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.160847 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee87709-f8ed-4eb4-829e-1fdb6534bb35" containerName="registry-server" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.161687 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563790-nsf4n" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.164024 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.164402 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.164548 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.178616 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563790-nsf4n"] Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.341383 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp68l\" (UniqueName: \"kubernetes.io/projected/08fbf495-18e2-4d61-ad96-1bf74db07f0e-kube-api-access-wp68l\") pod \"auto-csr-approver-29563790-nsf4n\" (UID: \"08fbf495-18e2-4d61-ad96-1bf74db07f0e\") " pod="openshift-infra/auto-csr-approver-29563790-nsf4n" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.444396 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp68l\" (UniqueName: \"kubernetes.io/projected/08fbf495-18e2-4d61-ad96-1bf74db07f0e-kube-api-access-wp68l\") pod \"auto-csr-approver-29563790-nsf4n\" (UID: \"08fbf495-18e2-4d61-ad96-1bf74db07f0e\") " pod="openshift-infra/auto-csr-approver-29563790-nsf4n" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.468834 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp68l\" (UniqueName: \"kubernetes.io/projected/08fbf495-18e2-4d61-ad96-1bf74db07f0e-kube-api-access-wp68l\") pod \"auto-csr-approver-29563790-nsf4n\" (UID: \"08fbf495-18e2-4d61-ad96-1bf74db07f0e\") " 
pod="openshift-infra/auto-csr-approver-29563790-nsf4n" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.518853 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563790-nsf4n" Mar 18 09:50:00 crc kubenswrapper[4778]: I0318 09:50:00.982175 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563790-nsf4n"] Mar 18 09:50:00 crc kubenswrapper[4778]: W0318 09:50:00.984782 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08fbf495_18e2_4d61_ad96_1bf74db07f0e.slice/crio-fdd2c59cae4643aef1ce354ee05143c0cd4df887daa07d1b649bf9758d75f050 WatchSource:0}: Error finding container fdd2c59cae4643aef1ce354ee05143c0cd4df887daa07d1b649bf9758d75f050: Status 404 returned error can't find the container with id fdd2c59cae4643aef1ce354ee05143c0cd4df887daa07d1b649bf9758d75f050 Mar 18 09:50:01 crc kubenswrapper[4778]: I0318 09:50:01.200217 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563790-nsf4n" event={"ID":"08fbf495-18e2-4d61-ad96-1bf74db07f0e","Type":"ContainerStarted","Data":"fdd2c59cae4643aef1ce354ee05143c0cd4df887daa07d1b649bf9758d75f050"} Mar 18 09:50:01 crc kubenswrapper[4778]: I0318 09:50:01.203376 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="5df8914ca5e61db32f1727722fb361804c23982bfeee84574751dabf5cd01a2d" exitCode=0 Mar 18 09:50:01 crc kubenswrapper[4778]: I0318 09:50:01.203392 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"5df8914ca5e61db32f1727722fb361804c23982bfeee84574751dabf5cd01a2d"} Mar 18 09:50:01 crc kubenswrapper[4778]: I0318 09:50:01.203421 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3"} Mar 18 09:50:01 crc kubenswrapper[4778]: I0318 09:50:01.203439 4778 scope.go:117] "RemoveContainer" containerID="0c3d824602f9395f26e4f8f313361382b28d1f6f1f3fe59450c06bf61d6095e9" Mar 18 09:50:03 crc kubenswrapper[4778]: I0318 09:50:03.239616 4778 generic.go:334] "Generic (PLEG): container finished" podID="08fbf495-18e2-4d61-ad96-1bf74db07f0e" containerID="58ef47c1a33dc103d35c1381547dc4531f738d5df6648d3b82a9b2e034b9599e" exitCode=0 Mar 18 09:50:03 crc kubenswrapper[4778]: I0318 09:50:03.239718 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563790-nsf4n" event={"ID":"08fbf495-18e2-4d61-ad96-1bf74db07f0e","Type":"ContainerDied","Data":"58ef47c1a33dc103d35c1381547dc4531f738d5df6648d3b82a9b2e034b9599e"} Mar 18 09:50:04 crc kubenswrapper[4778]: I0318 09:50:04.633371 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563790-nsf4n" Mar 18 09:50:04 crc kubenswrapper[4778]: I0318 09:50:04.827977 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp68l\" (UniqueName: \"kubernetes.io/projected/08fbf495-18e2-4d61-ad96-1bf74db07f0e-kube-api-access-wp68l\") pod \"08fbf495-18e2-4d61-ad96-1bf74db07f0e\" (UID: \"08fbf495-18e2-4d61-ad96-1bf74db07f0e\") " Mar 18 09:50:04 crc kubenswrapper[4778]: I0318 09:50:04.833797 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08fbf495-18e2-4d61-ad96-1bf74db07f0e-kube-api-access-wp68l" (OuterVolumeSpecName: "kube-api-access-wp68l") pod "08fbf495-18e2-4d61-ad96-1bf74db07f0e" (UID: "08fbf495-18e2-4d61-ad96-1bf74db07f0e"). InnerVolumeSpecName "kube-api-access-wp68l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:50:04 crc kubenswrapper[4778]: I0318 09:50:04.931498 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp68l\" (UniqueName: \"kubernetes.io/projected/08fbf495-18e2-4d61-ad96-1bf74db07f0e-kube-api-access-wp68l\") on node \"crc\" DevicePath \"\"" Mar 18 09:50:05 crc kubenswrapper[4778]: I0318 09:50:05.257304 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563790-nsf4n" event={"ID":"08fbf495-18e2-4d61-ad96-1bf74db07f0e","Type":"ContainerDied","Data":"fdd2c59cae4643aef1ce354ee05143c0cd4df887daa07d1b649bf9758d75f050"} Mar 18 09:50:05 crc kubenswrapper[4778]: I0318 09:50:05.257820 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdd2c59cae4643aef1ce354ee05143c0cd4df887daa07d1b649bf9758d75f050" Mar 18 09:50:05 crc kubenswrapper[4778]: I0318 09:50:05.257355 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563790-nsf4n" Mar 18 09:50:05 crc kubenswrapper[4778]: I0318 09:50:05.710696 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563784-pdds9"] Mar 18 09:50:05 crc kubenswrapper[4778]: I0318 09:50:05.717500 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563784-pdds9"] Mar 18 09:50:06 crc kubenswrapper[4778]: I0318 09:50:06.196873 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab4f60ce-be48-4052-9fa7-905b70e65c3a" path="/var/lib/kubelet/pods/ab4f60ce-be48-4052-9fa7-905b70e65c3a/volumes" Mar 18 09:50:20 crc kubenswrapper[4778]: I0318 09:50:20.375474 4778 generic.go:334] "Generic (PLEG): container finished" podID="d50b5540-c2ca-4889-bbb0-3b5d04bc602f" containerID="3d07267fc8bce82aa6c1c143fb1b7b931cfc127a5bd399a0e485c50a9cb33804" exitCode=0 Mar 18 09:50:20 crc kubenswrapper[4778]: I0318 09:50:20.375554 4778 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" event={"ID":"d50b5540-c2ca-4889-bbb0-3b5d04bc602f","Type":"ContainerDied","Data":"3d07267fc8bce82aa6c1c143fb1b7b931cfc127a5bd399a0e485c50a9cb33804"} Mar 18 09:50:21 crc kubenswrapper[4778]: I0318 09:50:21.925022 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.033421 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-ssh-key-openstack-edpm-ipam\") pod \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.033618 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-libvirt-combined-ca-bundle\") pod \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.033676 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqtc4\" (UniqueName: \"kubernetes.io/projected/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-kube-api-access-nqtc4\") pod \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.033699 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-inventory\") pod \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.033745 
4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-libvirt-secret-0\") pod \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.033777 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-ceph\") pod \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\" (UID: \"d50b5540-c2ca-4889-bbb0-3b5d04bc602f\") " Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.039759 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "d50b5540-c2ca-4889-bbb0-3b5d04bc602f" (UID: "d50b5540-c2ca-4889-bbb0-3b5d04bc602f"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.040500 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-ceph" (OuterVolumeSpecName: "ceph") pod "d50b5540-c2ca-4889-bbb0-3b5d04bc602f" (UID: "d50b5540-c2ca-4889-bbb0-3b5d04bc602f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.055990 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-kube-api-access-nqtc4" (OuterVolumeSpecName: "kube-api-access-nqtc4") pod "d50b5540-c2ca-4889-bbb0-3b5d04bc602f" (UID: "d50b5540-c2ca-4889-bbb0-3b5d04bc602f"). InnerVolumeSpecName "kube-api-access-nqtc4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.064175 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "d50b5540-c2ca-4889-bbb0-3b5d04bc602f" (UID: "d50b5540-c2ca-4889-bbb0-3b5d04bc602f"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.064467 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d50b5540-c2ca-4889-bbb0-3b5d04bc602f" (UID: "d50b5540-c2ca-4889-bbb0-3b5d04bc602f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.069784 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-inventory" (OuterVolumeSpecName: "inventory") pod "d50b5540-c2ca-4889-bbb0-3b5d04bc602f" (UID: "d50b5540-c2ca-4889-bbb0-3b5d04bc602f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.135595 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.135775 4778 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.135857 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqtc4\" (UniqueName: \"kubernetes.io/projected/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-kube-api-access-nqtc4\") on node \"crc\" DevicePath \"\"" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.135966 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.136071 4778 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.136172 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d50b5540-c2ca-4889-bbb0-3b5d04bc602f-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.405388 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" 
event={"ID":"d50b5540-c2ca-4889-bbb0-3b5d04bc602f","Type":"ContainerDied","Data":"2ab6059febfad7a9f9307837f23ed27bc599c3e1e1aefffb4f2d067fd81fc840"} Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.405732 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.405765 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ab6059febfad7a9f9307837f23ed27bc599c3e1e1aefffb4f2d067fd81fc840" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.518794 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"] Mar 18 09:50:22 crc kubenswrapper[4778]: E0318 09:50:22.519235 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d50b5540-c2ca-4889-bbb0-3b5d04bc602f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.519257 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d50b5540-c2ca-4889-bbb0-3b5d04bc602f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 09:50:22 crc kubenswrapper[4778]: E0318 09:50:22.519293 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08fbf495-18e2-4d61-ad96-1bf74db07f0e" containerName="oc" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.519302 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="08fbf495-18e2-4d61-ad96-1bf74db07f0e" containerName="oc" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.519503 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="08fbf495-18e2-4d61-ad96-1bf74db07f0e" containerName="oc" Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.519530 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d50b5540-c2ca-4889-bbb0-3b5d04bc602f" 
containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.520233 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.535038 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.535132 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.535159 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.535050 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.535177 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.535376 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.535448 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.535658 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vb8n"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.535673 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.536833 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"]
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.543482 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.543721 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6zxg\" (UniqueName: \"kubernetes.io/projected/b2db5491-57b4-427a-b306-5e525a1e7c27-kube-api-access-z6zxg\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.543822 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.543909 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/b2db5491-57b4-427a-b306-5e525a1e7c27-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.543975 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.544051 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.544140 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.544257 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.544394 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.544613 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.544683 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.544804 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.544859 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.645947 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.646170 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.646306 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.646383 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.646452 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.646530 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.646591 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6zxg\" (UniqueName: \"kubernetes.io/projected/b2db5491-57b4-427a-b306-5e525a1e7c27-kube-api-access-z6zxg\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.646673 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.646748 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/b2db5491-57b4-427a-b306-5e525a1e7c27-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.646813 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.646881 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.646960 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.647077 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.647785 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/b2db5491-57b4-427a-b306-5e525a1e7c27-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.648229 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.650506 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.651499 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.652484 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.653012 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.653091 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.653349 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.654102 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.654691 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.656849 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.657689 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.671703 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6zxg\" (UniqueName: \"kubernetes.io/projected/b2db5491-57b4-427a-b306-5e525a1e7c27-kube-api-access-z6zxg\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:22 crc kubenswrapper[4778]: I0318 09:50:22.835807 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:50:23 crc kubenswrapper[4778]: I0318 09:50:23.359823 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"]
Mar 18 09:50:23 crc kubenswrapper[4778]: I0318 09:50:23.414358 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" event={"ID":"b2db5491-57b4-427a-b306-5e525a1e7c27","Type":"ContainerStarted","Data":"ae914e239ff54eca2bb96c1bbf0bed7d47de287f780b43c281d7c1dcccb9c71c"}
Mar 18 09:50:24 crc kubenswrapper[4778]: I0318 09:50:24.422058 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" event={"ID":"b2db5491-57b4-427a-b306-5e525a1e7c27","Type":"ContainerStarted","Data":"3b076b40e8f07ea23b1427f17330f5415549c41b6c6f9192b8ef848601e5be2b"}
Mar 18 09:50:24 crc kubenswrapper[4778]: I0318 09:50:24.453863 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" podStartSLOduration=1.998170894 podStartE2EDuration="2.453841033s" podCreationTimestamp="2026-03-18 09:50:22 +0000 UTC" firstStartedPulling="2026-03-18 09:50:23.365328332 +0000 UTC m=+2889.940073182" lastFinishedPulling="2026-03-18 09:50:23.820998481 +0000 UTC m=+2890.395743321" observedRunningTime="2026-03-18 09:50:24.444526659 +0000 UTC m=+2891.019271509" watchObservedRunningTime="2026-03-18 09:50:24.453841033 +0000 UTC m=+2891.028585913"
Mar 18 09:50:32 crc kubenswrapper[4778]: I0318 09:50:32.758466 4778 scope.go:117] "RemoveContainer" containerID="8f06326ea2269b974cec86640f654b1a9c686fe2bd62106254f2a194a59e658e"
Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.681422 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fwdkk"]
Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.683861 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fwdkk"
Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.695634 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fwdkk"]
Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.744506 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26d47\" (UniqueName: \"kubernetes.io/projected/a91e5adc-fb5f-44af-9f4e-43c57ecece37-kube-api-access-26d47\") pod \"redhat-marketplace-fwdkk\" (UID: \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\") " pod="openshift-marketplace/redhat-marketplace-fwdkk"
Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.744698 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a91e5adc-fb5f-44af-9f4e-43c57ecece37-utilities\") pod \"redhat-marketplace-fwdkk\" (UID: \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\") " pod="openshift-marketplace/redhat-marketplace-fwdkk"
Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.744743 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a91e5adc-fb5f-44af-9f4e-43c57ecece37-catalog-content\") pod \"redhat-marketplace-fwdkk\" (UID: \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\") " pod="openshift-marketplace/redhat-marketplace-fwdkk"
Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.846787 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26d47\" (UniqueName: \"kubernetes.io/projected/a91e5adc-fb5f-44af-9f4e-43c57ecece37-kube-api-access-26d47\") pod \"redhat-marketplace-fwdkk\" (UID: \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\") " pod="openshift-marketplace/redhat-marketplace-fwdkk"
Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.846887 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a91e5adc-fb5f-44af-9f4e-43c57ecece37-utilities\") pod \"redhat-marketplace-fwdkk\" (UID: \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\") " pod="openshift-marketplace/redhat-marketplace-fwdkk"
Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.846920 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a91e5adc-fb5f-44af-9f4e-43c57ecece37-catalog-content\") pod \"redhat-marketplace-fwdkk\" (UID: \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\") " pod="openshift-marketplace/redhat-marketplace-fwdkk"
Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.847449 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a91e5adc-fb5f-44af-9f4e-43c57ecece37-utilities\") pod \"redhat-marketplace-fwdkk\" (UID: \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\") " pod="openshift-marketplace/redhat-marketplace-fwdkk"
Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.847482 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a91e5adc-fb5f-44af-9f4e-43c57ecece37-catalog-content\") pod \"redhat-marketplace-fwdkk\" (UID: \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\") " pod="openshift-marketplace/redhat-marketplace-fwdkk"
Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.866540 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26d47\" (UniqueName: \"kubernetes.io/projected/a91e5adc-fb5f-44af-9f4e-43c57ecece37-kube-api-access-26d47\") pod \"redhat-marketplace-fwdkk\" (UID: \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\") " pod="openshift-marketplace/redhat-marketplace-fwdkk"
Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.885852 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4xxv6"]
Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.887978 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4xxv6"
Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.901092 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4xxv6"]
Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.948982 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgc6j\" (UniqueName: \"kubernetes.io/projected/c4b995fc-abe8-41af-9287-6381d6a3f37e-kube-api-access-xgc6j\") pod \"redhat-operators-4xxv6\" (UID: \"c4b995fc-abe8-41af-9287-6381d6a3f37e\") " pod="openshift-marketplace/redhat-operators-4xxv6"
Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.950512 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4b995fc-abe8-41af-9287-6381d6a3f37e-utilities\") pod \"redhat-operators-4xxv6\" (UID: \"c4b995fc-abe8-41af-9287-6381d6a3f37e\") " pod="openshift-marketplace/redhat-operators-4xxv6"
Mar 18 09:51:45 crc kubenswrapper[4778]: I0318 09:51:45.950723 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4b995fc-abe8-41af-9287-6381d6a3f37e-catalog-content\") pod \"redhat-operators-4xxv6\" (UID: \"c4b995fc-abe8-41af-9287-6381d6a3f37e\") " pod="openshift-marketplace/redhat-operators-4xxv6"
Mar 18 09:51:46 crc kubenswrapper[4778]: I0318 09:51:46.011917 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fwdkk"
Mar 18 09:51:46 crc kubenswrapper[4778]: I0318 09:51:46.052475 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4b995fc-abe8-41af-9287-6381d6a3f37e-utilities\") pod \"redhat-operators-4xxv6\" (UID: \"c4b995fc-abe8-41af-9287-6381d6a3f37e\") " pod="openshift-marketplace/redhat-operators-4xxv6"
Mar 18 09:51:46 crc kubenswrapper[4778]: I0318 09:51:46.052585 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4b995fc-abe8-41af-9287-6381d6a3f37e-catalog-content\") pod \"redhat-operators-4xxv6\" (UID: \"c4b995fc-abe8-41af-9287-6381d6a3f37e\") " pod="openshift-marketplace/redhat-operators-4xxv6"
Mar 18 09:51:46 crc kubenswrapper[4778]: I0318 09:51:46.052715 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgc6j\" (UniqueName: \"kubernetes.io/projected/c4b995fc-abe8-41af-9287-6381d6a3f37e-kube-api-access-xgc6j\") pod \"redhat-operators-4xxv6\" (UID: \"c4b995fc-abe8-41af-9287-6381d6a3f37e\") " pod="openshift-marketplace/redhat-operators-4xxv6"
Mar 18 09:51:46 crc kubenswrapper[4778]: I0318 09:51:46.053317 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4b995fc-abe8-41af-9287-6381d6a3f37e-catalog-content\") pod \"redhat-operators-4xxv6\" (UID: \"c4b995fc-abe8-41af-9287-6381d6a3f37e\") " pod="openshift-marketplace/redhat-operators-4xxv6"
Mar 18 09:51:46 crc kubenswrapper[4778]: I0318 09:51:46.053323 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4b995fc-abe8-41af-9287-6381d6a3f37e-utilities\") pod \"redhat-operators-4xxv6\" (UID: \"c4b995fc-abe8-41af-9287-6381d6a3f37e\") " pod="openshift-marketplace/redhat-operators-4xxv6"
Mar 18 09:51:46 crc kubenswrapper[4778]: I0318 09:51:46.071023 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgc6j\" (UniqueName: \"kubernetes.io/projected/c4b995fc-abe8-41af-9287-6381d6a3f37e-kube-api-access-xgc6j\") pod \"redhat-operators-4xxv6\" (UID: \"c4b995fc-abe8-41af-9287-6381d6a3f37e\") " pod="openshift-marketplace/redhat-operators-4xxv6"
Mar 18 09:51:46 crc kubenswrapper[4778]: I0318 09:51:46.236427 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4xxv6"
Mar 18 09:51:46 crc kubenswrapper[4778]: I0318 09:51:46.542870 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fwdkk"]
Mar 18 09:51:46 crc kubenswrapper[4778]: I0318 09:51:46.740094 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4xxv6"]
Mar 18 09:51:47 crc kubenswrapper[4778]: I0318 09:51:47.224739 4778 generic.go:334] "Generic (PLEG): container finished" podID="c4b995fc-abe8-41af-9287-6381d6a3f37e" containerID="2e70ef6fdf1bdbf547f064293d4f418b4758bc99ffb9228a6547572547ea2454" exitCode=0
Mar 18 09:51:47 crc kubenswrapper[4778]: I0318 09:51:47.224864 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xxv6" event={"ID":"c4b995fc-abe8-41af-9287-6381d6a3f37e","Type":"ContainerDied","Data":"2e70ef6fdf1bdbf547f064293d4f418b4758bc99ffb9228a6547572547ea2454"}
Mar 18 09:51:47 crc kubenswrapper[4778]: I0318 09:51:47.224928 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xxv6" event={"ID":"c4b995fc-abe8-41af-9287-6381d6a3f37e","Type":"ContainerStarted","Data":"89f244671c9af7d58f97b9b32aa5636eb413ce69e26ae52cac5094e2618def32"}
Mar 18 09:51:47 crc kubenswrapper[4778]: I0318 09:51:47.228067 4778 generic.go:334] "Generic (PLEG): container finished" podID="a91e5adc-fb5f-44af-9f4e-43c57ecece37" containerID="63782f621a7d608c13177a519f47974df44ab0d96e754d15f5547795b7fe9a0d" exitCode=0
Mar 18 09:51:47 crc kubenswrapper[4778]: I0318 09:51:47.228100 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fwdkk" event={"ID":"a91e5adc-fb5f-44af-9f4e-43c57ecece37","Type":"ContainerDied","Data":"63782f621a7d608c13177a519f47974df44ab0d96e754d15f5547795b7fe9a0d"}
Mar 18 09:51:47 crc kubenswrapper[4778]: I0318 09:51:47.228122 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fwdkk" event={"ID":"a91e5adc-fb5f-44af-9f4e-43c57ecece37","Type":"ContainerStarted","Data":"c27672d46e7320acefef5001c13794ee1f57d5d043df0f1d75875dbe02c9990c"}
Mar 18 09:51:48 crc kubenswrapper[4778]: I0318 09:51:48.244759 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xxv6" event={"ID":"c4b995fc-abe8-41af-9287-6381d6a3f37e","Type":"ContainerStarted","Data":"5b9d7727560aa9fa64c3d4e01ab24c3b42d06150409b0431aac0bbfe50f5e1f4"}
Mar 18 09:51:48 crc kubenswrapper[4778]: I0318 09:51:48.246884 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fwdkk" event={"ID":"a91e5adc-fb5f-44af-9f4e-43c57ecece37","Type":"ContainerStarted","Data":"571cb1940b6dd6bb12bc11daa304e910b735c2ebc35f47fa3e4ff4294bf41913"}
Mar 18 09:51:49 crc kubenswrapper[4778]: I0318 09:51:49.266234 4778 generic.go:334] "Generic (PLEG): container finished" podID="a91e5adc-fb5f-44af-9f4e-43c57ecece37" containerID="571cb1940b6dd6bb12bc11daa304e910b735c2ebc35f47fa3e4ff4294bf41913" exitCode=0
Mar 18 09:51:49 crc kubenswrapper[4778]: I0318 09:51:49.266470 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fwdkk" event={"ID":"a91e5adc-fb5f-44af-9f4e-43c57ecece37","Type":"ContainerDied","Data":"571cb1940b6dd6bb12bc11daa304e910b735c2ebc35f47fa3e4ff4294bf41913"}
Mar 18 09:51:50 crc kubenswrapper[4778]: I0318 09:51:50.278307 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fwdkk" event={"ID":"a91e5adc-fb5f-44af-9f4e-43c57ecece37","Type":"ContainerStarted","Data":"4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e"}
Mar 18 09:51:50 crc kubenswrapper[4778]: I0318 09:51:50.306667 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fwdkk" podStartSLOduration=2.864637789 podStartE2EDuration="5.306644439s" podCreationTimestamp="2026-03-18 09:51:45 +0000 UTC" firstStartedPulling="2026-03-18 09:51:47.230305977 +0000 UTC m=+2973.805050817" lastFinishedPulling="2026-03-18 09:51:49.672312617 +0000 UTC m=+2976.247057467" observedRunningTime="2026-03-18 09:51:50.299214618 +0000 UTC m=+2976.873959478" watchObservedRunningTime="2026-03-18 09:51:50.306644439 +0000 UTC m=+2976.881389279"
Mar 18 09:51:51 crc kubenswrapper[4778]: I0318 09:51:51.289642 4778 generic.go:334] "Generic (PLEG): container finished" podID="c4b995fc-abe8-41af-9287-6381d6a3f37e" containerID="5b9d7727560aa9fa64c3d4e01ab24c3b42d06150409b0431aac0bbfe50f5e1f4" exitCode=0
Mar 18 09:51:51 crc kubenswrapper[4778]: I0318 09:51:51.289698 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xxv6" event={"ID":"c4b995fc-abe8-41af-9287-6381d6a3f37e","Type":"ContainerDied","Data":"5b9d7727560aa9fa64c3d4e01ab24c3b42d06150409b0431aac0bbfe50f5e1f4"}
Mar 18 09:51:52 crc kubenswrapper[4778]: I0318 09:51:52.300123 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xxv6" event={"ID":"c4b995fc-abe8-41af-9287-6381d6a3f37e","Type":"ContainerStarted","Data":"869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7"}
Mar 18 09:51:52 crc kubenswrapper[4778]: I0318 09:51:52.320509 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4xxv6" podStartSLOduration=2.680681912 podStartE2EDuration="7.320494517s" podCreationTimestamp="2026-03-18 09:51:45 +0000 UTC" firstStartedPulling="2026-03-18 09:51:47.226852524 +0000 UTC m=+2973.801597364" lastFinishedPulling="2026-03-18 09:51:51.866665109 +0000 UTC m=+2978.441409969" observedRunningTime="2026-03-18 09:51:52.317318961 +0000 UTC m=+2978.892063811" watchObservedRunningTime="2026-03-18 09:51:52.320494517 +0000 UTC m=+2978.895239357"
Mar 18 09:51:56 crc kubenswrapper[4778]: I0318 09:51:56.012057 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fwdkk"
Mar 18 09:51:56 crc kubenswrapper[4778]: I0318 09:51:56.012687 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fwdkk"
Mar 18 09:51:56 crc kubenswrapper[4778]: I0318 09:51:56.061533 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fwdkk"
Mar 18 09:51:56 crc kubenswrapper[4778]: I0318 09:51:56.237830 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4xxv6"
Mar 18 09:51:56 crc kubenswrapper[4778]: I0318 09:51:56.237882 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4xxv6"
Mar 18 09:51:56 crc kubenswrapper[4778]: I0318 09:51:56.387661 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fwdkk"
Mar 18 09:51:56 crc kubenswrapper[4778]: I0318 09:51:56.875143 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fwdkk"]
Mar 18 09:51:57 crc kubenswrapper[4778]: I0318 09:51:57.294952 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4xxv6" podUID="c4b995fc-abe8-41af-9287-6381d6a3f37e" containerName="registry-server" probeResult="failure" output=<
Mar 18 09:51:57 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s
Mar 18 09:51:57 crc kubenswrapper[4778]: >
Mar 18 09:51:58 crc kubenswrapper[4778]: I0318 09:51:58.345124 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fwdkk" podUID="a91e5adc-fb5f-44af-9f4e-43c57ecece37" containerName="registry-server" containerID="cri-o://4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e" gracePeriod=2
Mar 18 09:51:58 crc kubenswrapper[4778]: I0318 09:51:58.830042 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fwdkk"
Mar 18 09:51:58 crc kubenswrapper[4778]: I0318 09:51:58.910311 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a91e5adc-fb5f-44af-9f4e-43c57ecece37-utilities\") pod \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\" (UID: \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\") "
Mar 18 09:51:58 crc kubenswrapper[4778]: I0318 09:51:58.910429 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a91e5adc-fb5f-44af-9f4e-43c57ecece37-catalog-content\") pod \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\" (UID: \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\") "
Mar 18 09:51:58 crc kubenswrapper[4778]: I0318 09:51:58.910480 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26d47\" (UniqueName: \"kubernetes.io/projected/a91e5adc-fb5f-44af-9f4e-43c57ecece37-kube-api-access-26d47\") pod \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\" (UID: \"a91e5adc-fb5f-44af-9f4e-43c57ecece37\") "
Mar 18 09:51:58 crc kubenswrapper[4778]: I0318 09:51:58.912523 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a91e5adc-fb5f-44af-9f4e-43c57ecece37-utilities" (OuterVolumeSpecName: "utilities") pod "a91e5adc-fb5f-44af-9f4e-43c57ecece37" (UID: "a91e5adc-fb5f-44af-9f4e-43c57ecece37"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:51:58 crc kubenswrapper[4778]: I0318 09:51:58.916346 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a91e5adc-fb5f-44af-9f4e-43c57ecece37-kube-api-access-26d47" (OuterVolumeSpecName: "kube-api-access-26d47") pod "a91e5adc-fb5f-44af-9f4e-43c57ecece37" (UID: "a91e5adc-fb5f-44af-9f4e-43c57ecece37"). InnerVolumeSpecName "kube-api-access-26d47".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:51:58 crc kubenswrapper[4778]: I0318 09:51:58.942454 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a91e5adc-fb5f-44af-9f4e-43c57ecece37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a91e5adc-fb5f-44af-9f4e-43c57ecece37" (UID: "a91e5adc-fb5f-44af-9f4e-43c57ecece37"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.012023 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a91e5adc-fb5f-44af-9f4e-43c57ecece37-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.012085 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26d47\" (UniqueName: \"kubernetes.io/projected/a91e5adc-fb5f-44af-9f4e-43c57ecece37-kube-api-access-26d47\") on node \"crc\" DevicePath \"\"" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.012101 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a91e5adc-fb5f-44af-9f4e-43c57ecece37-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.355562 4778 generic.go:334] "Generic (PLEG): container finished" podID="a91e5adc-fb5f-44af-9f4e-43c57ecece37" containerID="4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e" exitCode=0 Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.355605 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fwdkk" event={"ID":"a91e5adc-fb5f-44af-9f4e-43c57ecece37","Type":"ContainerDied","Data":"4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e"} Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.355633 4778 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-fwdkk" event={"ID":"a91e5adc-fb5f-44af-9f4e-43c57ecece37","Type":"ContainerDied","Data":"c27672d46e7320acefef5001c13794ee1f57d5d043df0f1d75875dbe02c9990c"} Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.355649 4778 scope.go:117] "RemoveContainer" containerID="4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.355776 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fwdkk" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.415898 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fwdkk"] Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.425396 4778 scope.go:117] "RemoveContainer" containerID="571cb1940b6dd6bb12bc11daa304e910b735c2ebc35f47fa3e4ff4294bf41913" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.426311 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fwdkk"] Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.475940 4778 scope.go:117] "RemoveContainer" containerID="63782f621a7d608c13177a519f47974df44ab0d96e754d15f5547795b7fe9a0d" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.508826 4778 scope.go:117] "RemoveContainer" containerID="4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e" Mar 18 09:51:59 crc kubenswrapper[4778]: E0318 09:51:59.509338 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e\": container with ID starting with 4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e not found: ID does not exist" containerID="4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.509371 4778 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e"} err="failed to get container status \"4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e\": rpc error: code = NotFound desc = could not find container \"4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e\": container with ID starting with 4e3badb418091925445d6d0fbc783a188a39c515e66b0126b3b33f8a06d0077e not found: ID does not exist" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.509406 4778 scope.go:117] "RemoveContainer" containerID="571cb1940b6dd6bb12bc11daa304e910b735c2ebc35f47fa3e4ff4294bf41913" Mar 18 09:51:59 crc kubenswrapper[4778]: E0318 09:51:59.509811 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"571cb1940b6dd6bb12bc11daa304e910b735c2ebc35f47fa3e4ff4294bf41913\": container with ID starting with 571cb1940b6dd6bb12bc11daa304e910b735c2ebc35f47fa3e4ff4294bf41913 not found: ID does not exist" containerID="571cb1940b6dd6bb12bc11daa304e910b735c2ebc35f47fa3e4ff4294bf41913" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.509852 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"571cb1940b6dd6bb12bc11daa304e910b735c2ebc35f47fa3e4ff4294bf41913"} err="failed to get container status \"571cb1940b6dd6bb12bc11daa304e910b735c2ebc35f47fa3e4ff4294bf41913\": rpc error: code = NotFound desc = could not find container \"571cb1940b6dd6bb12bc11daa304e910b735c2ebc35f47fa3e4ff4294bf41913\": container with ID starting with 571cb1940b6dd6bb12bc11daa304e910b735c2ebc35f47fa3e4ff4294bf41913 not found: ID does not exist" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.509868 4778 scope.go:117] "RemoveContainer" containerID="63782f621a7d608c13177a519f47974df44ab0d96e754d15f5547795b7fe9a0d" Mar 18 09:51:59 crc kubenswrapper[4778]: E0318 
09:51:59.510123 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63782f621a7d608c13177a519f47974df44ab0d96e754d15f5547795b7fe9a0d\": container with ID starting with 63782f621a7d608c13177a519f47974df44ab0d96e754d15f5547795b7fe9a0d not found: ID does not exist" containerID="63782f621a7d608c13177a519f47974df44ab0d96e754d15f5547795b7fe9a0d" Mar 18 09:51:59 crc kubenswrapper[4778]: I0318 09:51:59.510145 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63782f621a7d608c13177a519f47974df44ab0d96e754d15f5547795b7fe9a0d"} err="failed to get container status \"63782f621a7d608c13177a519f47974df44ab0d96e754d15f5547795b7fe9a0d\": rpc error: code = NotFound desc = could not find container \"63782f621a7d608c13177a519f47974df44ab0d96e754d15f5547795b7fe9a0d\": container with ID starting with 63782f621a7d608c13177a519f47974df44ab0d96e754d15f5547795b7fe9a0d not found: ID does not exist" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.147113 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.147180 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.161300 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563792-4g4zq"] Mar 18 09:52:00 crc kubenswrapper[4778]: E0318 09:52:00.161686 4778 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a91e5adc-fb5f-44af-9f4e-43c57ecece37" containerName="registry-server" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.161705 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a91e5adc-fb5f-44af-9f4e-43c57ecece37" containerName="registry-server" Mar 18 09:52:00 crc kubenswrapper[4778]: E0318 09:52:00.161727 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a91e5adc-fb5f-44af-9f4e-43c57ecece37" containerName="extract-content" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.161734 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a91e5adc-fb5f-44af-9f4e-43c57ecece37" containerName="extract-content" Mar 18 09:52:00 crc kubenswrapper[4778]: E0318 09:52:00.161755 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a91e5adc-fb5f-44af-9f4e-43c57ecece37" containerName="extract-utilities" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.161765 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a91e5adc-fb5f-44af-9f4e-43c57ecece37" containerName="extract-utilities" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.161952 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="a91e5adc-fb5f-44af-9f4e-43c57ecece37" containerName="registry-server" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.162585 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563792-4g4zq" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.165478 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.165622 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.175779 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.178948 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563792-4g4zq"] Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.202579 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a91e5adc-fb5f-44af-9f4e-43c57ecece37" path="/var/lib/kubelet/pods/a91e5adc-fb5f-44af-9f4e-43c57ecece37/volumes" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.235568 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm2lk\" (UniqueName: \"kubernetes.io/projected/a7095f92-8336-4c69-9c71-c3b9aa45bb82-kube-api-access-vm2lk\") pod \"auto-csr-approver-29563792-4g4zq\" (UID: \"a7095f92-8336-4c69-9c71-c3b9aa45bb82\") " pod="openshift-infra/auto-csr-approver-29563792-4g4zq" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.336652 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm2lk\" (UniqueName: \"kubernetes.io/projected/a7095f92-8336-4c69-9c71-c3b9aa45bb82-kube-api-access-vm2lk\") pod \"auto-csr-approver-29563792-4g4zq\" (UID: \"a7095f92-8336-4c69-9c71-c3b9aa45bb82\") " pod="openshift-infra/auto-csr-approver-29563792-4g4zq" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.354494 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vm2lk\" (UniqueName: \"kubernetes.io/projected/a7095f92-8336-4c69-9c71-c3b9aa45bb82-kube-api-access-vm2lk\") pod \"auto-csr-approver-29563792-4g4zq\" (UID: \"a7095f92-8336-4c69-9c71-c3b9aa45bb82\") " pod="openshift-infra/auto-csr-approver-29563792-4g4zq" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.479677 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563792-4g4zq" Mar 18 09:52:00 crc kubenswrapper[4778]: I0318 09:52:00.919121 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563792-4g4zq"] Mar 18 09:52:01 crc kubenswrapper[4778]: I0318 09:52:01.374317 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563792-4g4zq" event={"ID":"a7095f92-8336-4c69-9c71-c3b9aa45bb82","Type":"ContainerStarted","Data":"23f7c02e5a9c6e20be9eb80c9194cf2d4fc7da4d970b1593d3b268df4135066a"} Mar 18 09:52:03 crc kubenswrapper[4778]: I0318 09:52:03.388354 4778 generic.go:334] "Generic (PLEG): container finished" podID="a7095f92-8336-4c69-9c71-c3b9aa45bb82" containerID="129d18099eafc9ec58cca914d6f8f45f3f345a43be2618db7b6619ab09177632" exitCode=0 Mar 18 09:52:03 crc kubenswrapper[4778]: I0318 09:52:03.388803 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563792-4g4zq" event={"ID":"a7095f92-8336-4c69-9c71-c3b9aa45bb82","Type":"ContainerDied","Data":"129d18099eafc9ec58cca914d6f8f45f3f345a43be2618db7b6619ab09177632"} Mar 18 09:52:04 crc kubenswrapper[4778]: I0318 09:52:04.760014 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563792-4g4zq" Mar 18 09:52:04 crc kubenswrapper[4778]: I0318 09:52:04.931413 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm2lk\" (UniqueName: \"kubernetes.io/projected/a7095f92-8336-4c69-9c71-c3b9aa45bb82-kube-api-access-vm2lk\") pod \"a7095f92-8336-4c69-9c71-c3b9aa45bb82\" (UID: \"a7095f92-8336-4c69-9c71-c3b9aa45bb82\") " Mar 18 09:52:04 crc kubenswrapper[4778]: I0318 09:52:04.937681 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7095f92-8336-4c69-9c71-c3b9aa45bb82-kube-api-access-vm2lk" (OuterVolumeSpecName: "kube-api-access-vm2lk") pod "a7095f92-8336-4c69-9c71-c3b9aa45bb82" (UID: "a7095f92-8336-4c69-9c71-c3b9aa45bb82"). InnerVolumeSpecName "kube-api-access-vm2lk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:52:05 crc kubenswrapper[4778]: I0318 09:52:05.034008 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm2lk\" (UniqueName: \"kubernetes.io/projected/a7095f92-8336-4c69-9c71-c3b9aa45bb82-kube-api-access-vm2lk\") on node \"crc\" DevicePath \"\"" Mar 18 09:52:05 crc kubenswrapper[4778]: I0318 09:52:05.422639 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563792-4g4zq" event={"ID":"a7095f92-8336-4c69-9c71-c3b9aa45bb82","Type":"ContainerDied","Data":"23f7c02e5a9c6e20be9eb80c9194cf2d4fc7da4d970b1593d3b268df4135066a"} Mar 18 09:52:05 crc kubenswrapper[4778]: I0318 09:52:05.423095 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23f7c02e5a9c6e20be9eb80c9194cf2d4fc7da4d970b1593d3b268df4135066a" Mar 18 09:52:05 crc kubenswrapper[4778]: I0318 09:52:05.423236 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563792-4g4zq" Mar 18 09:52:05 crc kubenswrapper[4778]: I0318 09:52:05.832800 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563786-wpjmv"] Mar 18 09:52:05 crc kubenswrapper[4778]: I0318 09:52:05.840765 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563786-wpjmv"] Mar 18 09:52:06 crc kubenswrapper[4778]: I0318 09:52:06.198356 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e81f72c3-90fb-4526-97e3-977f3dbd00b0" path="/var/lib/kubelet/pods/e81f72c3-90fb-4526-97e3-977f3dbd00b0/volumes" Mar 18 09:52:06 crc kubenswrapper[4778]: I0318 09:52:06.289832 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4xxv6" Mar 18 09:52:06 crc kubenswrapper[4778]: I0318 09:52:06.336551 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4xxv6" Mar 18 09:52:06 crc kubenswrapper[4778]: I0318 09:52:06.524434 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4xxv6"] Mar 18 09:52:07 crc kubenswrapper[4778]: I0318 09:52:07.436431 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4xxv6" podUID="c4b995fc-abe8-41af-9287-6381d6a3f37e" containerName="registry-server" containerID="cri-o://869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7" gracePeriod=2 Mar 18 09:52:07 crc kubenswrapper[4778]: I0318 09:52:07.850089 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4xxv6" Mar 18 09:52:07 crc kubenswrapper[4778]: I0318 09:52:07.885253 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgc6j\" (UniqueName: \"kubernetes.io/projected/c4b995fc-abe8-41af-9287-6381d6a3f37e-kube-api-access-xgc6j\") pod \"c4b995fc-abe8-41af-9287-6381d6a3f37e\" (UID: \"c4b995fc-abe8-41af-9287-6381d6a3f37e\") " Mar 18 09:52:07 crc kubenswrapper[4778]: I0318 09:52:07.885375 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4b995fc-abe8-41af-9287-6381d6a3f37e-catalog-content\") pod \"c4b995fc-abe8-41af-9287-6381d6a3f37e\" (UID: \"c4b995fc-abe8-41af-9287-6381d6a3f37e\") " Mar 18 09:52:07 crc kubenswrapper[4778]: I0318 09:52:07.911223 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4b995fc-abe8-41af-9287-6381d6a3f37e-kube-api-access-xgc6j" (OuterVolumeSpecName: "kube-api-access-xgc6j") pod "c4b995fc-abe8-41af-9287-6381d6a3f37e" (UID: "c4b995fc-abe8-41af-9287-6381d6a3f37e"). InnerVolumeSpecName "kube-api-access-xgc6j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:52:07 crc kubenswrapper[4778]: I0318 09:52:07.986952 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4b995fc-abe8-41af-9287-6381d6a3f37e-utilities\") pod \"c4b995fc-abe8-41af-9287-6381d6a3f37e\" (UID: \"c4b995fc-abe8-41af-9287-6381d6a3f37e\") " Mar 18 09:52:07 crc kubenswrapper[4778]: I0318 09:52:07.987693 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgc6j\" (UniqueName: \"kubernetes.io/projected/c4b995fc-abe8-41af-9287-6381d6a3f37e-kube-api-access-xgc6j\") on node \"crc\" DevicePath \"\"" Mar 18 09:52:07 crc kubenswrapper[4778]: I0318 09:52:07.987919 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4b995fc-abe8-41af-9287-6381d6a3f37e-utilities" (OuterVolumeSpecName: "utilities") pod "c4b995fc-abe8-41af-9287-6381d6a3f37e" (UID: "c4b995fc-abe8-41af-9287-6381d6a3f37e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.018069 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4b995fc-abe8-41af-9287-6381d6a3f37e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4b995fc-abe8-41af-9287-6381d6a3f37e" (UID: "c4b995fc-abe8-41af-9287-6381d6a3f37e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.231139 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4b995fc-abe8-41af-9287-6381d6a3f37e-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.231254 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4b995fc-abe8-41af-9287-6381d6a3f37e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.447499 4778 generic.go:334] "Generic (PLEG): container finished" podID="c4b995fc-abe8-41af-9287-6381d6a3f37e" containerID="869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7" exitCode=0 Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.447553 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xxv6" event={"ID":"c4b995fc-abe8-41af-9287-6381d6a3f37e","Type":"ContainerDied","Data":"869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7"} Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.447827 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xxv6" event={"ID":"c4b995fc-abe8-41af-9287-6381d6a3f37e","Type":"ContainerDied","Data":"89f244671c9af7d58f97b9b32aa5636eb413ce69e26ae52cac5094e2618def32"} Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.447844 4778 scope.go:117] "RemoveContainer" containerID="869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.447569 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4xxv6" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.467472 4778 scope.go:117] "RemoveContainer" containerID="5b9d7727560aa9fa64c3d4e01ab24c3b42d06150409b0431aac0bbfe50f5e1f4" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.470408 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4xxv6"] Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.478149 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4xxv6"] Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.488714 4778 scope.go:117] "RemoveContainer" containerID="2e70ef6fdf1bdbf547f064293d4f418b4758bc99ffb9228a6547572547ea2454" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.528186 4778 scope.go:117] "RemoveContainer" containerID="869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7" Mar 18 09:52:08 crc kubenswrapper[4778]: E0318 09:52:08.528724 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7\": container with ID starting with 869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7 not found: ID does not exist" containerID="869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.528759 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7"} err="failed to get container status \"869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7\": rpc error: code = NotFound desc = could not find container \"869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7\": container with ID starting with 869abeed529a5e1453d828ad8454ff7a053a3d1e42084abadd579ff37282dee7 not found: ID does 
not exist" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.528787 4778 scope.go:117] "RemoveContainer" containerID="5b9d7727560aa9fa64c3d4e01ab24c3b42d06150409b0431aac0bbfe50f5e1f4" Mar 18 09:52:08 crc kubenswrapper[4778]: E0318 09:52:08.529254 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b9d7727560aa9fa64c3d4e01ab24c3b42d06150409b0431aac0bbfe50f5e1f4\": container with ID starting with 5b9d7727560aa9fa64c3d4e01ab24c3b42d06150409b0431aac0bbfe50f5e1f4 not found: ID does not exist" containerID="5b9d7727560aa9fa64c3d4e01ab24c3b42d06150409b0431aac0bbfe50f5e1f4" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.529283 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b9d7727560aa9fa64c3d4e01ab24c3b42d06150409b0431aac0bbfe50f5e1f4"} err="failed to get container status \"5b9d7727560aa9fa64c3d4e01ab24c3b42d06150409b0431aac0bbfe50f5e1f4\": rpc error: code = NotFound desc = could not find container \"5b9d7727560aa9fa64c3d4e01ab24c3b42d06150409b0431aac0bbfe50f5e1f4\": container with ID starting with 5b9d7727560aa9fa64c3d4e01ab24c3b42d06150409b0431aac0bbfe50f5e1f4 not found: ID does not exist" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.529302 4778 scope.go:117] "RemoveContainer" containerID="2e70ef6fdf1bdbf547f064293d4f418b4758bc99ffb9228a6547572547ea2454" Mar 18 09:52:08 crc kubenswrapper[4778]: E0318 09:52:08.529586 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e70ef6fdf1bdbf547f064293d4f418b4758bc99ffb9228a6547572547ea2454\": container with ID starting with 2e70ef6fdf1bdbf547f064293d4f418b4758bc99ffb9228a6547572547ea2454 not found: ID does not exist" containerID="2e70ef6fdf1bdbf547f064293d4f418b4758bc99ffb9228a6547572547ea2454" Mar 18 09:52:08 crc kubenswrapper[4778]: I0318 09:52:08.529637 4778 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e70ef6fdf1bdbf547f064293d4f418b4758bc99ffb9228a6547572547ea2454"} err="failed to get container status \"2e70ef6fdf1bdbf547f064293d4f418b4758bc99ffb9228a6547572547ea2454\": rpc error: code = NotFound desc = could not find container \"2e70ef6fdf1bdbf547f064293d4f418b4758bc99ffb9228a6547572547ea2454\": container with ID starting with 2e70ef6fdf1bdbf547f064293d4f418b4758bc99ffb9228a6547572547ea2454 not found: ID does not exist"
Mar 18 09:52:10 crc kubenswrapper[4778]: I0318 09:52:10.207434 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4b995fc-abe8-41af-9287-6381d6a3f37e" path="/var/lib/kubelet/pods/c4b995fc-abe8-41af-9287-6381d6a3f37e/volumes"
Mar 18 09:52:30 crc kubenswrapper[4778]: I0318 09:52:30.148839 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 09:52:30 crc kubenswrapper[4778]: I0318 09:52:30.149535 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 09:52:32 crc kubenswrapper[4778]: I0318 09:52:32.877563 4778 scope.go:117] "RemoveContainer" containerID="f0336ddfd7a0dbb015d37a5f5151d0bd63e8c2d9a92eb6c0cfc48a0cb9420252"
Mar 18 09:52:57 crc kubenswrapper[4778]: I0318 09:52:57.924059 4778 generic.go:334] "Generic (PLEG): container finished" podID="b2db5491-57b4-427a-b306-5e525a1e7c27" containerID="3b076b40e8f07ea23b1427f17330f5415549c41b6c6f9192b8ef848601e5be2b" exitCode=0
Mar 18 09:52:57 crc kubenswrapper[4778]: I0318 09:52:57.924162 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" event={"ID":"b2db5491-57b4-427a-b306-5e525a1e7c27","Type":"ContainerDied","Data":"3b076b40e8f07ea23b1427f17330f5415549c41b6c6f9192b8ef848601e5be2b"}
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.394694 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552268 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-ceph\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") "
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552316 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-inventory\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") "
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552384 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6zxg\" (UniqueName: \"kubernetes.io/projected/b2db5491-57b4-427a-b306-5e525a1e7c27-kube-api-access-z6zxg\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") "
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552422 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-0\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") "
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552442 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-3\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") "
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552465 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-1\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") "
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552491 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-2\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") "
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552533 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-migration-ssh-key-1\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") "
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552554 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-migration-ssh-key-0\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") "
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552599 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-custom-ceph-combined-ca-bundle\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") "
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552646 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-extra-config-0\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") "
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552671 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-ssh-key-openstack-edpm-ipam\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") "
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.552730 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/b2db5491-57b4-427a-b306-5e525a1e7c27-ceph-nova-0\") pod \"b2db5491-57b4-427a-b306-5e525a1e7c27\" (UID: \"b2db5491-57b4-427a-b306-5e525a1e7c27\") "
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.558486 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.558550 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-ceph" (OuterVolumeSpecName: "ceph") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.559642 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2db5491-57b4-427a-b306-5e525a1e7c27-kube-api-access-z6zxg" (OuterVolumeSpecName: "kube-api-access-z6zxg") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "kube-api-access-z6zxg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.578237 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.585705 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2db5491-57b4-427a-b306-5e525a1e7c27-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "ceph-nova-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.586157 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.588569 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.590381 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.593392 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.594572 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.597440 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-inventory" (OuterVolumeSpecName: "inventory") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.598356 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.605357 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "b2db5491-57b4-427a-b306-5e525a1e7c27" (UID: "b2db5491-57b4-427a-b306-5e525a1e7c27"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656048 4778 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656115 4778 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656137 4778 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656158 4778 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656177 4778 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656223 4778 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656246 4778 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-extra-config-0\") on node \"crc\" DevicePath \"\""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656266 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656287 4778 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/b2db5491-57b4-427a-b306-5e525a1e7c27-ceph-nova-0\") on node \"crc\" DevicePath \"\""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656306 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-ceph\") on node \"crc\" DevicePath \"\""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656329 4778 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656347 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6zxg\" (UniqueName: \"kubernetes.io/projected/b2db5491-57b4-427a-b306-5e525a1e7c27-kube-api-access-z6zxg\") on node \"crc\" DevicePath \"\""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.656368 4778 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b2db5491-57b4-427a-b306-5e525a1e7c27-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.949333 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9" event={"ID":"b2db5491-57b4-427a-b306-5e525a1e7c27","Type":"ContainerDied","Data":"ae914e239ff54eca2bb96c1bbf0bed7d47de287f780b43c281d7c1dcccb9c71c"}
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.949660 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae914e239ff54eca2bb96c1bbf0bed7d47de287f780b43c281d7c1dcccb9c71c"
Mar 18 09:52:59 crc kubenswrapper[4778]: I0318 09:52:59.949436 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9"
Mar 18 09:53:00 crc kubenswrapper[4778]: I0318 09:53:00.147592 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 09:53:00 crc kubenswrapper[4778]: I0318 09:53:00.147916 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 09:53:00 crc kubenswrapper[4778]: I0318 09:53:00.148045 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7"
Mar 18 09:53:00 crc kubenswrapper[4778]: I0318 09:53:00.148904 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 09:53:00 crc kubenswrapper[4778]: I0318 09:53:00.149057 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" gracePeriod=600
Mar 18 09:53:00 crc kubenswrapper[4778]: E0318 09:53:00.268005 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 09:53:00 crc kubenswrapper[4778]: I0318 09:53:00.958532 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" exitCode=0
Mar 18 09:53:00 crc kubenswrapper[4778]: I0318 09:53:00.959418 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3"}
Mar 18 09:53:00 crc kubenswrapper[4778]: I0318 09:53:00.959527 4778 scope.go:117] "RemoveContainer" containerID="5df8914ca5e61db32f1727722fb361804c23982bfeee84574751dabf5cd01a2d"
Mar 18 09:53:00 crc kubenswrapper[4778]: I0318 09:53:00.960153 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3"
Mar 18 09:53:00 crc kubenswrapper[4778]: E0318 09:53:00.960451 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 09:53:12 crc kubenswrapper[4778]: I0318 09:53:12.187921 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3"
Mar 18 09:53:12 crc kubenswrapper[4778]: E0318 09:53:12.188932 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.337422 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"]
Mar 18 09:53:14 crc kubenswrapper[4778]: E0318 09:53:14.338250 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b995fc-abe8-41af-9287-6381d6a3f37e" containerName="extract-content"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.338270 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b995fc-abe8-41af-9287-6381d6a3f37e" containerName="extract-content"
Mar 18 09:53:14 crc kubenswrapper[4778]: E0318 09:53:14.338288 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b995fc-abe8-41af-9287-6381d6a3f37e" containerName="registry-server"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.338297 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b995fc-abe8-41af-9287-6381d6a3f37e" containerName="registry-server"
Mar 18 09:53:14 crc kubenswrapper[4778]: E0318 09:53:14.338305 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b995fc-abe8-41af-9287-6381d6a3f37e" containerName="extract-utilities"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.338313 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b995fc-abe8-41af-9287-6381d6a3f37e" containerName="extract-utilities"
Mar 18 09:53:14 crc kubenswrapper[4778]: E0318 09:53:14.338333 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2db5491-57b4-427a-b306-5e525a1e7c27" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.338342 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2db5491-57b4-427a-b306-5e525a1e7c27" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam"
Mar 18 09:53:14 crc kubenswrapper[4778]: E0318 09:53:14.338365 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7095f92-8336-4c69-9c71-c3b9aa45bb82" containerName="oc"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.338373 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7095f92-8336-4c69-9c71-c3b9aa45bb82" containerName="oc"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.338598 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4b995fc-abe8-41af-9287-6381d6a3f37e" containerName="registry-server"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.338626 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2db5491-57b4-427a-b306-5e525a1e7c27" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.338643 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7095f92-8336-4c69-9c71-c3b9aa45bb82" containerName="oc"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.339769 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.341799 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.342025 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.346855 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"]
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.348267 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.350185 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.364552 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"]
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.425608 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"]
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.471894 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.471937 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.471956 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/81d18509-d2fc-47e2-b814-94c4807a4dd6-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.471980 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.471996 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-dev\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472026 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-etc-nvme\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472068 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-ceph\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472083 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-dev\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472114 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81d18509-d2fc-47e2-b814-94c4807a4dd6-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472132 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472145 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-lib-modules\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472209 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472225 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472244 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472268 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472317 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81d18509-d2fc-47e2-b814-94c4807a4dd6-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472339 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472357 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472405 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-scripts\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472448 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-config-data\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472476 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-sys\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472510 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpgcf\" (UniqueName: \"kubernetes.io/projected/81d18509-d2fc-47e2-b814-94c4807a4dd6-kube-api-access-xpgcf\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472537 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472565 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvphs\" (UniqueName: \"kubernetes.io/projected/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-kube-api-access-bvphs\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472605 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d18509-d2fc-47e2-b814-94c4807a4dd6-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472627 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81d18509-d2fc-47e2-b814-94c4807a4dd6-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472653 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472681 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-run\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472717 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-run\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472750 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472773 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-config-data-custom\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.472796 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-sys\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.574550 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-sys\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0"
Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.574851
4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpgcf\" (UniqueName: \"kubernetes.io/projected/81d18509-d2fc-47e2-b814-94c4807a4dd6-kube-api-access-xpgcf\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.574684 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-sys\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.574961 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575051 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvphs\" (UniqueName: \"kubernetes.io/projected/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-kube-api-access-bvphs\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575144 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d18509-d2fc-47e2-b814-94c4807a4dd6-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575189 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/81d18509-d2fc-47e2-b814-94c4807a4dd6-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575252 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575290 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-run\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575318 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-run\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575341 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575364 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-config-data-custom\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " 
pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575389 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-run\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575401 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-sys\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575410 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-run\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575434 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-sys\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575454 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575500 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/81d18509-d2fc-47e2-b814-94c4807a4dd6-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575490 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575457 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575524 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575647 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575672 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-dev\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " 
pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575702 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-etc-nvme\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575710 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575762 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575786 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-ceph\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575809 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-dev\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575821 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-etc-nvme\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.575876 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-dev\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576014 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81d18509-d2fc-47e2-b814-94c4807a4dd6-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576057 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-lib-modules\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576084 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576093 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-dev\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 
09:53:14.576155 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576180 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-lib-modules\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576185 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576230 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576255 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576288 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-etc-nvme\") pod 
\"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576329 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576412 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81d18509-d2fc-47e2-b814-94c4807a4dd6-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576461 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576483 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.576529 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-scripts\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: 
I0318 09:53:14.576580 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-config-data\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.577011 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.577060 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.577143 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.577460 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/81d18509-d2fc-47e2-b814-94c4807a4dd6-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.577494 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.577523 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.582094 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-scripts\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.582550 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-ceph\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.583342 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.583398 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81d18509-d2fc-47e2-b814-94c4807a4dd6-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 
09:53:14.583909 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81d18509-d2fc-47e2-b814-94c4807a4dd6-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.589284 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-config-data-custom\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.589424 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81d18509-d2fc-47e2-b814-94c4807a4dd6-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.590043 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d18509-d2fc-47e2-b814-94c4807a4dd6-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.591769 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/81d18509-d2fc-47e2-b814-94c4807a4dd6-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.592352 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-config-data\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.595307 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpgcf\" (UniqueName: \"kubernetes.io/projected/81d18509-d2fc-47e2-b814-94c4807a4dd6-kube-api-access-xpgcf\") pod \"cinder-volume-volume1-0\" (UID: \"81d18509-d2fc-47e2-b814-94c4807a4dd6\") " pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.595367 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvphs\" (UniqueName: \"kubernetes.io/projected/a419ad60-27c7-4a74-a7a0-f6b04b3bcb13-kube-api-access-bvphs\") pod \"cinder-backup-0\" (UID: \"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13\") " pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.664547 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.724797 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.846010 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-j5mf6"] Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.848871 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-j5mf6" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.859770 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-j5mf6"] Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.946760 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-91af-account-create-update-cc4d5"] Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.948407 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-91af-account-create-update-cc4d5" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.952739 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.964666 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-91af-account-create-update-cc4d5"] Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.984390 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wmjx\" (UniqueName: \"kubernetes.io/projected/57dd6190-5149-44a9-8a75-7e3d9077a43c-kube-api-access-9wmjx\") pod \"manila-db-create-j5mf6\" (UID: \"57dd6190-5149-44a9-8a75-7e3d9077a43c\") " pod="openstack/manila-db-create-j5mf6" Mar 18 09:53:14 crc kubenswrapper[4778]: I0318 09:53:14.984489 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57dd6190-5149-44a9-8a75-7e3d9077a43c-operator-scripts\") pod \"manila-db-create-j5mf6\" (UID: \"57dd6190-5149-44a9-8a75-7e3d9077a43c\") " pod="openstack/manila-db-create-j5mf6" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.085816 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skprq\" (UniqueName: 
\"kubernetes.io/projected/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0-kube-api-access-skprq\") pod \"manila-91af-account-create-update-cc4d5\" (UID: \"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0\") " pod="openstack/manila-91af-account-create-update-cc4d5" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.085993 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wmjx\" (UniqueName: \"kubernetes.io/projected/57dd6190-5149-44a9-8a75-7e3d9077a43c-kube-api-access-9wmjx\") pod \"manila-db-create-j5mf6\" (UID: \"57dd6190-5149-44a9-8a75-7e3d9077a43c\") " pod="openstack/manila-db-create-j5mf6" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.086073 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57dd6190-5149-44a9-8a75-7e3d9077a43c-operator-scripts\") pod \"manila-db-create-j5mf6\" (UID: \"57dd6190-5149-44a9-8a75-7e3d9077a43c\") " pod="openstack/manila-db-create-j5mf6" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.086099 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0-operator-scripts\") pod \"manila-91af-account-create-update-cc4d5\" (UID: \"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0\") " pod="openstack/manila-91af-account-create-update-cc4d5" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.087760 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57dd6190-5149-44a9-8a75-7e3d9077a43c-operator-scripts\") pod \"manila-db-create-j5mf6\" (UID: \"57dd6190-5149-44a9-8a75-7e3d9077a43c\") " pod="openstack/manila-db-create-j5mf6" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.107307 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wmjx\" (UniqueName: 
\"kubernetes.io/projected/57dd6190-5149-44a9-8a75-7e3d9077a43c-kube-api-access-9wmjx\") pod \"manila-db-create-j5mf6\" (UID: \"57dd6190-5149-44a9-8a75-7e3d9077a43c\") " pod="openstack/manila-db-create-j5mf6" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.130438 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.133134 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.135990 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.135990 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.136348 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-ntc8r" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.136411 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.142276 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.182318 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-j5mf6" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.189884 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0-operator-scripts\") pod \"manila-91af-account-create-update-cc4d5\" (UID: \"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0\") " pod="openstack/manila-91af-account-create-update-cc4d5" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.190064 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skprq\" (UniqueName: \"kubernetes.io/projected/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0-kube-api-access-skprq\") pod \"manila-91af-account-create-update-cc4d5\" (UID: \"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0\") " pod="openstack/manila-91af-account-create-update-cc4d5" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.191017 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0-operator-scripts\") pod \"manila-91af-account-create-update-cc4d5\" (UID: \"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0\") " pod="openstack/manila-91af-account-create-update-cc4d5" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.196312 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.198247 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.200879 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.201150 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.204254 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.217388 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skprq\" (UniqueName: \"kubernetes.io/projected/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0-kube-api-access-skprq\") pod \"manila-91af-account-create-update-cc4d5\" (UID: \"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0\") " pod="openstack/manila-91af-account-create-update-cc4d5" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.271036 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-91af-account-create-update-cc4d5" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.293017 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-logs\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.293149 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-scripts\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.293182 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z5mf\" (UniqueName: \"kubernetes.io/projected/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-kube-api-access-6z5mf\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.293256 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-config-data\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.293291 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.293354 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.293462 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.293523 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.293603 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-ceph\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.351995 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.397591 4778 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-config-data\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.397676 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.397718 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.397764 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.397799 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.400351 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-gsj6f\" (UniqueName: \"kubernetes.io/projected/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-kube-api-access-gsj6f\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.400435 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.400710 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.400778 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.400832 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.400878 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.400977 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-ceph\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.401113 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-logs\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.401175 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-logs\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.401310 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-scripts\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.401349 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z5mf\" (UniqueName: \"kubernetes.io/projected/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-kube-api-access-6z5mf\") 
pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.401417 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.401451 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.404317 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.405311 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-logs\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.407408 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-combined-ca-bundle\") pod \"glance-default-external-api-0\" 
(UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.408089 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-ceph\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.408854 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.409064 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.409955 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-config-data\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.413272 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-scripts\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: 
I0318 09:53:15.422845 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z5mf\" (UniqueName: \"kubernetes.io/projected/8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd-kube-api-access-6z5mf\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: W0318 09:53:15.431010 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81d18509_d2fc_47e2_b814_94c4807a4dd6.slice/crio-0d11d6b09e6f4473921af3645ab9916dc5f7e65427d3182e0f95832cfffd36ea WatchSource:0}: Error finding container 0d11d6b09e6f4473921af3645ab9916dc5f7e65427d3182e0f95832cfffd36ea: Status 404 returned error can't find the container with id 0d11d6b09e6f4473921af3645ab9916dc5f7e65427d3182e0f95832cfffd36ea Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.432284 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.442688 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd\") " pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.457076 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.503798 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.503858 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.503927 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.503951 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsj6f\" (UniqueName: \"kubernetes.io/projected/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-kube-api-access-gsj6f\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.504000 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " 
pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.504080 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.504135 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.504240 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-logs\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.504284 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.505114 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.505727 4778 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.506566 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-logs\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.509884 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.511987 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.512072 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.513470 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.521410 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.524501 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsj6f\" (UniqueName: \"kubernetes.io/projected/a18a46b5-39a7-4da9-8994-5c4716bc0fc3-kube-api-access-gsj6f\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.561869 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"a18a46b5-39a7-4da9-8994-5c4716bc0fc3\") " pod="openstack/glance-default-internal-api-0" Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.645611 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-j5mf6"] Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.762808 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-91af-account-create-update-cc4d5"] Mar 18 09:53:15 crc kubenswrapper[4778]: I0318 09:53:15.821478 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 09:53:16 crc kubenswrapper[4778]: I0318 09:53:16.126910 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13","Type":"ContainerStarted","Data":"8385277fcdfe2cd15730c0a5c485b06db1e943ab2dc423f994281197c507a39e"} Mar 18 09:53:16 crc kubenswrapper[4778]: I0318 09:53:16.149721 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 09:53:16 crc kubenswrapper[4778]: I0318 09:53:16.151526 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-j5mf6" event={"ID":"57dd6190-5149-44a9-8a75-7e3d9077a43c","Type":"ContainerStarted","Data":"fc0fef996b4b9a5437de59c8b2ac8a5e7d95ba6ac33a74f54e7f79985c001e66"} Mar 18 09:53:16 crc kubenswrapper[4778]: I0318 09:53:16.151582 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-j5mf6" event={"ID":"57dd6190-5149-44a9-8a75-7e3d9077a43c","Type":"ContainerStarted","Data":"a42c12386bc9161895c66f5178800674e5acdcd3879ece3c4d3b64226793baed"} Mar 18 09:53:16 crc kubenswrapper[4778]: I0318 09:53:16.160516 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-91af-account-create-update-cc4d5" event={"ID":"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0","Type":"ContainerStarted","Data":"a03567ec27a1b447b64f31d017f135a530faf96727b9cdb25f25df3c2b11ab27"} Mar 18 09:53:16 crc kubenswrapper[4778]: I0318 09:53:16.160575 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-91af-account-create-update-cc4d5" event={"ID":"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0","Type":"ContainerStarted","Data":"4e079b381ae16533a0c4f019135452d3abe9616c161f7f866e0e5b17f9a6ed6d"} Mar 18 09:53:16 crc kubenswrapper[4778]: I0318 09:53:16.164695 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" 
event={"ID":"81d18509-d2fc-47e2-b814-94c4807a4dd6","Type":"ContainerStarted","Data":"0d11d6b09e6f4473921af3645ab9916dc5f7e65427d3182e0f95832cfffd36ea"} Mar 18 09:53:16 crc kubenswrapper[4778]: I0318 09:53:16.184926 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-create-j5mf6" podStartSLOduration=2.184908718 podStartE2EDuration="2.184908718s" podCreationTimestamp="2026-03-18 09:53:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:53:16.179664965 +0000 UTC m=+3062.754409815" watchObservedRunningTime="2026-03-18 09:53:16.184908718 +0000 UTC m=+3062.759653558" Mar 18 09:53:16 crc kubenswrapper[4778]: I0318 09:53:16.202172 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-91af-account-create-update-cc4d5" podStartSLOduration=2.202151686 podStartE2EDuration="2.202151686s" podCreationTimestamp="2026-03-18 09:53:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:53:16.193735647 +0000 UTC m=+3062.768480487" watchObservedRunningTime="2026-03-18 09:53:16.202151686 +0000 UTC m=+3062.776896526" Mar 18 09:53:16 crc kubenswrapper[4778]: I0318 09:53:16.412131 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 09:53:16 crc kubenswrapper[4778]: W0318 09:53:16.420561 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda18a46b5_39a7_4da9_8994_5c4716bc0fc3.slice/crio-c470e3848c8d0d6eb31f8ce6345f18395471221c351cc8d486bcd5ef90ac1c27 WatchSource:0}: Error finding container c470e3848c8d0d6eb31f8ce6345f18395471221c351cc8d486bcd5ef90ac1c27: Status 404 returned error can't find the container with id 
c470e3848c8d0d6eb31f8ce6345f18395471221c351cc8d486bcd5ef90ac1c27 Mar 18 09:53:17 crc kubenswrapper[4778]: I0318 09:53:17.189585 4778 generic.go:334] "Generic (PLEG): container finished" podID="c9f651dd-ff4a-46c9-bd8c-0155be07f0a0" containerID="a03567ec27a1b447b64f31d017f135a530faf96727b9cdb25f25df3c2b11ab27" exitCode=0 Mar 18 09:53:17 crc kubenswrapper[4778]: I0318 09:53:17.190168 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-91af-account-create-update-cc4d5" event={"ID":"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0","Type":"ContainerDied","Data":"a03567ec27a1b447b64f31d017f135a530faf96727b9cdb25f25df3c2b11ab27"} Mar 18 09:53:17 crc kubenswrapper[4778]: I0318 09:53:17.203070 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"81d18509-d2fc-47e2-b814-94c4807a4dd6","Type":"ContainerStarted","Data":"f67a1fbff496f1be1b4a5ab25667b7894e5bc479f79274dc8a68a3e3b0d0f449"} Mar 18 09:53:17 crc kubenswrapper[4778]: I0318 09:53:17.225389 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a18a46b5-39a7-4da9-8994-5c4716bc0fc3","Type":"ContainerStarted","Data":"1bdefb86686f4948b49677f54ecf66216845e261d7b9635b47c514601effcb59"} Mar 18 09:53:17 crc kubenswrapper[4778]: I0318 09:53:17.225447 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a18a46b5-39a7-4da9-8994-5c4716bc0fc3","Type":"ContainerStarted","Data":"c470e3848c8d0d6eb31f8ce6345f18395471221c351cc8d486bcd5ef90ac1c27"} Mar 18 09:53:17 crc kubenswrapper[4778]: I0318 09:53:17.230890 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd","Type":"ContainerStarted","Data":"29bf8f14f65ee985d6e1a81ccba68541643fe934836bc5811a00b9cf6e9d3255"} Mar 18 09:53:17 crc kubenswrapper[4778]: I0318 09:53:17.231004 4778 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd","Type":"ContainerStarted","Data":"83a869fb7c2b3e59aa1a1dfb377b28fef14e4860992724e6ab3c0e03c5ed2d14"} Mar 18 09:53:17 crc kubenswrapper[4778]: I0318 09:53:17.235854 4778 generic.go:334] "Generic (PLEG): container finished" podID="57dd6190-5149-44a9-8a75-7e3d9077a43c" containerID="fc0fef996b4b9a5437de59c8b2ac8a5e7d95ba6ac33a74f54e7f79985c001e66" exitCode=0 Mar 18 09:53:17 crc kubenswrapper[4778]: I0318 09:53:17.235907 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-j5mf6" event={"ID":"57dd6190-5149-44a9-8a75-7e3d9077a43c","Type":"ContainerDied","Data":"fc0fef996b4b9a5437de59c8b2ac8a5e7d95ba6ac33a74f54e7f79985c001e66"} Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.245109 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd","Type":"ContainerStarted","Data":"33986289d3f8e0f6641360b3e47ccb80fb9d1a8d78d05c15ad5cb1a1e9b60ccf"} Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.248152 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"81d18509-d2fc-47e2-b814-94c4807a4dd6","Type":"ContainerStarted","Data":"80de6fde3eb7f840dfb0144a8e27c20ffddf1efc593764891f2df40651f3dd80"} Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.250483 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13","Type":"ContainerStarted","Data":"810e9cda4787b4046333e5e6b466e35bd8a8305077a7123eae4208c665d2544b"} Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.250637 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" 
event={"ID":"a419ad60-27c7-4a74-a7a0-f6b04b3bcb13","Type":"ContainerStarted","Data":"1f3e135394d7ac164c95b4e8c0d3adbf4f449320b8725260d7d3e5a8a98d53b5"} Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.254644 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a18a46b5-39a7-4da9-8994-5c4716bc0fc3","Type":"ContainerStarted","Data":"69b7e9d6fec514c78d542744cb69ed2d17add8756390d1b40ae8a6652b4fe172"} Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.281315 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.281296368 podStartE2EDuration="4.281296368s" podCreationTimestamp="2026-03-18 09:53:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:53:18.271564694 +0000 UTC m=+3064.846309554" watchObservedRunningTime="2026-03-18 09:53:18.281296368 +0000 UTC m=+3064.856041208" Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.312140 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.832388597 podStartE2EDuration="4.312120085s" podCreationTimestamp="2026-03-18 09:53:14 +0000 UTC" firstStartedPulling="2026-03-18 09:53:15.434134132 +0000 UTC m=+3062.008878972" lastFinishedPulling="2026-03-18 09:53:16.91386562 +0000 UTC m=+3063.488610460" observedRunningTime="2026-03-18 09:53:18.308152488 +0000 UTC m=+3064.882897348" watchObservedRunningTime="2026-03-18 09:53:18.312120085 +0000 UTC m=+3064.886864925" Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.340300 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.3402791 podStartE2EDuration="4.3402791s" podCreationTimestamp="2026-03-18 09:53:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:53:18.32848418 +0000 UTC m=+3064.903229040" watchObservedRunningTime="2026-03-18 09:53:18.3402791 +0000 UTC m=+3064.915023940" Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.397463 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.821254175 podStartE2EDuration="4.397445654s" podCreationTimestamp="2026-03-18 09:53:14 +0000 UTC" firstStartedPulling="2026-03-18 09:53:15.337337262 +0000 UTC m=+3061.912082102" lastFinishedPulling="2026-03-18 09:53:16.913528741 +0000 UTC m=+3063.488273581" observedRunningTime="2026-03-18 09:53:18.351878415 +0000 UTC m=+3064.926623255" watchObservedRunningTime="2026-03-18 09:53:18.397445654 +0000 UTC m=+3064.972190494" Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.582990 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-j5mf6" Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.692650 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57dd6190-5149-44a9-8a75-7e3d9077a43c-operator-scripts\") pod \"57dd6190-5149-44a9-8a75-7e3d9077a43c\" (UID: \"57dd6190-5149-44a9-8a75-7e3d9077a43c\") " Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.693020 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wmjx\" (UniqueName: \"kubernetes.io/projected/57dd6190-5149-44a9-8a75-7e3d9077a43c-kube-api-access-9wmjx\") pod \"57dd6190-5149-44a9-8a75-7e3d9077a43c\" (UID: \"57dd6190-5149-44a9-8a75-7e3d9077a43c\") " Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.693754 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57dd6190-5149-44a9-8a75-7e3d9077a43c-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "57dd6190-5149-44a9-8a75-7e3d9077a43c" (UID: "57dd6190-5149-44a9-8a75-7e3d9077a43c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.697834 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57dd6190-5149-44a9-8a75-7e3d9077a43c-kube-api-access-9wmjx" (OuterVolumeSpecName: "kube-api-access-9wmjx") pod "57dd6190-5149-44a9-8a75-7e3d9077a43c" (UID: "57dd6190-5149-44a9-8a75-7e3d9077a43c"). InnerVolumeSpecName "kube-api-access-9wmjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.758356 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-91af-account-create-update-cc4d5" Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.796771 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57dd6190-5149-44a9-8a75-7e3d9077a43c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.796807 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wmjx\" (UniqueName: \"kubernetes.io/projected/57dd6190-5149-44a9-8a75-7e3d9077a43c-kube-api-access-9wmjx\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.897786 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skprq\" (UniqueName: \"kubernetes.io/projected/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0-kube-api-access-skprq\") pod \"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0\" (UID: \"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0\") " Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.897968 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0-operator-scripts\") pod \"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0\" (UID: \"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0\") " Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.898390 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c9f651dd-ff4a-46c9-bd8c-0155be07f0a0" (UID: "c9f651dd-ff4a-46c9-bd8c-0155be07f0a0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.898693 4778 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:18 crc kubenswrapper[4778]: I0318 09:53:18.902130 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0-kube-api-access-skprq" (OuterVolumeSpecName: "kube-api-access-skprq") pod "c9f651dd-ff4a-46c9-bd8c-0155be07f0a0" (UID: "c9f651dd-ff4a-46c9-bd8c-0155be07f0a0"). InnerVolumeSpecName "kube-api-access-skprq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:53:19 crc kubenswrapper[4778]: I0318 09:53:19.000734 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skprq\" (UniqueName: \"kubernetes.io/projected/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0-kube-api-access-skprq\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:19 crc kubenswrapper[4778]: I0318 09:53:19.272063 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-j5mf6" event={"ID":"57dd6190-5149-44a9-8a75-7e3d9077a43c","Type":"ContainerDied","Data":"a42c12386bc9161895c66f5178800674e5acdcd3879ece3c4d3b64226793baed"} Mar 18 09:53:19 crc kubenswrapper[4778]: I0318 09:53:19.272095 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-j5mf6" Mar 18 09:53:19 crc kubenswrapper[4778]: I0318 09:53:19.272102 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a42c12386bc9161895c66f5178800674e5acdcd3879ece3c4d3b64226793baed" Mar 18 09:53:19 crc kubenswrapper[4778]: I0318 09:53:19.273539 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-91af-account-create-update-cc4d5" event={"ID":"c9f651dd-ff4a-46c9-bd8c-0155be07f0a0","Type":"ContainerDied","Data":"4e079b381ae16533a0c4f019135452d3abe9616c161f7f866e0e5b17f9a6ed6d"} Mar 18 09:53:19 crc kubenswrapper[4778]: I0318 09:53:19.273568 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-91af-account-create-update-cc4d5" Mar 18 09:53:19 crc kubenswrapper[4778]: I0318 09:53:19.273591 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e079b381ae16533a0c4f019135452d3abe9616c161f7f866e0e5b17f9a6ed6d" Mar 18 09:53:19 crc kubenswrapper[4778]: I0318 09:53:19.665012 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Mar 18 09:53:19 crc kubenswrapper[4778]: I0318 09:53:19.725466 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.221098 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-fnlvs"] Mar 18 09:53:20 crc kubenswrapper[4778]: E0318 09:53:20.221750 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57dd6190-5149-44a9-8a75-7e3d9077a43c" containerName="mariadb-database-create" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.221773 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="57dd6190-5149-44a9-8a75-7e3d9077a43c" containerName="mariadb-database-create" Mar 18 09:53:20 crc kubenswrapper[4778]: E0318 09:53:20.221823 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9f651dd-ff4a-46c9-bd8c-0155be07f0a0" containerName="mariadb-account-create-update" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.221833 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9f651dd-ff4a-46c9-bd8c-0155be07f0a0" containerName="mariadb-account-create-update" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.222066 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9f651dd-ff4a-46c9-bd8c-0155be07f0a0" containerName="mariadb-account-create-update" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.222091 4778 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="57dd6190-5149-44a9-8a75-7e3d9077a43c" containerName="mariadb-database-create" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.222832 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.225909 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-fnlvs"] Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.226644 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-t68kd" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.226732 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.329932 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjgrm\" (UniqueName: \"kubernetes.io/projected/86f9ef2c-6a05-438a-a701-92c9ef84d46d-kube-api-access-qjgrm\") pod \"manila-db-sync-fnlvs\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.330036 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-job-config-data\") pod \"manila-db-sync-fnlvs\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.330070 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-combined-ca-bundle\") pod \"manila-db-sync-fnlvs\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 
09:53:20.330102 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-config-data\") pod \"manila-db-sync-fnlvs\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.431952 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-job-config-data\") pod \"manila-db-sync-fnlvs\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.432025 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-combined-ca-bundle\") pod \"manila-db-sync-fnlvs\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.432075 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-config-data\") pod \"manila-db-sync-fnlvs\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.432346 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjgrm\" (UniqueName: \"kubernetes.io/projected/86f9ef2c-6a05-438a-a701-92c9ef84d46d-kube-api-access-qjgrm\") pod \"manila-db-sync-fnlvs\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.440987 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-combined-ca-bundle\") pod \"manila-db-sync-fnlvs\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.441178 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-config-data\") pod \"manila-db-sync-fnlvs\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.441260 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-job-config-data\") pod \"manila-db-sync-fnlvs\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.452147 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjgrm\" (UniqueName: \"kubernetes.io/projected/86f9ef2c-6a05-438a-a701-92c9ef84d46d-kube-api-access-qjgrm\") pod \"manila-db-sync-fnlvs\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:20 crc kubenswrapper[4778]: I0318 09:53:20.553742 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:21 crc kubenswrapper[4778]: I0318 09:53:21.104929 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-fnlvs"] Mar 18 09:53:21 crc kubenswrapper[4778]: W0318 09:53:21.114539 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86f9ef2c_6a05_438a_a701_92c9ef84d46d.slice/crio-384420f85ddf082ceab20b4626cf92ea5c7d09153e3ebeaaa51ba2c61916038c WatchSource:0}: Error finding container 384420f85ddf082ceab20b4626cf92ea5c7d09153e3ebeaaa51ba2c61916038c: Status 404 returned error can't find the container with id 384420f85ddf082ceab20b4626cf92ea5c7d09153e3ebeaaa51ba2c61916038c Mar 18 09:53:21 crc kubenswrapper[4778]: I0318 09:53:21.291682 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-fnlvs" event={"ID":"86f9ef2c-6a05-438a-a701-92c9ef84d46d","Type":"ContainerStarted","Data":"384420f85ddf082ceab20b4626cf92ea5c7d09153e3ebeaaa51ba2c61916038c"} Mar 18 09:53:24 crc kubenswrapper[4778]: I0318 09:53:24.869171 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Mar 18 09:53:24 crc kubenswrapper[4778]: I0318 09:53:24.949642 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Mar 18 09:53:25 crc kubenswrapper[4778]: I0318 09:53:25.187052 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:53:25 crc kubenswrapper[4778]: E0318 09:53:25.187372 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:53:25 crc kubenswrapper[4778]: I0318 09:53:25.458165 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 09:53:25 crc kubenswrapper[4778]: I0318 09:53:25.458401 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 09:53:25 crc kubenswrapper[4778]: I0318 09:53:25.496147 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 09:53:25 crc kubenswrapper[4778]: I0318 09:53:25.516598 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 09:53:25 crc kubenswrapper[4778]: I0318 09:53:25.822678 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 09:53:25 crc kubenswrapper[4778]: I0318 09:53:25.822736 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 09:53:25 crc kubenswrapper[4778]: I0318 09:53:25.861679 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 09:53:25 crc kubenswrapper[4778]: I0318 09:53:25.865144 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 09:53:26 crc kubenswrapper[4778]: I0318 09:53:26.337336 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 09:53:26 crc kubenswrapper[4778]: I0318 09:53:26.337665 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 09:53:26 crc kubenswrapper[4778]: I0318 
09:53:26.337680 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 09:53:26 crc kubenswrapper[4778]: I0318 09:53:26.337691 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 09:53:27 crc kubenswrapper[4778]: I0318 09:53:27.346810 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-fnlvs" event={"ID":"86f9ef2c-6a05-438a-a701-92c9ef84d46d","Type":"ContainerStarted","Data":"c09837191b6ba16c2c4c1ba6934469f4e373f1877eaa0a419ae86d94a526194d"} Mar 18 09:53:27 crc kubenswrapper[4778]: I0318 09:53:27.371542 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-fnlvs" podStartSLOduration=2.499737757 podStartE2EDuration="7.371512563s" podCreationTimestamp="2026-03-18 09:53:20 +0000 UTC" firstStartedPulling="2026-03-18 09:53:21.119072619 +0000 UTC m=+3067.693817459" lastFinishedPulling="2026-03-18 09:53:25.990847425 +0000 UTC m=+3072.565592265" observedRunningTime="2026-03-18 09:53:27.363171606 +0000 UTC m=+3073.937916466" watchObservedRunningTime="2026-03-18 09:53:27.371512563 +0000 UTC m=+3073.946257413" Mar 18 09:53:28 crc kubenswrapper[4778]: I0318 09:53:28.394458 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 09:53:28 crc kubenswrapper[4778]: I0318 09:53:28.394833 4778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 09:53:28 crc kubenswrapper[4778]: I0318 09:53:28.475655 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 09:53:28 crc kubenswrapper[4778]: I0318 09:53:28.475818 4778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 09:53:28 crc kubenswrapper[4778]: I0318 09:53:28.507765 4778 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 09:53:28 crc kubenswrapper[4778]: I0318 09:53:28.527738 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 09:53:35 crc kubenswrapper[4778]: E0318 09:53:35.817402 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86f9ef2c_6a05_438a_a701_92c9ef84d46d.slice/crio-c09837191b6ba16c2c4c1ba6934469f4e373f1877eaa0a419ae86d94a526194d.scope\": RecentStats: unable to find data in memory cache]" Mar 18 09:53:36 crc kubenswrapper[4778]: I0318 09:53:36.440854 4778 generic.go:334] "Generic (PLEG): container finished" podID="86f9ef2c-6a05-438a-a701-92c9ef84d46d" containerID="c09837191b6ba16c2c4c1ba6934469f4e373f1877eaa0a419ae86d94a526194d" exitCode=0 Mar 18 09:53:36 crc kubenswrapper[4778]: I0318 09:53:36.440901 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-fnlvs" event={"ID":"86f9ef2c-6a05-438a-a701-92c9ef84d46d","Type":"ContainerDied","Data":"c09837191b6ba16c2c4c1ba6934469f4e373f1877eaa0a419ae86d94a526194d"} Mar 18 09:53:37 crc kubenswrapper[4778]: I0318 09:53:37.902114 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:37 crc kubenswrapper[4778]: I0318 09:53:37.969743 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjgrm\" (UniqueName: \"kubernetes.io/projected/86f9ef2c-6a05-438a-a701-92c9ef84d46d-kube-api-access-qjgrm\") pod \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " Mar 18 09:53:37 crc kubenswrapper[4778]: I0318 09:53:37.969900 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-combined-ca-bundle\") pod \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " Mar 18 09:53:37 crc kubenswrapper[4778]: I0318 09:53:37.969972 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-job-config-data\") pod \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " Mar 18 09:53:37 crc kubenswrapper[4778]: I0318 09:53:37.970032 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-config-data\") pod \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\" (UID: \"86f9ef2c-6a05-438a-a701-92c9ef84d46d\") " Mar 18 09:53:37 crc kubenswrapper[4778]: I0318 09:53:37.978408 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86f9ef2c-6a05-438a-a701-92c9ef84d46d-kube-api-access-qjgrm" (OuterVolumeSpecName: "kube-api-access-qjgrm") pod "86f9ef2c-6a05-438a-a701-92c9ef84d46d" (UID: "86f9ef2c-6a05-438a-a701-92c9ef84d46d"). InnerVolumeSpecName "kube-api-access-qjgrm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:53:37 crc kubenswrapper[4778]: I0318 09:53:37.978518 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "86f9ef2c-6a05-438a-a701-92c9ef84d46d" (UID: "86f9ef2c-6a05-438a-a701-92c9ef84d46d"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:37 crc kubenswrapper[4778]: I0318 09:53:37.985577 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-config-data" (OuterVolumeSpecName: "config-data") pod "86f9ef2c-6a05-438a-a701-92c9ef84d46d" (UID: "86f9ef2c-6a05-438a-a701-92c9ef84d46d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.010121 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86f9ef2c-6a05-438a-a701-92c9ef84d46d" (UID: "86f9ef2c-6a05-438a-a701-92c9ef84d46d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.072917 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjgrm\" (UniqueName: \"kubernetes.io/projected/86f9ef2c-6a05-438a-a701-92c9ef84d46d-kube-api-access-qjgrm\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.072961 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.072973 4778 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-job-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.072986 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86f9ef2c-6a05-438a-a701-92c9ef84d46d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.461601 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-fnlvs" event={"ID":"86f9ef2c-6a05-438a-a701-92c9ef84d46d","Type":"ContainerDied","Data":"384420f85ddf082ceab20b4626cf92ea5c7d09153e3ebeaaa51ba2c61916038c"} Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.461648 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="384420f85ddf082ceab20b4626cf92ea5c7d09153e3ebeaaa51ba2c61916038c" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.461657 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-fnlvs" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.749153 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Mar 18 09:53:38 crc kubenswrapper[4778]: E0318 09:53:38.749866 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86f9ef2c-6a05-438a-a701-92c9ef84d46d" containerName="manila-db-sync" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.749879 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="86f9ef2c-6a05-438a-a701-92c9ef84d46d" containerName="manila-db-sync" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.750050 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="86f9ef2c-6a05-438a-a701-92c9ef84d46d" containerName="manila-db-sync" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.751001 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.752291 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-t68kd" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.752780 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.753457 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.755312 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.759945 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.761771 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.763364 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.770237 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.778382 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.907117 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69655fd4bf-pr7j8"] Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.908645 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.909787 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-config-data\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.909856 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm9bt\" (UniqueName: \"kubernetes.io/projected/e96ea8ef-858b-421c-9b80-c8e76e2bc368-kube-api-access-qm9bt\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.909877 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-scripts\") pod \"manila-share-share1-0\" (UID: 
\"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.909899 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b330794-936e-4249-a59f-5c68279f210d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.909914 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e96ea8ef-858b-421c-9b80-c8e76e2bc368-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.909941 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.909965 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.909986 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-config-data-custom\") pod \"manila-scheduler-0\" (UID: 
\"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.910018 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.910054 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/3b330794-936e-4249-a59f-5c68279f210d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.910073 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3b330794-936e-4249-a59f-5c68279f210d-ceph\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.910102 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frmhc\" (UniqueName: \"kubernetes.io/projected/3b330794-936e-4249-a59f-5c68279f210d-kube-api-access-frmhc\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.910118 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-scripts\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " 
pod="openstack/manila-scheduler-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.910136 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-config-data\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:38 crc kubenswrapper[4778]: I0318 09:53:38.924164 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69655fd4bf-pr7j8"] Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.011606 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-scripts\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.011657 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b330794-936e-4249-a59f-5c68279f210d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.011676 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e96ea8ef-858b-421c-9b80-c8e76e2bc368-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.011708 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: 
\"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.011735 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.011757 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.011782 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85qbb\" (UniqueName: \"kubernetes.io/projected/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-kube-api-access-85qbb\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.011802 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b330794-936e-4249-a59f-5c68279f210d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.011817 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" 
Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.011937 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-openstack-edpm-ipam\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.012028 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/3b330794-936e-4249-a59f-5c68279f210d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.012029 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e96ea8ef-858b-421c-9b80-c8e76e2bc368-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.012068 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-config\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.012152 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3b330794-936e-4249-a59f-5c68279f210d-ceph\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.012215 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/3b330794-936e-4249-a59f-5c68279f210d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.012215 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-ovsdbserver-nb\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.012780 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frmhc\" (UniqueName: \"kubernetes.io/projected/3b330794-936e-4249-a59f-5c68279f210d-kube-api-access-frmhc\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.012811 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-scripts\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.012832 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-ovsdbserver-sb\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.012853 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-config-data\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.012903 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-config-data\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.012943 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-dns-svc\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.013003 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm9bt\" (UniqueName: \"kubernetes.io/projected/e96ea8ef-858b-421c-9b80-c8e76e2bc368-kube-api-access-qm9bt\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.018442 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-scripts\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.018886 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: 
\"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.019495 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-config-data\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.021361 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.021748 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.022501 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3b330794-936e-4249-a59f-5c68279f210d-ceph\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.029063 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-config-data\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.032290 4778 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qm9bt\" (UniqueName: \"kubernetes.io/projected/e96ea8ef-858b-421c-9b80-c8e76e2bc368-kube-api-access-qm9bt\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.034423 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.035149 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-scripts\") pod \"manila-scheduler-0\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.050492 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frmhc\" (UniqueName: \"kubernetes.io/projected/3b330794-936e-4249-a59f-5c68279f210d-kube-api-access-frmhc\") pod \"manila-share-share1-0\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.070139 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.080245 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.114986 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-openstack-edpm-ipam\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.115062 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-config\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.115091 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-ovsdbserver-nb\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.115136 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-ovsdbserver-sb\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.115242 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-dns-svc\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 
09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.115369 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85qbb\" (UniqueName: \"kubernetes.io/projected/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-kube-api-access-85qbb\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.116157 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-openstack-edpm-ipam\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.116333 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-config\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.116353 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-dns-svc\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.116559 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-ovsdbserver-nb\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.116811 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-ovsdbserver-sb\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.135367 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85qbb\" (UniqueName: \"kubernetes.io/projected/78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3-kube-api-access-85qbb\") pod \"dnsmasq-dns-69655fd4bf-pr7j8\" (UID: \"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3\") " pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.164271 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.167509 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.183787 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.196674 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.228116 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.319311 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bc20709-2b88-4657-930d-2f893754bc1b-logs\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.319368 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k982x\" (UniqueName: \"kubernetes.io/projected/9bc20709-2b88-4657-930d-2f893754bc1b-kube-api-access-k982x\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.319392 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-config-data\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.319517 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-scripts\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.319602 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.319775 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-config-data-custom\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.319806 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9bc20709-2b88-4657-930d-2f893754bc1b-etc-machine-id\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.421330 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k982x\" (UniqueName: \"kubernetes.io/projected/9bc20709-2b88-4657-930d-2f893754bc1b-kube-api-access-k982x\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.421382 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-config-data\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.421405 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-scripts\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.421436 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.421508 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-config-data-custom\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.421529 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9bc20709-2b88-4657-930d-2f893754bc1b-etc-machine-id\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.421583 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bc20709-2b88-4657-930d-2f893754bc1b-logs\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.421913 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bc20709-2b88-4657-930d-2f893754bc1b-logs\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.424277 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9bc20709-2b88-4657-930d-2f893754bc1b-etc-machine-id\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.440983 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.442997 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-config-data\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.443117 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-config-data-custom\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.444869 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-scripts\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.446575 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k982x\" (UniqueName: \"kubernetes.io/projected/9bc20709-2b88-4657-930d-2f893754bc1b-kube-api-access-k982x\") pod \"manila-api-0\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.551776 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.658641 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.691779 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 18 09:53:39 crc kubenswrapper[4778]: I0318 09:53:39.842851 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69655fd4bf-pr7j8"] Mar 18 09:53:40 crc kubenswrapper[4778]: I0318 09:53:40.187433 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:53:40 crc kubenswrapper[4778]: E0318 09:53:40.188332 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:53:40 crc kubenswrapper[4778]: I0318 09:53:40.238000 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 18 09:53:40 crc kubenswrapper[4778]: I0318 09:53:40.483281 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e96ea8ef-858b-421c-9b80-c8e76e2bc368","Type":"ContainerStarted","Data":"0b6591d1b2d8334e5cf9a42b169e44b6dfae4bb6ceaf223f510fc2ac1f46c6e1"} Mar 18 09:53:40 crc kubenswrapper[4778]: I0318 09:53:40.485060 4778 generic.go:334] "Generic (PLEG): container finished" podID="78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3" containerID="9f299a31d1798531e91bd8bb58c97e154d95fd9d6d916aab9cc49dafb47ef130" exitCode=0 Mar 18 09:53:40 crc kubenswrapper[4778]: I0318 09:53:40.485121 4778 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" event={"ID":"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3","Type":"ContainerDied","Data":"9f299a31d1798531e91bd8bb58c97e154d95fd9d6d916aab9cc49dafb47ef130"} Mar 18 09:53:40 crc kubenswrapper[4778]: I0318 09:53:40.485147 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" event={"ID":"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3","Type":"ContainerStarted","Data":"ed13e4d0e96a9a590b68371f4ce0af3043bfdeaff584c0a9bdf33c91ee4ce087"} Mar 18 09:53:40 crc kubenswrapper[4778]: I0318 09:53:40.487004 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"9bc20709-2b88-4657-930d-2f893754bc1b","Type":"ContainerStarted","Data":"f9a46df295b9504ecd451a63940fff7234f2b6a94c7406f27a604fbc84c81698"} Mar 18 09:53:40 crc kubenswrapper[4778]: I0318 09:53:40.489574 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"3b330794-936e-4249-a59f-5c68279f210d","Type":"ContainerStarted","Data":"0707583ce3524176a76a27b3ba5678da894972f75c990dbcdccd8d5c152f9e65"} Mar 18 09:53:41 crc kubenswrapper[4778]: I0318 09:53:41.522379 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" event={"ID":"78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3","Type":"ContainerStarted","Data":"397c511d63225d67b276c060e57fbd9133ca0d18599ab3f217f12da84129d486"} Mar 18 09:53:41 crc kubenswrapper[4778]: I0318 09:53:41.524115 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:41 crc kubenswrapper[4778]: I0318 09:53:41.539903 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"9bc20709-2b88-4657-930d-2f893754bc1b","Type":"ContainerStarted","Data":"53b04d3a57dbbb5062db7ebeb0b4c01cafe0b4c61a318513e4752f51a74b73e5"} Mar 18 09:53:41 crc kubenswrapper[4778]: I0318 
09:53:41.539948 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"9bc20709-2b88-4657-930d-2f893754bc1b","Type":"ContainerStarted","Data":"113274e5db3ab3c7952100755601f5ca794e6a2311c6e4d33ee9cf0fe6058561"} Mar 18 09:53:41 crc kubenswrapper[4778]: I0318 09:53:41.540009 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Mar 18 09:53:41 crc kubenswrapper[4778]: I0318 09:53:41.544109 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e96ea8ef-858b-421c-9b80-c8e76e2bc368","Type":"ContainerStarted","Data":"f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f"} Mar 18 09:53:41 crc kubenswrapper[4778]: I0318 09:53:41.624336 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" podStartSLOduration=3.622341691 podStartE2EDuration="3.622341691s" podCreationTimestamp="2026-03-18 09:53:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:53:41.572430294 +0000 UTC m=+3088.147175134" watchObservedRunningTime="2026-03-18 09:53:41.622341691 +0000 UTC m=+3088.197086531" Mar 18 09:53:41 crc kubenswrapper[4778]: I0318 09:53:41.628668 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=2.628640441 podStartE2EDuration="2.628640441s" podCreationTimestamp="2026-03-18 09:53:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:53:41.614796976 +0000 UTC m=+3088.189541826" watchObservedRunningTime="2026-03-18 09:53:41.628640441 +0000 UTC m=+3088.203385281" Mar 18 09:53:42 crc kubenswrapper[4778]: I0318 09:53:42.074161 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Mar 18 
09:53:42 crc kubenswrapper[4778]: I0318 09:53:42.559392 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e96ea8ef-858b-421c-9b80-c8e76e2bc368","Type":"ContainerStarted","Data":"45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa"} Mar 18 09:53:42 crc kubenswrapper[4778]: I0318 09:53:42.575917 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.55866214 podStartE2EDuration="4.575898075s" podCreationTimestamp="2026-03-18 09:53:38 +0000 UTC" firstStartedPulling="2026-03-18 09:53:39.70339872 +0000 UTC m=+3086.278143560" lastFinishedPulling="2026-03-18 09:53:40.720634655 +0000 UTC m=+3087.295379495" observedRunningTime="2026-03-18 09:53:42.574482986 +0000 UTC m=+3089.149227826" watchObservedRunningTime="2026-03-18 09:53:42.575898075 +0000 UTC m=+3089.150642915" Mar 18 09:53:43 crc kubenswrapper[4778]: I0318 09:53:43.574705 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="9bc20709-2b88-4657-930d-2f893754bc1b" containerName="manila-api-log" containerID="cri-o://113274e5db3ab3c7952100755601f5ca794e6a2311c6e4d33ee9cf0fe6058561" gracePeriod=30 Mar 18 09:53:43 crc kubenswrapper[4778]: I0318 09:53:43.574755 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="9bc20709-2b88-4657-930d-2f893754bc1b" containerName="manila-api" containerID="cri-o://53b04d3a57dbbb5062db7ebeb0b4c01cafe0b4c61a318513e4752f51a74b73e5" gracePeriod=30 Mar 18 09:53:43 crc kubenswrapper[4778]: I0318 09:53:43.685274 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:53:43 crc kubenswrapper[4778]: I0318 09:53:43.685599 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6f19125-b59e-49f9-8819-0cad52114840" 
containerName="ceilometer-central-agent" containerID="cri-o://72617aca4aa8400e7af4a60ff63ae0a42982c759ce5821a76119f425b2752b6a" gracePeriod=30 Mar 18 09:53:43 crc kubenswrapper[4778]: I0318 09:53:43.685729 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="sg-core" containerID="cri-o://28ea8f018593b4e5ea693b13884510308b8fe6c9f56dc77f633042aa7b3c2142" gracePeriod=30 Mar 18 09:53:43 crc kubenswrapper[4778]: I0318 09:53:43.685790 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="proxy-httpd" containerID="cri-o://345ccf529d04d20baa9e9e06ea792464e3dc120ee1042052fe45190d2b838b33" gracePeriod=30 Mar 18 09:53:43 crc kubenswrapper[4778]: I0318 09:53:43.685826 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="ceilometer-notification-agent" containerID="cri-o://d0ab691768395396ab9cd2b17a7818f8101a3f17ea12d03f3697e6b6b5c5a0b8" gracePeriod=30 Mar 18 09:53:44 crc kubenswrapper[4778]: I0318 09:53:44.585501 4778 generic.go:334] "Generic (PLEG): container finished" podID="9bc20709-2b88-4657-930d-2f893754bc1b" containerID="53b04d3a57dbbb5062db7ebeb0b4c01cafe0b4c61a318513e4752f51a74b73e5" exitCode=0 Mar 18 09:53:44 crc kubenswrapper[4778]: I0318 09:53:44.586010 4778 generic.go:334] "Generic (PLEG): container finished" podID="9bc20709-2b88-4657-930d-2f893754bc1b" containerID="113274e5db3ab3c7952100755601f5ca794e6a2311c6e4d33ee9cf0fe6058561" exitCode=143 Mar 18 09:53:44 crc kubenswrapper[4778]: I0318 09:53:44.585573 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"9bc20709-2b88-4657-930d-2f893754bc1b","Type":"ContainerDied","Data":"53b04d3a57dbbb5062db7ebeb0b4c01cafe0b4c61a318513e4752f51a74b73e5"} Mar 
18 09:53:44 crc kubenswrapper[4778]: I0318 09:53:44.586095 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"9bc20709-2b88-4657-930d-2f893754bc1b","Type":"ContainerDied","Data":"113274e5db3ab3c7952100755601f5ca794e6a2311c6e4d33ee9cf0fe6058561"} Mar 18 09:53:44 crc kubenswrapper[4778]: I0318 09:53:44.592685 4778 generic.go:334] "Generic (PLEG): container finished" podID="d6f19125-b59e-49f9-8819-0cad52114840" containerID="345ccf529d04d20baa9e9e06ea792464e3dc120ee1042052fe45190d2b838b33" exitCode=0 Mar 18 09:53:44 crc kubenswrapper[4778]: I0318 09:53:44.592716 4778 generic.go:334] "Generic (PLEG): container finished" podID="d6f19125-b59e-49f9-8819-0cad52114840" containerID="28ea8f018593b4e5ea693b13884510308b8fe6c9f56dc77f633042aa7b3c2142" exitCode=2 Mar 18 09:53:44 crc kubenswrapper[4778]: I0318 09:53:44.592727 4778 generic.go:334] "Generic (PLEG): container finished" podID="d6f19125-b59e-49f9-8819-0cad52114840" containerID="d0ab691768395396ab9cd2b17a7818f8101a3f17ea12d03f3697e6b6b5c5a0b8" exitCode=0 Mar 18 09:53:44 crc kubenswrapper[4778]: I0318 09:53:44.592738 4778 generic.go:334] "Generic (PLEG): container finished" podID="d6f19125-b59e-49f9-8819-0cad52114840" containerID="72617aca4aa8400e7af4a60ff63ae0a42982c759ce5821a76119f425b2752b6a" exitCode=0 Mar 18 09:53:44 crc kubenswrapper[4778]: I0318 09:53:44.592761 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6f19125-b59e-49f9-8819-0cad52114840","Type":"ContainerDied","Data":"345ccf529d04d20baa9e9e06ea792464e3dc120ee1042052fe45190d2b838b33"} Mar 18 09:53:44 crc kubenswrapper[4778]: I0318 09:53:44.592802 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6f19125-b59e-49f9-8819-0cad52114840","Type":"ContainerDied","Data":"28ea8f018593b4e5ea693b13884510308b8fe6c9f56dc77f633042aa7b3c2142"} Mar 18 09:53:44 crc kubenswrapper[4778]: I0318 09:53:44.592815 4778 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6f19125-b59e-49f9-8819-0cad52114840","Type":"ContainerDied","Data":"d0ab691768395396ab9cd2b17a7818f8101a3f17ea12d03f3697e6b6b5c5a0b8"} Mar 18 09:53:44 crc kubenswrapper[4778]: I0318 09:53:44.592825 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6f19125-b59e-49f9-8819-0cad52114840","Type":"ContainerDied","Data":"72617aca4aa8400e7af4a60ff63ae0a42982c759ce5821a76119f425b2752b6a"} Mar 18 09:53:46 crc kubenswrapper[4778]: I0318 09:53:46.980462 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.083547 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.101972 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k982x\" (UniqueName: \"kubernetes.io/projected/9bc20709-2b88-4657-930d-2f893754bc1b-kube-api-access-k982x\") pod \"9bc20709-2b88-4657-930d-2f893754bc1b\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.102626 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-config-data\") pod \"9bc20709-2b88-4657-930d-2f893754bc1b\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.102674 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-config-data-custom\") pod \"9bc20709-2b88-4657-930d-2f893754bc1b\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.102721 4778 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bc20709-2b88-4657-930d-2f893754bc1b-logs\") pod \"9bc20709-2b88-4657-930d-2f893754bc1b\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.102919 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-scripts\") pod \"9bc20709-2b88-4657-930d-2f893754bc1b\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.104484 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bc20709-2b88-4657-930d-2f893754bc1b-logs" (OuterVolumeSpecName: "logs") pod "9bc20709-2b88-4657-930d-2f893754bc1b" (UID: "9bc20709-2b88-4657-930d-2f893754bc1b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.106018 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-combined-ca-bundle\") pod \"9bc20709-2b88-4657-930d-2f893754bc1b\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.106053 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9bc20709-2b88-4657-930d-2f893754bc1b-etc-machine-id\") pod \"9bc20709-2b88-4657-930d-2f893754bc1b\" (UID: \"9bc20709-2b88-4657-930d-2f893754bc1b\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.106828 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bc20709-2b88-4657-930d-2f893754bc1b-kube-api-access-k982x" (OuterVolumeSpecName: "kube-api-access-k982x") 
pod "9bc20709-2b88-4657-930d-2f893754bc1b" (UID: "9bc20709-2b88-4657-930d-2f893754bc1b"). InnerVolumeSpecName "kube-api-access-k982x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.107572 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9bc20709-2b88-4657-930d-2f893754bc1b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9bc20709-2b88-4657-930d-2f893754bc1b" (UID: "9bc20709-2b88-4657-930d-2f893754bc1b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.107992 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bc20709-2b88-4657-930d-2f893754bc1b-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.108766 4778 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9bc20709-2b88-4657-930d-2f893754bc1b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.108865 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k982x\" (UniqueName: \"kubernetes.io/projected/9bc20709-2b88-4657-930d-2f893754bc1b-kube-api-access-k982x\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.109262 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-scripts" (OuterVolumeSpecName: "scripts") pod "9bc20709-2b88-4657-930d-2f893754bc1b" (UID: "9bc20709-2b88-4657-930d-2f893754bc1b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.118969 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9bc20709-2b88-4657-930d-2f893754bc1b" (UID: "9bc20709-2b88-4657-930d-2f893754bc1b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.168709 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9bc20709-2b88-4657-930d-2f893754bc1b" (UID: "9bc20709-2b88-4657-930d-2f893754bc1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.178850 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-config-data" (OuterVolumeSpecName: "config-data") pod "9bc20709-2b88-4657-930d-2f893754bc1b" (UID: "9bc20709-2b88-4657-930d-2f893754bc1b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.209626 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6f19125-b59e-49f9-8819-0cad52114840-run-httpd\") pod \"d6f19125-b59e-49f9-8819-0cad52114840\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.209710 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-sg-core-conf-yaml\") pod \"d6f19125-b59e-49f9-8819-0cad52114840\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.209749 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7799h\" (UniqueName: \"kubernetes.io/projected/d6f19125-b59e-49f9-8819-0cad52114840-kube-api-access-7799h\") pod \"d6f19125-b59e-49f9-8819-0cad52114840\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.209779 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6f19125-b59e-49f9-8819-0cad52114840-log-httpd\") pod \"d6f19125-b59e-49f9-8819-0cad52114840\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.209819 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-scripts\") pod \"d6f19125-b59e-49f9-8819-0cad52114840\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.209874 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-ceilometer-tls-certs\") pod \"d6f19125-b59e-49f9-8819-0cad52114840\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.209899 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-combined-ca-bundle\") pod \"d6f19125-b59e-49f9-8819-0cad52114840\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.209952 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-config-data\") pod \"d6f19125-b59e-49f9-8819-0cad52114840\" (UID: \"d6f19125-b59e-49f9-8819-0cad52114840\") " Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.209984 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6f19125-b59e-49f9-8819-0cad52114840-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d6f19125-b59e-49f9-8819-0cad52114840" (UID: "d6f19125-b59e-49f9-8819-0cad52114840"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.210390 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.210409 4778 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.210419 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.210428 4778 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6f19125-b59e-49f9-8819-0cad52114840-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.210437 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc20709-2b88-4657-930d-2f893754bc1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.211959 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6f19125-b59e-49f9-8819-0cad52114840-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d6f19125-b59e-49f9-8819-0cad52114840" (UID: "d6f19125-b59e-49f9-8819-0cad52114840"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.216065 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-scripts" (OuterVolumeSpecName: "scripts") pod "d6f19125-b59e-49f9-8819-0cad52114840" (UID: "d6f19125-b59e-49f9-8819-0cad52114840"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.218075 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6f19125-b59e-49f9-8819-0cad52114840-kube-api-access-7799h" (OuterVolumeSpecName: "kube-api-access-7799h") pod "d6f19125-b59e-49f9-8819-0cad52114840" (UID: "d6f19125-b59e-49f9-8819-0cad52114840"). InnerVolumeSpecName "kube-api-access-7799h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.239908 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d6f19125-b59e-49f9-8819-0cad52114840" (UID: "d6f19125-b59e-49f9-8819-0cad52114840"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.276966 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d6f19125-b59e-49f9-8819-0cad52114840" (UID: "d6f19125-b59e-49f9-8819-0cad52114840"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.291359 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6f19125-b59e-49f9-8819-0cad52114840" (UID: "d6f19125-b59e-49f9-8819-0cad52114840"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.312030 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7799h\" (UniqueName: \"kubernetes.io/projected/d6f19125-b59e-49f9-8819-0cad52114840-kube-api-access-7799h\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.312070 4778 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6f19125-b59e-49f9-8819-0cad52114840-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.312079 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.312088 4778 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.312098 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.312108 4778 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.324334 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-config-data" (OuterVolumeSpecName: "config-data") pod "d6f19125-b59e-49f9-8819-0cad52114840" (UID: "d6f19125-b59e-49f9-8819-0cad52114840"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.415104 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6f19125-b59e-49f9-8819-0cad52114840-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.627125 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.627152 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6f19125-b59e-49f9-8819-0cad52114840","Type":"ContainerDied","Data":"3887f968114761a1654028b0a71448b233a4c1f413820ca59edf51160a07bebd"} Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.627666 4778 scope.go:117] "RemoveContainer" containerID="345ccf529d04d20baa9e9e06ea792464e3dc120ee1042052fe45190d2b838b33" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.630335 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"9bc20709-2b88-4657-930d-2f893754bc1b","Type":"ContainerDied","Data":"f9a46df295b9504ecd451a63940fff7234f2b6a94c7406f27a604fbc84c81698"} Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.630447 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.632706 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"3b330794-936e-4249-a59f-5c68279f210d","Type":"ContainerStarted","Data":"563583bf7624c87ff5f7909a2664699f2f91b9c14e7dca73ca0082935cc34469"} Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.655247 4778 scope.go:117] "RemoveContainer" containerID="28ea8f018593b4e5ea693b13884510308b8fe6c9f56dc77f633042aa7b3c2142" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.701280 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.732992 4778 scope.go:117] "RemoveContainer" containerID="d0ab691768395396ab9cd2b17a7818f8101a3f17ea12d03f3697e6b6b5c5a0b8" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.748265 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.779278 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.785103 4778 scope.go:117] "RemoveContainer" containerID="72617aca4aa8400e7af4a60ff63ae0a42982c759ce5821a76119f425b2752b6a" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.793396 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.802130 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Mar 18 09:53:47 crc kubenswrapper[4778]: E0318 09:53:47.802662 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="proxy-httpd" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.802689 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6f19125-b59e-49f9-8819-0cad52114840" 
containerName="proxy-httpd" Mar 18 09:53:47 crc kubenswrapper[4778]: E0318 09:53:47.802720 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="ceilometer-notification-agent" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.802731 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="ceilometer-notification-agent" Mar 18 09:53:47 crc kubenswrapper[4778]: E0318 09:53:47.802747 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="sg-core" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.802755 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="sg-core" Mar 18 09:53:47 crc kubenswrapper[4778]: E0318 09:53:47.802768 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bc20709-2b88-4657-930d-2f893754bc1b" containerName="manila-api" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.802776 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bc20709-2b88-4657-930d-2f893754bc1b" containerName="manila-api" Mar 18 09:53:47 crc kubenswrapper[4778]: E0318 09:53:47.802807 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bc20709-2b88-4657-930d-2f893754bc1b" containerName="manila-api-log" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.802815 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bc20709-2b88-4657-930d-2f893754bc1b" containerName="manila-api-log" Mar 18 09:53:47 crc kubenswrapper[4778]: E0318 09:53:47.802830 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="ceilometer-central-agent" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.802838 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6f19125-b59e-49f9-8819-0cad52114840" 
containerName="ceilometer-central-agent" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.803074 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="ceilometer-notification-agent" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.803094 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="ceilometer-central-agent" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.803111 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bc20709-2b88-4657-930d-2f893754bc1b" containerName="manila-api" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.803125 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bc20709-2b88-4657-930d-2f893754bc1b" containerName="manila-api-log" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.803140 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="sg-core" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.803150 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="proxy-httpd" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.804357 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.807845 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.808396 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.809597 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.809676 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.814927 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.816344 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.818806 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.819081 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.819288 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.822571 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.841175 4778 scope.go:117] "RemoveContainer" containerID="53b04d3a57dbbb5062db7ebeb0b4c01cafe0b4c61a318513e4752f51a74b73e5" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.871083 4778 scope.go:117] "RemoveContainer" 
containerID="113274e5db3ab3c7952100755601f5ca794e6a2311c6e4d33ee9cf0fe6058561" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.931090 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-config-data\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.931173 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-config-data\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.931297 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/35adb68e-2fb0-437c-bea7-e46f05e4918c-etc-machine-id\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.931420 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65984d06-3c40-407a-b217-0df5cfedcd66-log-httpd\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.931560 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.931628 
4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-config-data-custom\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.931800 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-public-tls-certs\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.931849 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.932000 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26w4b\" (UniqueName: \"kubernetes.io/projected/35adb68e-2fb0-437c-bea7-e46f05e4918c-kube-api-access-26w4b\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.932101 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65984d06-3c40-407a-b217-0df5cfedcd66-run-httpd\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.932132 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/35adb68e-2fb0-437c-bea7-e46f05e4918c-logs\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.932232 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-scripts\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.932372 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9scf\" (UniqueName: \"kubernetes.io/projected/65984d06-3c40-407a-b217-0df5cfedcd66-kube-api-access-h9scf\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.932446 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-internal-tls-certs\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.932554 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.932663 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-ceilometer-tls-certs\") pod 
\"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:47 crc kubenswrapper[4778]: I0318 09:53:47.932708 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-scripts\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034552 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26w4b\" (UniqueName: \"kubernetes.io/projected/35adb68e-2fb0-437c-bea7-e46f05e4918c-kube-api-access-26w4b\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034610 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65984d06-3c40-407a-b217-0df5cfedcd66-run-httpd\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034626 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35adb68e-2fb0-437c-bea7-e46f05e4918c-logs\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034650 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-scripts\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034684 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-h9scf\" (UniqueName: \"kubernetes.io/projected/65984d06-3c40-407a-b217-0df5cfedcd66-kube-api-access-h9scf\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034711 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-internal-tls-certs\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034740 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034784 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034803 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-scripts\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034819 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-config-data\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " 
pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034833 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-config-data\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034851 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/35adb68e-2fb0-437c-bea7-e46f05e4918c-etc-machine-id\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034873 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65984d06-3c40-407a-b217-0df5cfedcd66-log-httpd\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034904 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034926 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-config-data-custom\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034965 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-public-tls-certs\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.034984 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.035840 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35adb68e-2fb0-437c-bea7-e46f05e4918c-logs\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.035915 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/35adb68e-2fb0-437c-bea7-e46f05e4918c-etc-machine-id\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.037344 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65984d06-3c40-407a-b217-0df5cfedcd66-log-httpd\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.037688 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65984d06-3c40-407a-b217-0df5cfedcd66-run-httpd\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.040402 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.040802 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.040997 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.042552 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-internal-tls-certs\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.044845 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-config-data\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.045492 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-scripts\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 
09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.047165 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-config-data\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.050498 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-public-tls-certs\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.051261 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.052585 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-scripts\") pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.055401 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26w4b\" (UniqueName: \"kubernetes.io/projected/35adb68e-2fb0-437c-bea7-e46f05e4918c-kube-api-access-26w4b\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.059809 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9scf\" (UniqueName: \"kubernetes.io/projected/65984d06-3c40-407a-b217-0df5cfedcd66-kube-api-access-h9scf\") 
pod \"ceilometer-0\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.064052 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35adb68e-2fb0-437c-bea7-e46f05e4918c-config-data-custom\") pod \"manila-api-0\" (UID: \"35adb68e-2fb0-437c-bea7-e46f05e4918c\") " pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.138371 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.157042 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.209043 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bc20709-2b88-4657-930d-2f893754bc1b" path="/var/lib/kubelet/pods/9bc20709-2b88-4657-930d-2f893754bc1b/volumes" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.210431 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6f19125-b59e-49f9-8819-0cad52114840" path="/var/lib/kubelet/pods/d6f19125-b59e-49f9-8819-0cad52114840/volumes" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.647515 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"3b330794-936e-4249-a59f-5c68279f210d","Type":"ContainerStarted","Data":"f7b7a17c36a6b78e815b947bec74afda3bb87533ab77861252c363b1152b6474"} Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.733875 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.435959098 podStartE2EDuration="10.733854933s" podCreationTimestamp="2026-03-18 09:53:38 +0000 UTC" firstStartedPulling="2026-03-18 09:53:39.67132803 +0000 UTC m=+3086.246072870" 
lastFinishedPulling="2026-03-18 09:53:46.969223875 +0000 UTC m=+3093.543968705" observedRunningTime="2026-03-18 09:53:48.671982232 +0000 UTC m=+3095.246727082" watchObservedRunningTime="2026-03-18 09:53:48.733854933 +0000 UTC m=+3095.308599773" Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.736619 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 18 09:53:48 crc kubenswrapper[4778]: W0318 09:53:48.738041 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35adb68e_2fb0_437c_bea7_e46f05e4918c.slice/crio-e0f26ed27b148995826401877fcd1d9442376ca16ac89a832cd20f9659b631e9 WatchSource:0}: Error finding container e0f26ed27b148995826401877fcd1d9442376ca16ac89a832cd20f9659b631e9: Status 404 returned error can't find the container with id e0f26ed27b148995826401877fcd1d9442376ca16ac89a832cd20f9659b631e9 Mar 18 09:53:48 crc kubenswrapper[4778]: W0318 09:53:48.761535 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65984d06_3c40_407a_b217_0df5cfedcd66.slice/crio-9a4f031251027580940fe7c126f24f661a0ee1e9d08f2164aa143d6e4499255a WatchSource:0}: Error finding container 9a4f031251027580940fe7c126f24f661a0ee1e9d08f2164aa143d6e4499255a: Status 404 returned error can't find the container with id 9a4f031251027580940fe7c126f24f661a0ee1e9d08f2164aa143d6e4499255a Mar 18 09:53:48 crc kubenswrapper[4778]: I0318 09:53:48.766718 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:53:49 crc kubenswrapper[4778]: I0318 09:53:49.070293 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Mar 18 09:53:49 crc kubenswrapper[4778]: I0318 09:53:49.081567 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Mar 18 09:53:49 crc 
kubenswrapper[4778]: I0318 09:53:49.230799 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69655fd4bf-pr7j8" Mar 18 09:53:49 crc kubenswrapper[4778]: I0318 09:53:49.361165 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-6qq8b"] Mar 18 09:53:49 crc kubenswrapper[4778]: I0318 09:53:49.361719 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" podUID="5778826c-0b71-4dad-af9c-c7ec7f04aa36" containerName="dnsmasq-dns" containerID="cri-o://fd53655c1355ee9e239be226351d5c5537c6593f18ffcbefe53a7ddaf1ea2816" gracePeriod=10 Mar 18 09:53:49 crc kubenswrapper[4778]: I0318 09:53:49.666670 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"35adb68e-2fb0-437c-bea7-e46f05e4918c","Type":"ContainerStarted","Data":"5da9ca9eaa10d047e8e888250aa3be62237c3adcd66ba4e1b6111a63774fe210"} Mar 18 09:53:49 crc kubenswrapper[4778]: I0318 09:53:49.667012 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"35adb68e-2fb0-437c-bea7-e46f05e4918c","Type":"ContainerStarted","Data":"e0f26ed27b148995826401877fcd1d9442376ca16ac89a832cd20f9659b631e9"} Mar 18 09:53:49 crc kubenswrapper[4778]: I0318 09:53:49.668813 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65984d06-3c40-407a-b217-0df5cfedcd66","Type":"ContainerStarted","Data":"82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102"} Mar 18 09:53:49 crc kubenswrapper[4778]: I0318 09:53:49.668859 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65984d06-3c40-407a-b217-0df5cfedcd66","Type":"ContainerStarted","Data":"9a4f031251027580940fe7c126f24f661a0ee1e9d08f2164aa143d6e4499255a"} Mar 18 09:53:49 crc kubenswrapper[4778]: I0318 09:53:49.671679 4778 generic.go:334] "Generic (PLEG): container finished" 
podID="5778826c-0b71-4dad-af9c-c7ec7f04aa36" containerID="fd53655c1355ee9e239be226351d5c5537c6593f18ffcbefe53a7ddaf1ea2816" exitCode=0 Mar 18 09:53:49 crc kubenswrapper[4778]: I0318 09:53:49.671752 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" event={"ID":"5778826c-0b71-4dad-af9c-c7ec7f04aa36","Type":"ContainerDied","Data":"fd53655c1355ee9e239be226351d5c5537c6593f18ffcbefe53a7ddaf1ea2816"} Mar 18 09:53:49 crc kubenswrapper[4778]: I0318 09:53:49.869881 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.017330 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-dns-svc\") pod \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.017503 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-ovsdbserver-sb\") pod \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.017556 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2gn6\" (UniqueName: \"kubernetes.io/projected/5778826c-0b71-4dad-af9c-c7ec7f04aa36-kube-api-access-q2gn6\") pod \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.017586 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-config\") pod \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\" (UID: 
\"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.017629 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-ovsdbserver-nb\") pod \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.017647 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-openstack-edpm-ipam\") pod \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\" (UID: \"5778826c-0b71-4dad-af9c-c7ec7f04aa36\") " Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.042457 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5778826c-0b71-4dad-af9c-c7ec7f04aa36-kube-api-access-q2gn6" (OuterVolumeSpecName: "kube-api-access-q2gn6") pod "5778826c-0b71-4dad-af9c-c7ec7f04aa36" (UID: "5778826c-0b71-4dad-af9c-c7ec7f04aa36"). InnerVolumeSpecName "kube-api-access-q2gn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.099623 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5778826c-0b71-4dad-af9c-c7ec7f04aa36" (UID: "5778826c-0b71-4dad-af9c-c7ec7f04aa36"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.102187 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "5778826c-0b71-4dad-af9c-c7ec7f04aa36" (UID: "5778826c-0b71-4dad-af9c-c7ec7f04aa36"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.116993 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-config" (OuterVolumeSpecName: "config") pod "5778826c-0b71-4dad-af9c-c7ec7f04aa36" (UID: "5778826c-0b71-4dad-af9c-c7ec7f04aa36"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.120964 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2gn6\" (UniqueName: \"kubernetes.io/projected/5778826c-0b71-4dad-af9c-c7ec7f04aa36-kube-api-access-q2gn6\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.120997 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.121044 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.121058 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-openstack-edpm-ipam\") on node \"crc\" 
DevicePath \"\"" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.134329 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5778826c-0b71-4dad-af9c-c7ec7f04aa36" (UID: "5778826c-0b71-4dad-af9c-c7ec7f04aa36"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.176973 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5778826c-0b71-4dad-af9c-c7ec7f04aa36" (UID: "5778826c-0b71-4dad-af9c-c7ec7f04aa36"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.223490 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.223645 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5778826c-0b71-4dad-af9c-c7ec7f04aa36-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.681857 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65984d06-3c40-407a-b217-0df5cfedcd66","Type":"ContainerStarted","Data":"959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75"} Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.684098 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" 
event={"ID":"5778826c-0b71-4dad-af9c-c7ec7f04aa36","Type":"ContainerDied","Data":"c164a0a62f55fc2358e88475eedc2afcbd5a8934bf23dc30fb39c81fc6c685f2"} Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.684135 4778 scope.go:117] "RemoveContainer" containerID="fd53655c1355ee9e239be226351d5c5537c6593f18ffcbefe53a7ddaf1ea2816" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.684261 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fbc59fbb7-6qq8b" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.688304 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"35adb68e-2fb0-437c-bea7-e46f05e4918c","Type":"ContainerStarted","Data":"e61e45abcd65218969ff9e52928a7efe4dc2f816fcc6afa936b3d4a046157f25"} Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.688632 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.710265 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-6qq8b"] Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.722154 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-6qq8b"] Mar 18 09:53:50 crc kubenswrapper[4778]: I0318 09:53:50.737553 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.737535334 podStartE2EDuration="3.737535334s" podCreationTimestamp="2026-03-18 09:53:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:53:50.724377657 +0000 UTC m=+3097.299122517" watchObservedRunningTime="2026-03-18 09:53:50.737535334 +0000 UTC m=+3097.312280174" Mar 18 09:53:51 crc kubenswrapper[4778]: I0318 09:53:51.175233 4778 scope.go:117] "RemoveContainer" 
containerID="959408f0646af1976d94cc538a0b42d120c9b802bebf63b681de404e7a6632a0" Mar 18 09:53:52 crc kubenswrapper[4778]: I0318 09:53:52.199644 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5778826c-0b71-4dad-af9c-c7ec7f04aa36" path="/var/lib/kubelet/pods/5778826c-0b71-4dad-af9c-c7ec7f04aa36/volumes" Mar 18 09:53:52 crc kubenswrapper[4778]: I0318 09:53:52.720991 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65984d06-3c40-407a-b217-0df5cfedcd66","Type":"ContainerStarted","Data":"cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f"} Mar 18 09:53:53 crc kubenswrapper[4778]: I0318 09:53:53.351444 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:53:54 crc kubenswrapper[4778]: I0318 09:53:54.199520 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:53:54 crc kubenswrapper[4778]: E0318 09:53:54.200509 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:53:54 crc kubenswrapper[4778]: I0318 09:53:54.742826 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65984d06-3c40-407a-b217-0df5cfedcd66","Type":"ContainerStarted","Data":"7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106"} Mar 18 09:53:54 crc kubenswrapper[4778]: I0318 09:53:54.743049 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" 
containerName="ceilometer-central-agent" containerID="cri-o://82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102" gracePeriod=30 Mar 18 09:53:54 crc kubenswrapper[4778]: I0318 09:53:54.743123 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 09:53:54 crc kubenswrapper[4778]: I0318 09:53:54.743174 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="sg-core" containerID="cri-o://cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f" gracePeriod=30 Mar 18 09:53:54 crc kubenswrapper[4778]: I0318 09:53:54.743223 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="ceilometer-notification-agent" containerID="cri-o://959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75" gracePeriod=30 Mar 18 09:53:54 crc kubenswrapper[4778]: I0318 09:53:54.743163 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="proxy-httpd" containerID="cri-o://7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106" gracePeriod=30 Mar 18 09:53:54 crc kubenswrapper[4778]: I0318 09:53:54.772743 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.9765642420000002 podStartE2EDuration="7.772724384s" podCreationTimestamp="2026-03-18 09:53:47 +0000 UTC" firstStartedPulling="2026-03-18 09:53:48.764266279 +0000 UTC m=+3095.339011119" lastFinishedPulling="2026-03-18 09:53:53.560426421 +0000 UTC m=+3100.135171261" observedRunningTime="2026-03-18 09:53:54.770589757 +0000 UTC m=+3101.345334597" watchObservedRunningTime="2026-03-18 09:53:54.772724384 +0000 UTC m=+3101.347469224" Mar 18 09:53:55 crc 
kubenswrapper[4778]: I0318 09:53:55.766624 4778 generic.go:334] "Generic (PLEG): container finished" podID="65984d06-3c40-407a-b217-0df5cfedcd66" containerID="7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106" exitCode=0 Mar 18 09:53:55 crc kubenswrapper[4778]: I0318 09:53:55.766995 4778 generic.go:334] "Generic (PLEG): container finished" podID="65984d06-3c40-407a-b217-0df5cfedcd66" containerID="cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f" exitCode=2 Mar 18 09:53:55 crc kubenswrapper[4778]: I0318 09:53:55.767014 4778 generic.go:334] "Generic (PLEG): container finished" podID="65984d06-3c40-407a-b217-0df5cfedcd66" containerID="959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75" exitCode=0 Mar 18 09:53:55 crc kubenswrapper[4778]: I0318 09:53:55.766717 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65984d06-3c40-407a-b217-0df5cfedcd66","Type":"ContainerDied","Data":"7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106"} Mar 18 09:53:55 crc kubenswrapper[4778]: I0318 09:53:55.767071 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65984d06-3c40-407a-b217-0df5cfedcd66","Type":"ContainerDied","Data":"cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f"} Mar 18 09:53:55 crc kubenswrapper[4778]: I0318 09:53:55.767096 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65984d06-3c40-407a-b217-0df5cfedcd66","Type":"ContainerDied","Data":"959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75"} Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.120450 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.155563 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-ceilometer-tls-certs\") pod \"65984d06-3c40-407a-b217-0df5cfedcd66\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.155800 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-config-data\") pod \"65984d06-3c40-407a-b217-0df5cfedcd66\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.155913 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-scripts\") pod \"65984d06-3c40-407a-b217-0df5cfedcd66\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.156008 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-sg-core-conf-yaml\") pod \"65984d06-3c40-407a-b217-0df5cfedcd66\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.156097 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65984d06-3c40-407a-b217-0df5cfedcd66-run-httpd\") pod \"65984d06-3c40-407a-b217-0df5cfedcd66\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.156212 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/65984d06-3c40-407a-b217-0df5cfedcd66-log-httpd\") pod \"65984d06-3c40-407a-b217-0df5cfedcd66\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.156442 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9scf\" (UniqueName: \"kubernetes.io/projected/65984d06-3c40-407a-b217-0df5cfedcd66-kube-api-access-h9scf\") pod \"65984d06-3c40-407a-b217-0df5cfedcd66\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.156557 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-combined-ca-bundle\") pod \"65984d06-3c40-407a-b217-0df5cfedcd66\" (UID: \"65984d06-3c40-407a-b217-0df5cfedcd66\") " Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.157337 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65984d06-3c40-407a-b217-0df5cfedcd66-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "65984d06-3c40-407a-b217-0df5cfedcd66" (UID: "65984d06-3c40-407a-b217-0df5cfedcd66"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.157352 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65984d06-3c40-407a-b217-0df5cfedcd66-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "65984d06-3c40-407a-b217-0df5cfedcd66" (UID: "65984d06-3c40-407a-b217-0df5cfedcd66"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.183220 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65984d06-3c40-407a-b217-0df5cfedcd66-kube-api-access-h9scf" (OuterVolumeSpecName: "kube-api-access-h9scf") pod "65984d06-3c40-407a-b217-0df5cfedcd66" (UID: "65984d06-3c40-407a-b217-0df5cfedcd66"). InnerVolumeSpecName "kube-api-access-h9scf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.197871 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-scripts" (OuterVolumeSpecName: "scripts") pod "65984d06-3c40-407a-b217-0df5cfedcd66" (UID: "65984d06-3c40-407a-b217-0df5cfedcd66"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.211584 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "65984d06-3c40-407a-b217-0df5cfedcd66" (UID: "65984d06-3c40-407a-b217-0df5cfedcd66"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.228624 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "65984d06-3c40-407a-b217-0df5cfedcd66" (UID: "65984d06-3c40-407a-b217-0df5cfedcd66"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.258864 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9scf\" (UniqueName: \"kubernetes.io/projected/65984d06-3c40-407a-b217-0df5cfedcd66-kube-api-access-h9scf\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.258896 4778 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.258905 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.258914 4778 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.258923 4778 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65984d06-3c40-407a-b217-0df5cfedcd66-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.258931 4778 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65984d06-3c40-407a-b217-0df5cfedcd66-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.265171 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65984d06-3c40-407a-b217-0df5cfedcd66" (UID: 
"65984d06-3c40-407a-b217-0df5cfedcd66"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.281414 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-config-data" (OuterVolumeSpecName: "config-data") pod "65984d06-3c40-407a-b217-0df5cfedcd66" (UID: "65984d06-3c40-407a-b217-0df5cfedcd66"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.360427 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.360711 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65984d06-3c40-407a-b217-0df5cfedcd66-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.776860 4778 generic.go:334] "Generic (PLEG): container finished" podID="65984d06-3c40-407a-b217-0df5cfedcd66" containerID="82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102" exitCode=0 Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.776902 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65984d06-3c40-407a-b217-0df5cfedcd66","Type":"ContainerDied","Data":"82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102"} Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.776936 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65984d06-3c40-407a-b217-0df5cfedcd66","Type":"ContainerDied","Data":"9a4f031251027580940fe7c126f24f661a0ee1e9d08f2164aa143d6e4499255a"} Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 
09:53:56.776954 4778 scope.go:117] "RemoveContainer" containerID="7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.778046 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.805633 4778 scope.go:117] "RemoveContainer" containerID="cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.832257 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.857034 4778 scope.go:117] "RemoveContainer" containerID="959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.869360 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.880509 4778 scope.go:117] "RemoveContainer" containerID="82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.883370 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:53:56 crc kubenswrapper[4778]: E0318 09:53:56.883766 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="ceilometer-central-agent" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.883782 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="ceilometer-central-agent" Mar 18 09:53:56 crc kubenswrapper[4778]: E0318 09:53:56.883798 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="sg-core" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.883805 4778 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="sg-core" Mar 18 09:53:56 crc kubenswrapper[4778]: E0318 09:53:56.883830 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5778826c-0b71-4dad-af9c-c7ec7f04aa36" containerName="init" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.883836 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5778826c-0b71-4dad-af9c-c7ec7f04aa36" containerName="init" Mar 18 09:53:56 crc kubenswrapper[4778]: E0318 09:53:56.883843 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="ceilometer-notification-agent" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.883849 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="ceilometer-notification-agent" Mar 18 09:53:56 crc kubenswrapper[4778]: E0318 09:53:56.883863 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5778826c-0b71-4dad-af9c-c7ec7f04aa36" containerName="dnsmasq-dns" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.883868 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5778826c-0b71-4dad-af9c-c7ec7f04aa36" containerName="dnsmasq-dns" Mar 18 09:53:56 crc kubenswrapper[4778]: E0318 09:53:56.883882 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="proxy-httpd" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.883888 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="proxy-httpd" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.884051 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="sg-core" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.884066 4778 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="ceilometer-notification-agent" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.884075 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="proxy-httpd" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.884087 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" containerName="ceilometer-central-agent" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.884099 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5778826c-0b71-4dad-af9c-c7ec7f04aa36" containerName="dnsmasq-dns" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.885785 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.888450 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.888986 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.896082 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.896677 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.919454 4778 scope.go:117] "RemoveContainer" containerID="7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106" Mar 18 09:53:56 crc kubenswrapper[4778]: E0318 09:53:56.921311 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106\": container with ID starting with 
7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106 not found: ID does not exist" containerID="7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.921420 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106"} err="failed to get container status \"7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106\": rpc error: code = NotFound desc = could not find container \"7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106\": container with ID starting with 7d40e1cefe5a4ec8aa6e895396af0193c4ee994336c5bfcbc3df4ec438797106 not found: ID does not exist" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.921499 4778 scope.go:117] "RemoveContainer" containerID="cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f" Mar 18 09:53:56 crc kubenswrapper[4778]: E0318 09:53:56.922034 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f\": container with ID starting with cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f not found: ID does not exist" containerID="cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.922150 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f"} err="failed to get container status \"cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f\": rpc error: code = NotFound desc = could not find container \"cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f\": container with ID starting with cd24ff8656ce13322c9d17ddbeafe296e7918a7159e22d491383c4f5f509975f not found: ID does not 
exist" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.922380 4778 scope.go:117] "RemoveContainer" containerID="959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75" Mar 18 09:53:56 crc kubenswrapper[4778]: E0318 09:53:56.922694 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75\": container with ID starting with 959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75 not found: ID does not exist" containerID="959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.922778 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75"} err="failed to get container status \"959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75\": rpc error: code = NotFound desc = could not find container \"959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75\": container with ID starting with 959a39d2ee5dd1befd5af0cad0fa1ede44e0ca4449667a1e908ed8f388703c75 not found: ID does not exist" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.922848 4778 scope.go:117] "RemoveContainer" containerID="82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102" Mar 18 09:53:56 crc kubenswrapper[4778]: E0318 09:53:56.923118 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102\": container with ID starting with 82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102 not found: ID does not exist" containerID="82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.923229 4778 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102"} err="failed to get container status \"82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102\": rpc error: code = NotFound desc = could not find container \"82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102\": container with ID starting with 82d3b9ff29de5400c24c29c03b205075b31053a0bbeed1d4930c08ca2052a102 not found: ID does not exist" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.969081 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.969147 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52bc493f-72e9-4387-9b91-13343fc7d550-log-httpd\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.969176 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.969309 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52bc493f-72e9-4387-9b91-13343fc7d550-run-httpd\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 
09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.969342 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-scripts\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.969397 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.969430 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-config-data\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:56 crc kubenswrapper[4778]: I0318 09:53:56.969461 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwdff\" (UniqueName: \"kubernetes.io/projected/52bc493f-72e9-4387-9b91-13343fc7d550-kube-api-access-lwdff\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.071360 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.071422 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-config-data\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.071458 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwdff\" (UniqueName: \"kubernetes.io/projected/52bc493f-72e9-4387-9b91-13343fc7d550-kube-api-access-lwdff\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.071512 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.071551 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52bc493f-72e9-4387-9b91-13343fc7d550-log-httpd\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.071578 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.071661 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52bc493f-72e9-4387-9b91-13343fc7d550-run-httpd\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" 
Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.071693 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-scripts\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.072399 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52bc493f-72e9-4387-9b91-13343fc7d550-log-httpd\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.072426 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52bc493f-72e9-4387-9b91-13343fc7d550-run-httpd\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.077419 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.079692 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.080185 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-scripts\") pod \"ceilometer-0\" (UID: 
\"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.080328 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.088467 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52bc493f-72e9-4387-9b91-13343fc7d550-config-data\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.092020 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwdff\" (UniqueName: \"kubernetes.io/projected/52bc493f-72e9-4387-9b91-13343fc7d550-kube-api-access-lwdff\") pod \"ceilometer-0\" (UID: \"52bc493f-72e9-4387-9b91-13343fc7d550\") " pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.206642 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.700643 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 09:53:57 crc kubenswrapper[4778]: W0318 09:53:57.717277 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52bc493f_72e9_4387_9b91_13343fc7d550.slice/crio-4377686f9d70c59c5780a5e8b27cbaa293811031de0dc9240ed95d1fd84a4809 WatchSource:0}: Error finding container 4377686f9d70c59c5780a5e8b27cbaa293811031de0dc9240ed95d1fd84a4809: Status 404 returned error can't find the container with id 4377686f9d70c59c5780a5e8b27cbaa293811031de0dc9240ed95d1fd84a4809 Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.720118 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 09:53:57 crc kubenswrapper[4778]: I0318 09:53:57.787704 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52bc493f-72e9-4387-9b91-13343fc7d550","Type":"ContainerStarted","Data":"4377686f9d70c59c5780a5e8b27cbaa293811031de0dc9240ed95d1fd84a4809"} Mar 18 09:53:58 crc kubenswrapper[4778]: I0318 09:53:58.199920 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65984d06-3c40-407a-b217-0df5cfedcd66" path="/var/lib/kubelet/pods/65984d06-3c40-407a-b217-0df5cfedcd66/volumes" Mar 18 09:53:58 crc kubenswrapper[4778]: I0318 09:53:58.825503 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52bc493f-72e9-4387-9b91-13343fc7d550","Type":"ContainerStarted","Data":"9231b7e8988076a1286ece2bdfd759ab5abea7b27b2046e1f79083b6d9028063"} Mar 18 09:53:59 crc kubenswrapper[4778]: I0318 09:53:59.845423 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"52bc493f-72e9-4387-9b91-13343fc7d550","Type":"ContainerStarted","Data":"4b72b1f86dd6b64423280129d67be8592ca5209a7d043c97a2057d96e1a00d59"} Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.149396 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563794-586xm"] Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.154828 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563794-586xm" Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.158534 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.160461 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.162624 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563794-586xm"] Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.164681 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.246316 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjmdw\" (UniqueName: \"kubernetes.io/projected/c681ab9e-5bfe-4e10-9154-41ff0c5d76a3-kube-api-access-cjmdw\") pod \"auto-csr-approver-29563794-586xm\" (UID: \"c681ab9e-5bfe-4e10-9154-41ff0c5d76a3\") " pod="openshift-infra/auto-csr-approver-29563794-586xm" Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.347918 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjmdw\" (UniqueName: \"kubernetes.io/projected/c681ab9e-5bfe-4e10-9154-41ff0c5d76a3-kube-api-access-cjmdw\") pod \"auto-csr-approver-29563794-586xm\" (UID: 
\"c681ab9e-5bfe-4e10-9154-41ff0c5d76a3\") " pod="openshift-infra/auto-csr-approver-29563794-586xm" Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.368207 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjmdw\" (UniqueName: \"kubernetes.io/projected/c681ab9e-5bfe-4e10-9154-41ff0c5d76a3-kube-api-access-cjmdw\") pod \"auto-csr-approver-29563794-586xm\" (UID: \"c681ab9e-5bfe-4e10-9154-41ff0c5d76a3\") " pod="openshift-infra/auto-csr-approver-29563794-586xm" Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.473698 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563794-586xm" Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.681180 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.696381 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.750297 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.792001 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.858366 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="3b330794-936e-4249-a59f-5c68279f210d" containerName="manila-share" containerID="cri-o://563583bf7624c87ff5f7909a2664699f2f91b9c14e7dca73ca0082935cc34469" gracePeriod=30 Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.858935 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="3b330794-936e-4249-a59f-5c68279f210d" containerName="probe" 
containerID="cri-o://f7b7a17c36a6b78e815b947bec74afda3bb87533ab77861252c363b1152b6474" gracePeriod=30 Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.859032 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52bc493f-72e9-4387-9b91-13343fc7d550","Type":"ContainerStarted","Data":"901bb6ee88851abc816df355546ae1f36ba920b0ededf032f2250d52cb67dc93"} Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.859282 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="e96ea8ef-858b-421c-9b80-c8e76e2bc368" containerName="manila-scheduler" containerID="cri-o://f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f" gracePeriod=30 Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.859434 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="e96ea8ef-858b-421c-9b80-c8e76e2bc368" containerName="probe" containerID="cri-o://45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa" gracePeriod=30 Mar 18 09:54:00 crc kubenswrapper[4778]: I0318 09:54:00.961850 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563794-586xm"] Mar 18 09:54:01 crc kubenswrapper[4778]: I0318 09:54:01.881173 4778 generic.go:334] "Generic (PLEG): container finished" podID="3b330794-936e-4249-a59f-5c68279f210d" containerID="f7b7a17c36a6b78e815b947bec74afda3bb87533ab77861252c363b1152b6474" exitCode=0 Mar 18 09:54:01 crc kubenswrapper[4778]: I0318 09:54:01.882033 4778 generic.go:334] "Generic (PLEG): container finished" podID="3b330794-936e-4249-a59f-5c68279f210d" containerID="563583bf7624c87ff5f7909a2664699f2f91b9c14e7dca73ca0082935cc34469" exitCode=1 Mar 18 09:54:01 crc kubenswrapper[4778]: I0318 09:54:01.881233 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" 
event={"ID":"3b330794-936e-4249-a59f-5c68279f210d","Type":"ContainerDied","Data":"f7b7a17c36a6b78e815b947bec74afda3bb87533ab77861252c363b1152b6474"} Mar 18 09:54:01 crc kubenswrapper[4778]: I0318 09:54:01.882094 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"3b330794-936e-4249-a59f-5c68279f210d","Type":"ContainerDied","Data":"563583bf7624c87ff5f7909a2664699f2f91b9c14e7dca73ca0082935cc34469"} Mar 18 09:54:01 crc kubenswrapper[4778]: I0318 09:54:01.883647 4778 generic.go:334] "Generic (PLEG): container finished" podID="e96ea8ef-858b-421c-9b80-c8e76e2bc368" containerID="45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa" exitCode=0 Mar 18 09:54:01 crc kubenswrapper[4778]: I0318 09:54:01.883689 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e96ea8ef-858b-421c-9b80-c8e76e2bc368","Type":"ContainerDied","Data":"45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa"} Mar 18 09:54:01 crc kubenswrapper[4778]: I0318 09:54:01.884353 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563794-586xm" event={"ID":"c681ab9e-5bfe-4e10-9154-41ff0c5d76a3","Type":"ContainerStarted","Data":"2ff6ff811698c9908d7bd73b0e4fa290162f5b8af6b3b1b1a64cc943bb4b1185"} Mar 18 09:54:01 crc kubenswrapper[4778]: I0318 09:54:01.989366 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.084894 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-combined-ca-bundle\") pod \"3b330794-936e-4249-a59f-5c68279f210d\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.084975 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-scripts\") pod \"3b330794-936e-4249-a59f-5c68279f210d\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.085086 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-config-data\") pod \"3b330794-936e-4249-a59f-5c68279f210d\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.085788 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/3b330794-936e-4249-a59f-5c68279f210d-var-lib-manila\") pod \"3b330794-936e-4249-a59f-5c68279f210d\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.085931 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b330794-936e-4249-a59f-5c68279f210d-etc-machine-id\") pod \"3b330794-936e-4249-a59f-5c68279f210d\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.085956 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frmhc\" (UniqueName: 
\"kubernetes.io/projected/3b330794-936e-4249-a59f-5c68279f210d-kube-api-access-frmhc\") pod \"3b330794-936e-4249-a59f-5c68279f210d\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.086042 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3b330794-936e-4249-a59f-5c68279f210d-ceph\") pod \"3b330794-936e-4249-a59f-5c68279f210d\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.086060 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-config-data-custom\") pod \"3b330794-936e-4249-a59f-5c68279f210d\" (UID: \"3b330794-936e-4249-a59f-5c68279f210d\") " Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.086337 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b330794-936e-4249-a59f-5c68279f210d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3b330794-936e-4249-a59f-5c68279f210d" (UID: "3b330794-936e-4249-a59f-5c68279f210d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.086601 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b330794-936e-4249-a59f-5c68279f210d-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "3b330794-936e-4249-a59f-5c68279f210d" (UID: "3b330794-936e-4249-a59f-5c68279f210d"). InnerVolumeSpecName "var-lib-manila". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.086807 4778 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/3b330794-936e-4249-a59f-5c68279f210d-var-lib-manila\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.086825 4778 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b330794-936e-4249-a59f-5c68279f210d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.093614 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-scripts" (OuterVolumeSpecName: "scripts") pod "3b330794-936e-4249-a59f-5c68279f210d" (UID: "3b330794-936e-4249-a59f-5c68279f210d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.093679 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b330794-936e-4249-a59f-5c68279f210d-ceph" (OuterVolumeSpecName: "ceph") pod "3b330794-936e-4249-a59f-5c68279f210d" (UID: "3b330794-936e-4249-a59f-5c68279f210d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.096178 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b330794-936e-4249-a59f-5c68279f210d-kube-api-access-frmhc" (OuterVolumeSpecName: "kube-api-access-frmhc") pod "3b330794-936e-4249-a59f-5c68279f210d" (UID: "3b330794-936e-4249-a59f-5c68279f210d"). InnerVolumeSpecName "kube-api-access-frmhc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.107097 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3b330794-936e-4249-a59f-5c68279f210d" (UID: "3b330794-936e-4249-a59f-5c68279f210d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.144912 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b330794-936e-4249-a59f-5c68279f210d" (UID: "3b330794-936e-4249-a59f-5c68279f210d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.188599 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frmhc\" (UniqueName: \"kubernetes.io/projected/3b330794-936e-4249-a59f-5c68279f210d-kube-api-access-frmhc\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.188633 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3b330794-936e-4249-a59f-5c68279f210d-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.188646 4778 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.188654 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.188665 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.197446 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-config-data" (OuterVolumeSpecName: "config-data") pod "3b330794-936e-4249-a59f-5c68279f210d" (UID: "3b330794-936e-4249-a59f-5c68279f210d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.291013 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b330794-936e-4249-a59f-5c68279f210d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.895710 4778 generic.go:334] "Generic (PLEG): container finished" podID="c681ab9e-5bfe-4e10-9154-41ff0c5d76a3" containerID="1887b38177a6dc3b69f09e9dc6a6dd26a61cf63c5f532cbb5b0e04e1fb5a3d8b" exitCode=0 Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.895788 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563794-586xm" event={"ID":"c681ab9e-5bfe-4e10-9154-41ff0c5d76a3","Type":"ContainerDied","Data":"1887b38177a6dc3b69f09e9dc6a6dd26a61cf63c5f532cbb5b0e04e1fb5a3d8b"} Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.900666 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"3b330794-936e-4249-a59f-5c68279f210d","Type":"ContainerDied","Data":"0707583ce3524176a76a27b3ba5678da894972f75c990dbcdccd8d5c152f9e65"} Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.900721 4778 scope.go:117] "RemoveContainer" 
containerID="f7b7a17c36a6b78e815b947bec74afda3bb87533ab77861252c363b1152b6474" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.900882 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.904437 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52bc493f-72e9-4387-9b91-13343fc7d550","Type":"ContainerStarted","Data":"23ac028505c473e713082990956c9259c6dd33bd7cddc2fa1c7586ba732d4f84"} Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.904625 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.930944 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.853640727 podStartE2EDuration="6.93092168s" podCreationTimestamp="2026-03-18 09:53:56 +0000 UTC" firstStartedPulling="2026-03-18 09:53:57.719939519 +0000 UTC m=+3104.294684349" lastFinishedPulling="2026-03-18 09:54:01.797220462 +0000 UTC m=+3108.371965302" observedRunningTime="2026-03-18 09:54:02.930800887 +0000 UTC m=+3109.505545747" watchObservedRunningTime="2026-03-18 09:54:02.93092168 +0000 UTC m=+3109.505666530" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.951765 4778 scope.go:117] "RemoveContainer" containerID="563583bf7624c87ff5f7909a2664699f2f91b9c14e7dca73ca0082935cc34469" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.960167 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.972654 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.998330 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Mar 18 09:54:02 crc 
kubenswrapper[4778]: E0318 09:54:02.998834 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b330794-936e-4249-a59f-5c68279f210d" containerName="probe" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.998854 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b330794-936e-4249-a59f-5c68279f210d" containerName="probe" Mar 18 09:54:02 crc kubenswrapper[4778]: E0318 09:54:02.998892 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b330794-936e-4249-a59f-5c68279f210d" containerName="manila-share" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.998913 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b330794-936e-4249-a59f-5c68279f210d" containerName="manila-share" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.999117 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b330794-936e-4249-a59f-5c68279f210d" containerName="manila-share" Mar 18 09:54:02 crc kubenswrapper[4778]: I0318 09:54:02.999140 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b330794-936e-4249-a59f-5c68279f210d" containerName="probe" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.001936 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.010888 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.013523 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.129716 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821dda0e-cde2-45a4-b23a-3d13565be515-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.130181 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/821dda0e-cde2-45a4-b23a-3d13565be515-config-data\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.130276 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/821dda0e-cde2-45a4-b23a-3d13565be515-scripts\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.130392 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/821dda0e-cde2-45a4-b23a-3d13565be515-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: 
I0318 09:54:03.130429 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/821dda0e-cde2-45a4-b23a-3d13565be515-ceph\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.130531 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/821dda0e-cde2-45a4-b23a-3d13565be515-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.130614 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/821dda0e-cde2-45a4-b23a-3d13565be515-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.130650 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fddh\" (UniqueName: \"kubernetes.io/projected/821dda0e-cde2-45a4-b23a-3d13565be515-kube-api-access-8fddh\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.232516 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/821dda0e-cde2-45a4-b23a-3d13565be515-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.232577 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/821dda0e-cde2-45a4-b23a-3d13565be515-ceph\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.232619 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/821dda0e-cde2-45a4-b23a-3d13565be515-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.232669 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/821dda0e-cde2-45a4-b23a-3d13565be515-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.232699 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fddh\" (UniqueName: \"kubernetes.io/projected/821dda0e-cde2-45a4-b23a-3d13565be515-kube-api-access-8fddh\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.232780 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821dda0e-cde2-45a4-b23a-3d13565be515-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.232837 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: 
\"kubernetes.io/host-path/821dda0e-cde2-45a4-b23a-3d13565be515-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.232855 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/821dda0e-cde2-45a4-b23a-3d13565be515-config-data\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.232888 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/821dda0e-cde2-45a4-b23a-3d13565be515-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.232928 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/821dda0e-cde2-45a4-b23a-3d13565be515-scripts\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.238507 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821dda0e-cde2-45a4-b23a-3d13565be515-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.239321 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/821dda0e-cde2-45a4-b23a-3d13565be515-ceph\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " 
pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.250880 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/821dda0e-cde2-45a4-b23a-3d13565be515-scripts\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.251000 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/821dda0e-cde2-45a4-b23a-3d13565be515-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.251612 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/821dda0e-cde2-45a4-b23a-3d13565be515-config-data\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.253690 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fddh\" (UniqueName: \"kubernetes.io/projected/821dda0e-cde2-45a4-b23a-3d13565be515-kube-api-access-8fddh\") pod \"manila-share-share1-0\" (UID: \"821dda0e-cde2-45a4-b23a-3d13565be515\") " pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.347615 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.903855 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.928341 4778 generic.go:334] "Generic (PLEG): container finished" podID="e96ea8ef-858b-421c-9b80-c8e76e2bc368" containerID="f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f" exitCode=0 Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.928787 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e96ea8ef-858b-421c-9b80-c8e76e2bc368","Type":"ContainerDied","Data":"f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f"} Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.928806 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.928824 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e96ea8ef-858b-421c-9b80-c8e76e2bc368","Type":"ContainerDied","Data":"0b6591d1b2d8334e5cf9a42b169e44b6dfae4bb6ceaf223f510fc2ac1f46c6e1"} Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.928846 4778 scope.go:117] "RemoveContainer" containerID="45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.953702 4778 scope.go:117] "RemoveContainer" containerID="f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f" Mar 18 09:54:03 crc kubenswrapper[4778]: I0318 09:54:03.974049 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.007119 4778 scope.go:117] "RemoveContainer" containerID="45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa" Mar 18 09:54:04 crc kubenswrapper[4778]: E0318 09:54:04.008378 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa\": container with ID starting with 45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa not found: ID does not exist" containerID="45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.008427 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa"} err="failed to get container status \"45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa\": rpc error: code = NotFound desc = could not find container \"45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa\": container with ID starting with 45eb4ed629f9b2343d2c289d2eb2dceba9a29eba8910e349ae8d3229c19fbdaa not found: ID does not exist" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.008458 4778 scope.go:117] "RemoveContainer" containerID="f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f" Mar 18 09:54:04 crc kubenswrapper[4778]: E0318 09:54:04.009316 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f\": container with ID starting with f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f not found: ID does not exist" containerID="f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.009351 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f"} err="failed to get container status \"f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f\": rpc error: code = NotFound desc = could not find container \"f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f\": container with ID 
starting with f4a0b64a14df531a1d4d379d6e3426d81a613c9badd062eabe48fbd69c1b6b8f not found: ID does not exist" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.045572 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-config-data\") pod \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.045710 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-combined-ca-bundle\") pod \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.045782 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-config-data-custom\") pod \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.045986 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-scripts\") pod \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.046062 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e96ea8ef-858b-421c-9b80-c8e76e2bc368-etc-machine-id\") pod \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.046132 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-qm9bt\" (UniqueName: \"kubernetes.io/projected/e96ea8ef-858b-421c-9b80-c8e76e2bc368-kube-api-access-qm9bt\") pod \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\" (UID: \"e96ea8ef-858b-421c-9b80-c8e76e2bc368\") " Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.046463 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e96ea8ef-858b-421c-9b80-c8e76e2bc368-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e96ea8ef-858b-421c-9b80-c8e76e2bc368" (UID: "e96ea8ef-858b-421c-9b80-c8e76e2bc368"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.047470 4778 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e96ea8ef-858b-421c-9b80-c8e76e2bc368-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.055721 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-scripts" (OuterVolumeSpecName: "scripts") pod "e96ea8ef-858b-421c-9b80-c8e76e2bc368" (UID: "e96ea8ef-858b-421c-9b80-c8e76e2bc368"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.055896 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e96ea8ef-858b-421c-9b80-c8e76e2bc368-kube-api-access-qm9bt" (OuterVolumeSpecName: "kube-api-access-qm9bt") pod "e96ea8ef-858b-421c-9b80-c8e76e2bc368" (UID: "e96ea8ef-858b-421c-9b80-c8e76e2bc368"). InnerVolumeSpecName "kube-api-access-qm9bt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.057438 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e96ea8ef-858b-421c-9b80-c8e76e2bc368" (UID: "e96ea8ef-858b-421c-9b80-c8e76e2bc368"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.109169 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e96ea8ef-858b-421c-9b80-c8e76e2bc368" (UID: "e96ea8ef-858b-421c-9b80-c8e76e2bc368"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.149851 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.149890 4778 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.149900 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.149909 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm9bt\" (UniqueName: \"kubernetes.io/projected/e96ea8ef-858b-421c-9b80-c8e76e2bc368-kube-api-access-qm9bt\") 
on node \"crc\" DevicePath \"\"" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.163856 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-config-data" (OuterVolumeSpecName: "config-data") pod "e96ea8ef-858b-421c-9b80-c8e76e2bc368" (UID: "e96ea8ef-858b-421c-9b80-c8e76e2bc368"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.179705 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563794-586xm" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.203715 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b330794-936e-4249-a59f-5c68279f210d" path="/var/lib/kubelet/pods/3b330794-936e-4249-a59f-5c68279f210d/volumes" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.255969 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjmdw\" (UniqueName: \"kubernetes.io/projected/c681ab9e-5bfe-4e10-9154-41ff0c5d76a3-kube-api-access-cjmdw\") pod \"c681ab9e-5bfe-4e10-9154-41ff0c5d76a3\" (UID: \"c681ab9e-5bfe-4e10-9154-41ff0c5d76a3\") " Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.256479 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e96ea8ef-858b-421c-9b80-c8e76e2bc368-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.267043 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c681ab9e-5bfe-4e10-9154-41ff0c5d76a3-kube-api-access-cjmdw" (OuterVolumeSpecName: "kube-api-access-cjmdw") pod "c681ab9e-5bfe-4e10-9154-41ff0c5d76a3" (UID: "c681ab9e-5bfe-4e10-9154-41ff0c5d76a3"). InnerVolumeSpecName "kube-api-access-cjmdw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.289835 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.306254 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.323483 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Mar 18 09:54:04 crc kubenswrapper[4778]: E0318 09:54:04.323972 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c681ab9e-5bfe-4e10-9154-41ff0c5d76a3" containerName="oc" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.323990 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c681ab9e-5bfe-4e10-9154-41ff0c5d76a3" containerName="oc" Mar 18 09:54:04 crc kubenswrapper[4778]: E0318 09:54:04.324018 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e96ea8ef-858b-421c-9b80-c8e76e2bc368" containerName="manila-scheduler" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.324025 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e96ea8ef-858b-421c-9b80-c8e76e2bc368" containerName="manila-scheduler" Mar 18 09:54:04 crc kubenswrapper[4778]: E0318 09:54:04.324044 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e96ea8ef-858b-421c-9b80-c8e76e2bc368" containerName="probe" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.324050 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e96ea8ef-858b-421c-9b80-c8e76e2bc368" containerName="probe" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.324236 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e96ea8ef-858b-421c-9b80-c8e76e2bc368" containerName="probe" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.324249 4778 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c681ab9e-5bfe-4e10-9154-41ff0c5d76a3" containerName="oc" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.324268 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e96ea8ef-858b-421c-9b80-c8e76e2bc368" containerName="manila-scheduler" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.325284 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.327029 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.333738 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.358366 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjmdw\" (UniqueName: \"kubernetes.io/projected/c681ab9e-5bfe-4e10-9154-41ff0c5d76a3-kube-api-access-cjmdw\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.459773 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rfzk\" (UniqueName: \"kubernetes.io/projected/e9af702d-3a1a-490e-82f5-e99c1718ef83-kube-api-access-9rfzk\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.459814 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9af702d-3a1a-490e-82f5-e99c1718ef83-config-data\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.459852 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9af702d-3a1a-490e-82f5-e99c1718ef83-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.460025 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9af702d-3a1a-490e-82f5-e99c1718ef83-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.460075 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9af702d-3a1a-490e-82f5-e99c1718ef83-scripts\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.460171 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e9af702d-3a1a-490e-82f5-e99c1718ef83-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.561739 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9af702d-3a1a-490e-82f5-e99c1718ef83-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.561807 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9af702d-3a1a-490e-82f5-e99c1718ef83-scripts\") pod 
\"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.561880 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e9af702d-3a1a-490e-82f5-e99c1718ef83-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.562174 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e9af702d-3a1a-490e-82f5-e99c1718ef83-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.563395 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rfzk\" (UniqueName: \"kubernetes.io/projected/e9af702d-3a1a-490e-82f5-e99c1718ef83-kube-api-access-9rfzk\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.563453 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9af702d-3a1a-490e-82f5-e99c1718ef83-config-data\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.563527 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9af702d-3a1a-490e-82f5-e99c1718ef83-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: 
I0318 09:54:04.566937 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9af702d-3a1a-490e-82f5-e99c1718ef83-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.567659 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9af702d-3a1a-490e-82f5-e99c1718ef83-scripts\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.567997 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9af702d-3a1a-490e-82f5-e99c1718ef83-config-data\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.570020 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9af702d-3a1a-490e-82f5-e99c1718ef83-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.578461 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rfzk\" (UniqueName: \"kubernetes.io/projected/e9af702d-3a1a-490e-82f5-e99c1718ef83-kube-api-access-9rfzk\") pod \"manila-scheduler-0\" (UID: \"e9af702d-3a1a-490e-82f5-e99c1718ef83\") " pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.648653 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.945754 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"821dda0e-cde2-45a4-b23a-3d13565be515","Type":"ContainerStarted","Data":"f8b2c3f2e882c8abefe1e9ec213d9e0ef5d62840b580d9e9b2da69d98ca7862e"} Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.946135 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"821dda0e-cde2-45a4-b23a-3d13565be515","Type":"ContainerStarted","Data":"13a4a6443261acc0962867a8d25652877daad9499fc8b57c1197b030736506ad"} Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.960847 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563794-586xm" event={"ID":"c681ab9e-5bfe-4e10-9154-41ff0c5d76a3","Type":"ContainerDied","Data":"2ff6ff811698c9908d7bd73b0e4fa290162f5b8af6b3b1b1a64cc943bb4b1185"} Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.960896 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ff6ff811698c9908d7bd73b0e4fa290162f5b8af6b3b1b1a64cc943bb4b1185" Mar 18 09:54:04 crc kubenswrapper[4778]: I0318 09:54:04.960899 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563794-586xm" Mar 18 09:54:05 crc kubenswrapper[4778]: I0318 09:54:05.085954 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 18 09:54:05 crc kubenswrapper[4778]: I0318 09:54:05.262664 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563788-pctk8"] Mar 18 09:54:05 crc kubenswrapper[4778]: I0318 09:54:05.277140 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563788-pctk8"] Mar 18 09:54:05 crc kubenswrapper[4778]: I0318 09:54:05.974370 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"821dda0e-cde2-45a4-b23a-3d13565be515","Type":"ContainerStarted","Data":"005537f2c20835e85e19ce1f6f5d8263c3e3bedced6cc69c20f843c7e57240f4"} Mar 18 09:54:05 crc kubenswrapper[4778]: I0318 09:54:05.977309 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e9af702d-3a1a-490e-82f5-e99c1718ef83","Type":"ContainerStarted","Data":"724343d6dafc84c84a20658930d32f9fff855b3c1742e0f3e69c272f180b5e91"} Mar 18 09:54:05 crc kubenswrapper[4778]: I0318 09:54:05.977365 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e9af702d-3a1a-490e-82f5-e99c1718ef83","Type":"ContainerStarted","Data":"4fb077966a80a7907254d618f7b92ea4687f9b732c468fd32aab1668fb8f0977"} Mar 18 09:54:06 crc kubenswrapper[4778]: I0318 09:54:06.006930 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.006909873 podStartE2EDuration="4.006909873s" podCreationTimestamp="2026-03-18 09:54:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:54:05.995627726 +0000 UTC m=+3112.570372576" 
watchObservedRunningTime="2026-03-18 09:54:06.006909873 +0000 UTC m=+3112.581654713" Mar 18 09:54:06 crc kubenswrapper[4778]: I0318 09:54:06.202507 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc64d6e3-ed19-4365-ab83-8c1af026054b" path="/var/lib/kubelet/pods/dc64d6e3-ed19-4365-ab83-8c1af026054b/volumes" Mar 18 09:54:06 crc kubenswrapper[4778]: I0318 09:54:06.203497 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e96ea8ef-858b-421c-9b80-c8e76e2bc368" path="/var/lib/kubelet/pods/e96ea8ef-858b-421c-9b80-c8e76e2bc368/volumes" Mar 18 09:54:06 crc kubenswrapper[4778]: I0318 09:54:06.988273 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e9af702d-3a1a-490e-82f5-e99c1718ef83","Type":"ContainerStarted","Data":"1a768cf698887e93db294d8cec66875bde6dcd3ae27a64b772641d4061d0d170"} Mar 18 09:54:07 crc kubenswrapper[4778]: I0318 09:54:07.023510 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.023493439 podStartE2EDuration="3.023493439s" podCreationTimestamp="2026-03-18 09:54:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:54:07.012134491 +0000 UTC m=+3113.586879341" watchObservedRunningTime="2026-03-18 09:54:07.023493439 +0000 UTC m=+3113.598238279" Mar 18 09:54:07 crc kubenswrapper[4778]: I0318 09:54:07.187407 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:54:07 crc kubenswrapper[4778]: E0318 09:54:07.187651 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:54:09 crc kubenswrapper[4778]: I0318 09:54:09.453090 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Mar 18 09:54:13 crc kubenswrapper[4778]: I0318 09:54:13.348150 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Mar 18 09:54:14 crc kubenswrapper[4778]: I0318 09:54:14.649314 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Mar 18 09:54:15 crc kubenswrapper[4778]: I0318 09:54:15.910524 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d6f19125-b59e-49f9-8819-0cad52114840" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.202:3000/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 09:54:18 crc kubenswrapper[4778]: I0318 09:54:18.187234 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:54:18 crc kubenswrapper[4778]: E0318 09:54:18.187803 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:54:25 crc kubenswrapper[4778]: I0318 09:54:25.042273 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Mar 18 09:54:26 crc 
kubenswrapper[4778]: I0318 09:54:26.131934 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Mar 18 09:54:27 crc kubenswrapper[4778]: I0318 09:54:27.221031 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 09:54:32 crc kubenswrapper[4778]: I0318 09:54:32.187628 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:54:32 crc kubenswrapper[4778]: E0318 09:54:32.188511 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:54:33 crc kubenswrapper[4778]: I0318 09:54:33.006659 4778 scope.go:117] "RemoveContainer" containerID="12a11cfc29f7b65306c1684f9c90110c5f5f19bee2195c78cf1dbf6c7f4120dd" Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.252278 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wkp9p"] Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.254416 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.276613 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wkp9p"] Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.315956 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-catalog-content\") pod \"certified-operators-wkp9p\" (UID: \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\") " pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.316029 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-utilities\") pod \"certified-operators-wkp9p\" (UID: \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\") " pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.316126 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbfl6\" (UniqueName: \"kubernetes.io/projected/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-kube-api-access-bbfl6\") pod \"certified-operators-wkp9p\" (UID: \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\") " pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.418400 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-catalog-content\") pod \"certified-operators-wkp9p\" (UID: \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\") " pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.418480 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-utilities\") pod \"certified-operators-wkp9p\" (UID: \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\") " pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.418586 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbfl6\" (UniqueName: \"kubernetes.io/projected/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-kube-api-access-bbfl6\") pod \"certified-operators-wkp9p\" (UID: \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\") " pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.419235 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-catalog-content\") pod \"certified-operators-wkp9p\" (UID: \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\") " pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.419451 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-utilities\") pod \"certified-operators-wkp9p\" (UID: \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\") " pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.442068 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbfl6\" (UniqueName: \"kubernetes.io/projected/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-kube-api-access-bbfl6\") pod \"certified-operators-wkp9p\" (UID: \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\") " pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:34 crc kubenswrapper[4778]: I0318 09:54:34.620073 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:35 crc kubenswrapper[4778]: I0318 09:54:35.103855 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wkp9p"] Mar 18 09:54:35 crc kubenswrapper[4778]: I0318 09:54:35.325995 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkp9p" event={"ID":"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e","Type":"ContainerStarted","Data":"955fb0461de04ec66a7e60e5a7e268421dad9143ca3957a537dd0edcadbc1fc7"} Mar 18 09:54:36 crc kubenswrapper[4778]: I0318 09:54:36.339313 4778 generic.go:334] "Generic (PLEG): container finished" podID="21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" containerID="18cf938f15fea54a9debe3bfea5cb4096778b596da2a5d635043b6db04c64d90" exitCode=0 Mar 18 09:54:36 crc kubenswrapper[4778]: I0318 09:54:36.339447 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkp9p" event={"ID":"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e","Type":"ContainerDied","Data":"18cf938f15fea54a9debe3bfea5cb4096778b596da2a5d635043b6db04c64d90"} Mar 18 09:54:37 crc kubenswrapper[4778]: I0318 09:54:37.351191 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkp9p" event={"ID":"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e","Type":"ContainerStarted","Data":"5b6779a995a8a68505266a65a0a40cc696c0880cb1f3e7d587719ca09c2642e8"} Mar 18 09:54:38 crc kubenswrapper[4778]: I0318 09:54:38.364087 4778 generic.go:334] "Generic (PLEG): container finished" podID="21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" containerID="5b6779a995a8a68505266a65a0a40cc696c0880cb1f3e7d587719ca09c2642e8" exitCode=0 Mar 18 09:54:38 crc kubenswrapper[4778]: I0318 09:54:38.364459 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkp9p" 
event={"ID":"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e","Type":"ContainerDied","Data":"5b6779a995a8a68505266a65a0a40cc696c0880cb1f3e7d587719ca09c2642e8"} Mar 18 09:54:39 crc kubenswrapper[4778]: I0318 09:54:39.379316 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkp9p" event={"ID":"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e","Type":"ContainerStarted","Data":"92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72"} Mar 18 09:54:44 crc kubenswrapper[4778]: I0318 09:54:44.620243 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:44 crc kubenswrapper[4778]: I0318 09:54:44.620597 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:44 crc kubenswrapper[4778]: I0318 09:54:44.690430 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:44 crc kubenswrapper[4778]: I0318 09:54:44.717175 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wkp9p" podStartSLOduration=8.29297137 podStartE2EDuration="10.717152516s" podCreationTimestamp="2026-03-18 09:54:34 +0000 UTC" firstStartedPulling="2026-03-18 09:54:36.342566561 +0000 UTC m=+3142.917311411" lastFinishedPulling="2026-03-18 09:54:38.766747717 +0000 UTC m=+3145.341492557" observedRunningTime="2026-03-18 09:54:39.403921986 +0000 UTC m=+3145.978666856" watchObservedRunningTime="2026-03-18 09:54:44.717152516 +0000 UTC m=+3151.291897376" Mar 18 09:54:45 crc kubenswrapper[4778]: I0318 09:54:45.527288 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:45 crc kubenswrapper[4778]: I0318 09:54:45.580667 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-wkp9p"] Mar 18 09:54:47 crc kubenswrapper[4778]: I0318 09:54:47.187150 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:54:47 crc kubenswrapper[4778]: E0318 09:54:47.187633 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:54:47 crc kubenswrapper[4778]: I0318 09:54:47.485803 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wkp9p" podUID="21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" containerName="registry-server" containerID="cri-o://92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72" gracePeriod=2 Mar 18 09:54:47 crc kubenswrapper[4778]: I0318 09:54:47.998751 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.189507 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-catalog-content\") pod \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\" (UID: \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\") " Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.189598 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbfl6\" (UniqueName: \"kubernetes.io/projected/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-kube-api-access-bbfl6\") pod \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\" (UID: \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\") " Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.189668 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-utilities\") pod \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\" (UID: \"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e\") " Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.190999 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-utilities" (OuterVolumeSpecName: "utilities") pod "21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" (UID: "21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.202686 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-kube-api-access-bbfl6" (OuterVolumeSpecName: "kube-api-access-bbfl6") pod "21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" (UID: "21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e"). InnerVolumeSpecName "kube-api-access-bbfl6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.240443 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" (UID: "21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.293376 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.293487 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbfl6\" (UniqueName: \"kubernetes.io/projected/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-kube-api-access-bbfl6\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.293509 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.497862 4778 generic.go:334] "Generic (PLEG): container finished" podID="21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" containerID="92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72" exitCode=0 Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.497906 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkp9p" event={"ID":"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e","Type":"ContainerDied","Data":"92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72"} Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.497946 4778 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-wkp9p" event={"ID":"21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e","Type":"ContainerDied","Data":"955fb0461de04ec66a7e60e5a7e268421dad9143ca3957a537dd0edcadbc1fc7"} Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.497970 4778 scope.go:117] "RemoveContainer" containerID="92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.497990 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wkp9p" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.544003 4778 scope.go:117] "RemoveContainer" containerID="5b6779a995a8a68505266a65a0a40cc696c0880cb1f3e7d587719ca09c2642e8" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.559038 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wkp9p"] Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.578096 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wkp9p"] Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.586470 4778 scope.go:117] "RemoveContainer" containerID="18cf938f15fea54a9debe3bfea5cb4096778b596da2a5d635043b6db04c64d90" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.616760 4778 scope.go:117] "RemoveContainer" containerID="92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72" Mar 18 09:54:48 crc kubenswrapper[4778]: E0318 09:54:48.617176 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72\": container with ID starting with 92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72 not found: ID does not exist" containerID="92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 
09:54:48.617233 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72"} err="failed to get container status \"92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72\": rpc error: code = NotFound desc = could not find container \"92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72\": container with ID starting with 92e7d1ac3cdfb84731fa96b773086dae264d3cfdf3b6b8aaf5bcf886a466cd72 not found: ID does not exist" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.617259 4778 scope.go:117] "RemoveContainer" containerID="5b6779a995a8a68505266a65a0a40cc696c0880cb1f3e7d587719ca09c2642e8" Mar 18 09:54:48 crc kubenswrapper[4778]: E0318 09:54:48.617572 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b6779a995a8a68505266a65a0a40cc696c0880cb1f3e7d587719ca09c2642e8\": container with ID starting with 5b6779a995a8a68505266a65a0a40cc696c0880cb1f3e7d587719ca09c2642e8 not found: ID does not exist" containerID="5b6779a995a8a68505266a65a0a40cc696c0880cb1f3e7d587719ca09c2642e8" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.617610 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b6779a995a8a68505266a65a0a40cc696c0880cb1f3e7d587719ca09c2642e8"} err="failed to get container status \"5b6779a995a8a68505266a65a0a40cc696c0880cb1f3e7d587719ca09c2642e8\": rpc error: code = NotFound desc = could not find container \"5b6779a995a8a68505266a65a0a40cc696c0880cb1f3e7d587719ca09c2642e8\": container with ID starting with 5b6779a995a8a68505266a65a0a40cc696c0880cb1f3e7d587719ca09c2642e8 not found: ID does not exist" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.617630 4778 scope.go:117] "RemoveContainer" containerID="18cf938f15fea54a9debe3bfea5cb4096778b596da2a5d635043b6db04c64d90" Mar 18 09:54:48 crc 
kubenswrapper[4778]: E0318 09:54:48.617919 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18cf938f15fea54a9debe3bfea5cb4096778b596da2a5d635043b6db04c64d90\": container with ID starting with 18cf938f15fea54a9debe3bfea5cb4096778b596da2a5d635043b6db04c64d90 not found: ID does not exist" containerID="18cf938f15fea54a9debe3bfea5cb4096778b596da2a5d635043b6db04c64d90" Mar 18 09:54:48 crc kubenswrapper[4778]: I0318 09:54:48.617962 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18cf938f15fea54a9debe3bfea5cb4096778b596da2a5d635043b6db04c64d90"} err="failed to get container status \"18cf938f15fea54a9debe3bfea5cb4096778b596da2a5d635043b6db04c64d90\": rpc error: code = NotFound desc = could not find container \"18cf938f15fea54a9debe3bfea5cb4096778b596da2a5d635043b6db04c64d90\": container with ID starting with 18cf938f15fea54a9debe3bfea5cb4096778b596da2a5d635043b6db04c64d90 not found: ID does not exist" Mar 18 09:54:50 crc kubenswrapper[4778]: I0318 09:54:50.201485 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" path="/var/lib/kubelet/pods/21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e/volumes" Mar 18 09:54:59 crc kubenswrapper[4778]: I0318 09:54:59.187499 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:54:59 crc kubenswrapper[4778]: E0318 09:54:59.188332 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:55:11 crc 
kubenswrapper[4778]: I0318 09:55:11.187356 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:55:11 crc kubenswrapper[4778]: E0318 09:55:11.188293 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.195369 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:55:24 crc kubenswrapper[4778]: E0318 09:55:24.197060 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.341514 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest-s00-full"] Mar 18 09:55:24 crc kubenswrapper[4778]: E0318 09:55:24.342061 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" containerName="extract-content" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.342092 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" containerName="extract-content" Mar 18 09:55:24 crc kubenswrapper[4778]: E0318 09:55:24.342125 4778 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" containerName="extract-utilities" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.342137 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" containerName="extract-utilities" Mar 18 09:55:24 crc kubenswrapper[4778]: E0318 09:55:24.342189 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" containerName="registry-server" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.342224 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" containerName="registry-server" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.342549 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="21cb4860-c3a3-4ce2-aec6-fe7a7ce0442e" containerName="registry-server" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.343524 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.346899 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-htxt6" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.347544 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.352556 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.352665 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.368513 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s00-full"] Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 
09:55:24.433402 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/757e3758-d646-4267-8c4c-b5efb0dcf709-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.433478 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/757e3758-d646-4267-8c4c-b5efb0dcf709-config-data\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.433603 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c89k6\" (UniqueName: \"kubernetes.io/projected/757e3758-d646-4267-8c4c-b5efb0dcf709-kube-api-access-c89k6\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.433660 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/757e3758-d646-4267-8c4c-b5efb0dcf709-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.433727 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ssh-key\") pod \"tempest-tests-tempest-s00-full\" (UID: 
\"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.433778 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.433868 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ceph\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.433958 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-openstack-config-secret\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.434025 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/757e3758-d646-4267-8c4c-b5efb0dcf709-openstack-config\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.434076 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ca-certs\") pod 
\"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.536028 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-openstack-config-secret\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.536103 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/757e3758-d646-4267-8c4c-b5efb0dcf709-openstack-config\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.536159 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ca-certs\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.536269 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/757e3758-d646-4267-8c4c-b5efb0dcf709-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.536304 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/757e3758-d646-4267-8c4c-b5efb0dcf709-config-data\") pod 
\"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.536382 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c89k6\" (UniqueName: \"kubernetes.io/projected/757e3758-d646-4267-8c4c-b5efb0dcf709-kube-api-access-c89k6\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.536420 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/757e3758-d646-4267-8c4c-b5efb0dcf709-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.536472 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ssh-key\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.536509 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.536571 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ceph\") pod \"tempest-tests-tempest-s00-full\" (UID: 
\"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.536995 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/757e3758-d646-4267-8c4c-b5efb0dcf709-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.537269 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/757e3758-d646-4267-8c4c-b5efb0dcf709-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.537464 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/757e3758-d646-4267-8c4c-b5efb0dcf709-openstack-config\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.537737 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.537886 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/757e3758-d646-4267-8c4c-b5efb0dcf709-config-data\") pod 
\"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.542815 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ssh-key\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.548800 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ceph\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.551436 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ca-certs\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.552790 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-openstack-config-secret\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.558863 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c89k6\" (UniqueName: \"kubernetes.io/projected/757e3758-d646-4267-8c4c-b5efb0dcf709-kube-api-access-c89k6\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " 
pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.570608 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest-s00-full\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:24 crc kubenswrapper[4778]: I0318 09:55:24.684112 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s00-full" Mar 18 09:55:25 crc kubenswrapper[4778]: I0318 09:55:25.221156 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s00-full"] Mar 18 09:55:25 crc kubenswrapper[4778]: I0318 09:55:25.863575 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-full" event={"ID":"757e3758-d646-4267-8c4c-b5efb0dcf709","Type":"ContainerStarted","Data":"dcac98cd78d62b2f03dd429a022d38c29d36c13fc170b830ecbd627ba6023d27"} Mar 18 09:55:35 crc kubenswrapper[4778]: I0318 09:55:35.187907 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:55:35 crc kubenswrapper[4778]: E0318 09:55:35.188746 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:55:46 crc kubenswrapper[4778]: I0318 09:55:46.187015 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:55:46 crc kubenswrapper[4778]: E0318 09:55:46.187835 4778 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:55:52 crc kubenswrapper[4778]: E0318 09:55:52.791482 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 18 09:55:52 crc kubenswrapper[4778]: E0318 09:55:52.792302 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest
/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ceph,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c89k6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN NET_RAW],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
tempest-tests-tempest-s00-full_openstack(757e3758-d646-4267-8c4c-b5efb0dcf709): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:55:52 crc kubenswrapper[4778]: E0318 09:55:52.793514 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest-s00-full" podUID="757e3758-d646-4267-8c4c-b5efb0dcf709" Mar 18 09:55:53 crc kubenswrapper[4778]: E0318 09:55:53.134362 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest-s00-full" podUID="757e3758-d646-4267-8c4c-b5efb0dcf709" Mar 18 09:55:59 crc kubenswrapper[4778]: I0318 09:55:59.186790 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:55:59 crc kubenswrapper[4778]: E0318 09:55:59.187831 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:56:00 crc kubenswrapper[4778]: I0318 09:56:00.160460 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563796-r7bfh"] Mar 18 09:56:00 crc kubenswrapper[4778]: I0318 09:56:00.162152 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563796-r7bfh" Mar 18 09:56:00 crc kubenswrapper[4778]: I0318 09:56:00.164351 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:56:00 crc kubenswrapper[4778]: I0318 09:56:00.165359 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:56:00 crc kubenswrapper[4778]: I0318 09:56:00.166611 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:56:00 crc kubenswrapper[4778]: I0318 09:56:00.211384 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563796-r7bfh"] Mar 18 09:56:00 crc kubenswrapper[4778]: I0318 09:56:00.246768 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v2p9\" (UniqueName: \"kubernetes.io/projected/952d8866-a2f9-46d0-aa7b-5e578dd2f3c7-kube-api-access-2v2p9\") pod \"auto-csr-approver-29563796-r7bfh\" (UID: \"952d8866-a2f9-46d0-aa7b-5e578dd2f3c7\") " pod="openshift-infra/auto-csr-approver-29563796-r7bfh" Mar 18 09:56:00 crc kubenswrapper[4778]: I0318 09:56:00.351012 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v2p9\" (UniqueName: \"kubernetes.io/projected/952d8866-a2f9-46d0-aa7b-5e578dd2f3c7-kube-api-access-2v2p9\") pod \"auto-csr-approver-29563796-r7bfh\" (UID: \"952d8866-a2f9-46d0-aa7b-5e578dd2f3c7\") " pod="openshift-infra/auto-csr-approver-29563796-r7bfh" Mar 18 09:56:00 crc kubenswrapper[4778]: I0318 09:56:00.370879 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v2p9\" (UniqueName: \"kubernetes.io/projected/952d8866-a2f9-46d0-aa7b-5e578dd2f3c7-kube-api-access-2v2p9\") pod \"auto-csr-approver-29563796-r7bfh\" (UID: \"952d8866-a2f9-46d0-aa7b-5e578dd2f3c7\") " 
pod="openshift-infra/auto-csr-approver-29563796-r7bfh" Mar 18 09:56:00 crc kubenswrapper[4778]: I0318 09:56:00.511686 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563796-r7bfh" Mar 18 09:56:01 crc kubenswrapper[4778]: I0318 09:56:00.997638 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563796-r7bfh"] Mar 18 09:56:01 crc kubenswrapper[4778]: I0318 09:56:01.213326 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563796-r7bfh" event={"ID":"952d8866-a2f9-46d0-aa7b-5e578dd2f3c7","Type":"ContainerStarted","Data":"6045c1a6752ab8c67ab2035fa4d5ea43d54f98f6650cf09fa319ce731eca27f7"} Mar 18 09:56:03 crc kubenswrapper[4778]: I0318 09:56:03.243269 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563796-r7bfh" event={"ID":"952d8866-a2f9-46d0-aa7b-5e578dd2f3c7","Type":"ContainerStarted","Data":"e43c5819f1ce670a94fb282c0d10a53cccd0c70528dde0f82e60b4168a0b1dd9"} Mar 18 09:56:03 crc kubenswrapper[4778]: I0318 09:56:03.274273 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563796-r7bfh" podStartSLOduration=1.5641289760000001 podStartE2EDuration="3.274250342s" podCreationTimestamp="2026-03-18 09:56:00 +0000 UTC" firstStartedPulling="2026-03-18 09:56:00.988933458 +0000 UTC m=+3227.563678328" lastFinishedPulling="2026-03-18 09:56:02.699054854 +0000 UTC m=+3229.273799694" observedRunningTime="2026-03-18 09:56:03.265910816 +0000 UTC m=+3229.840655676" watchObservedRunningTime="2026-03-18 09:56:03.274250342 +0000 UTC m=+3229.848995192" Mar 18 09:56:04 crc kubenswrapper[4778]: I0318 09:56:04.255338 4778 generic.go:334] "Generic (PLEG): container finished" podID="952d8866-a2f9-46d0-aa7b-5e578dd2f3c7" containerID="e43c5819f1ce670a94fb282c0d10a53cccd0c70528dde0f82e60b4168a0b1dd9" exitCode=0 Mar 18 09:56:04 crc 
kubenswrapper[4778]: I0318 09:56:04.255386 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563796-r7bfh" event={"ID":"952d8866-a2f9-46d0-aa7b-5e578dd2f3c7","Type":"ContainerDied","Data":"e43c5819f1ce670a94fb282c0d10a53cccd0c70528dde0f82e60b4168a0b1dd9"} Mar 18 09:56:05 crc kubenswrapper[4778]: I0318 09:56:05.700890 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563796-r7bfh" Mar 18 09:56:05 crc kubenswrapper[4778]: I0318 09:56:05.771917 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v2p9\" (UniqueName: \"kubernetes.io/projected/952d8866-a2f9-46d0-aa7b-5e578dd2f3c7-kube-api-access-2v2p9\") pod \"952d8866-a2f9-46d0-aa7b-5e578dd2f3c7\" (UID: \"952d8866-a2f9-46d0-aa7b-5e578dd2f3c7\") " Mar 18 09:56:05 crc kubenswrapper[4778]: I0318 09:56:05.780891 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/952d8866-a2f9-46d0-aa7b-5e578dd2f3c7-kube-api-access-2v2p9" (OuterVolumeSpecName: "kube-api-access-2v2p9") pod "952d8866-a2f9-46d0-aa7b-5e578dd2f3c7" (UID: "952d8866-a2f9-46d0-aa7b-5e578dd2f3c7"). InnerVolumeSpecName "kube-api-access-2v2p9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:56:05 crc kubenswrapper[4778]: I0318 09:56:05.875091 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v2p9\" (UniqueName: \"kubernetes.io/projected/952d8866-a2f9-46d0-aa7b-5e578dd2f3c7-kube-api-access-2v2p9\") on node \"crc\" DevicePath \"\"" Mar 18 09:56:06 crc kubenswrapper[4778]: I0318 09:56:06.273388 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563796-r7bfh" event={"ID":"952d8866-a2f9-46d0-aa7b-5e578dd2f3c7","Type":"ContainerDied","Data":"6045c1a6752ab8c67ab2035fa4d5ea43d54f98f6650cf09fa319ce731eca27f7"} Mar 18 09:56:06 crc kubenswrapper[4778]: I0318 09:56:06.273427 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6045c1a6752ab8c67ab2035fa4d5ea43d54f98f6650cf09fa319ce731eca27f7" Mar 18 09:56:06 crc kubenswrapper[4778]: I0318 09:56:06.273497 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563796-r7bfh" Mar 18 09:56:06 crc kubenswrapper[4778]: I0318 09:56:06.333060 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563790-nsf4n"] Mar 18 09:56:06 crc kubenswrapper[4778]: I0318 09:56:06.341349 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563790-nsf4n"] Mar 18 09:56:08 crc kubenswrapper[4778]: I0318 09:56:08.199049 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08fbf495-18e2-4d61-ad96-1bf74db07f0e" path="/var/lib/kubelet/pods/08fbf495-18e2-4d61-ad96-1bf74db07f0e/volumes" Mar 18 09:56:08 crc kubenswrapper[4778]: I0318 09:56:08.844641 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-full" event={"ID":"757e3758-d646-4267-8c4c-b5efb0dcf709","Type":"ContainerStarted","Data":"c559ae3a1e4423e99c37d72f15f18f3cd16bc2838d62270df411dbac2afa6c1e"} Mar 18 
09:56:08 crc kubenswrapper[4778]: I0318 09:56:08.871833 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest-s00-full" podStartSLOduration=4.457845329 podStartE2EDuration="45.871815798s" podCreationTimestamp="2026-03-18 09:55:23 +0000 UTC" firstStartedPulling="2026-03-18 09:55:25.230622091 +0000 UTC m=+3191.805366931" lastFinishedPulling="2026-03-18 09:56:06.64459256 +0000 UTC m=+3233.219337400" observedRunningTime="2026-03-18 09:56:08.863320918 +0000 UTC m=+3235.438065788" watchObservedRunningTime="2026-03-18 09:56:08.871815798 +0000 UTC m=+3235.446560638" Mar 18 09:56:14 crc kubenswrapper[4778]: I0318 09:56:14.199220 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:56:14 crc kubenswrapper[4778]: E0318 09:56:14.199963 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:56:25 crc kubenswrapper[4778]: I0318 09:56:25.187764 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:56:25 crc kubenswrapper[4778]: E0318 09:56:25.188780 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:56:33 crc 
kubenswrapper[4778]: I0318 09:56:33.224682 4778 scope.go:117] "RemoveContainer" containerID="58ef47c1a33dc103d35c1381547dc4531f738d5df6648d3b82a9b2e034b9599e" Mar 18 09:56:40 crc kubenswrapper[4778]: I0318 09:56:40.187124 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:56:40 crc kubenswrapper[4778]: E0318 09:56:40.188492 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:56:53 crc kubenswrapper[4778]: I0318 09:56:53.187739 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:56:53 crc kubenswrapper[4778]: E0318 09:56:53.189694 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:57:04 crc kubenswrapper[4778]: I0318 09:57:04.196763 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:57:04 crc kubenswrapper[4778]: E0318 09:57:04.197640 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:57:19 crc kubenswrapper[4778]: I0318 09:57:19.188227 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:57:19 crc kubenswrapper[4778]: E0318 09:57:19.189549 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:57:30 crc kubenswrapper[4778]: I0318 09:57:30.187783 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:57:30 crc kubenswrapper[4778]: E0318 09:57:30.189331 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:57:44 crc kubenswrapper[4778]: I0318 09:57:44.192754 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:57:44 crc kubenswrapper[4778]: E0318 09:57:44.193539 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:57:58 crc kubenswrapper[4778]: I0318 09:57:58.187843 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 09:57:58 crc kubenswrapper[4778]: E0318 09:57:58.188804 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.178618 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563798-7x8l8"] Mar 18 09:58:00 crc kubenswrapper[4778]: E0318 09:58:00.179994 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="952d8866-a2f9-46d0-aa7b-5e578dd2f3c7" containerName="oc" Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.180010 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="952d8866-a2f9-46d0-aa7b-5e578dd2f3c7" containerName="oc" Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.180256 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="952d8866-a2f9-46d0-aa7b-5e578dd2f3c7" containerName="oc" Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.181269 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563798-7x8l8" Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.183879 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.183898 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.185473 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.205176 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563798-7x8l8"] Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.276806 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smjz9\" (UniqueName: \"kubernetes.io/projected/18a30920-760a-4dd3-ac4a-63b9add62521-kube-api-access-smjz9\") pod \"auto-csr-approver-29563798-7x8l8\" (UID: \"18a30920-760a-4dd3-ac4a-63b9add62521\") " pod="openshift-infra/auto-csr-approver-29563798-7x8l8" Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.378538 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smjz9\" (UniqueName: \"kubernetes.io/projected/18a30920-760a-4dd3-ac4a-63b9add62521-kube-api-access-smjz9\") pod \"auto-csr-approver-29563798-7x8l8\" (UID: \"18a30920-760a-4dd3-ac4a-63b9add62521\") " pod="openshift-infra/auto-csr-approver-29563798-7x8l8" Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.397405 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smjz9\" (UniqueName: \"kubernetes.io/projected/18a30920-760a-4dd3-ac4a-63b9add62521-kube-api-access-smjz9\") pod \"auto-csr-approver-29563798-7x8l8\" (UID: \"18a30920-760a-4dd3-ac4a-63b9add62521\") " 
pod="openshift-infra/auto-csr-approver-29563798-7x8l8" Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.501305 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563798-7x8l8" Mar 18 09:58:00 crc kubenswrapper[4778]: I0318 09:58:00.959868 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563798-7x8l8"] Mar 18 09:58:01 crc kubenswrapper[4778]: I0318 09:58:01.507421 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563798-7x8l8" event={"ID":"18a30920-760a-4dd3-ac4a-63b9add62521","Type":"ContainerStarted","Data":"472e677611501d03d17155c0638f3802c0bde32da5313b520bbd9299971c8985"} Mar 18 09:58:02 crc kubenswrapper[4778]: I0318 09:58:02.517794 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563798-7x8l8" event={"ID":"18a30920-760a-4dd3-ac4a-63b9add62521","Type":"ContainerStarted","Data":"d0bc455b5828f2dc8d018076f6b57f0e3f54f0daa7e1a53584affe8f4dab5285"} Mar 18 09:58:02 crc kubenswrapper[4778]: I0318 09:58:02.542908 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563798-7x8l8" podStartSLOduration=1.582905706 podStartE2EDuration="2.542888639s" podCreationTimestamp="2026-03-18 09:58:00 +0000 UTC" firstStartedPulling="2026-03-18 09:58:00.963327344 +0000 UTC m=+3347.538072184" lastFinishedPulling="2026-03-18 09:58:01.923310277 +0000 UTC m=+3348.498055117" observedRunningTime="2026-03-18 09:58:02.529352412 +0000 UTC m=+3349.104097272" watchObservedRunningTime="2026-03-18 09:58:02.542888639 +0000 UTC m=+3349.117633479" Mar 18 09:58:03 crc kubenswrapper[4778]: I0318 09:58:03.528692 4778 generic.go:334] "Generic (PLEG): container finished" podID="18a30920-760a-4dd3-ac4a-63b9add62521" containerID="d0bc455b5828f2dc8d018076f6b57f0e3f54f0daa7e1a53584affe8f4dab5285" exitCode=0 Mar 18 09:58:03 crc 
kubenswrapper[4778]: I0318 09:58:03.528722 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563798-7x8l8" event={"ID":"18a30920-760a-4dd3-ac4a-63b9add62521","Type":"ContainerDied","Data":"d0bc455b5828f2dc8d018076f6b57f0e3f54f0daa7e1a53584affe8f4dab5285"}
Mar 18 09:58:04 crc kubenswrapper[4778]: I0318 09:58:04.917962 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563798-7x8l8"
Mar 18 09:58:05 crc kubenswrapper[4778]: I0318 09:58:05.081348 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smjz9\" (UniqueName: \"kubernetes.io/projected/18a30920-760a-4dd3-ac4a-63b9add62521-kube-api-access-smjz9\") pod \"18a30920-760a-4dd3-ac4a-63b9add62521\" (UID: \"18a30920-760a-4dd3-ac4a-63b9add62521\") "
Mar 18 09:58:05 crc kubenswrapper[4778]: I0318 09:58:05.086882 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18a30920-760a-4dd3-ac4a-63b9add62521-kube-api-access-smjz9" (OuterVolumeSpecName: "kube-api-access-smjz9") pod "18a30920-760a-4dd3-ac4a-63b9add62521" (UID: "18a30920-760a-4dd3-ac4a-63b9add62521"). InnerVolumeSpecName "kube-api-access-smjz9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:58:05 crc kubenswrapper[4778]: I0318 09:58:05.184077 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smjz9\" (UniqueName: \"kubernetes.io/projected/18a30920-760a-4dd3-ac4a-63b9add62521-kube-api-access-smjz9\") on node \"crc\" DevicePath \"\""
Mar 18 09:58:05 crc kubenswrapper[4778]: I0318 09:58:05.548218 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563798-7x8l8" event={"ID":"18a30920-760a-4dd3-ac4a-63b9add62521","Type":"ContainerDied","Data":"472e677611501d03d17155c0638f3802c0bde32da5313b520bbd9299971c8985"}
Mar 18 09:58:05 crc kubenswrapper[4778]: I0318 09:58:05.548746 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="472e677611501d03d17155c0638f3802c0bde32da5313b520bbd9299971c8985"
Mar 18 09:58:05 crc kubenswrapper[4778]: I0318 09:58:05.548289 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563798-7x8l8"
Mar 18 09:58:05 crc kubenswrapper[4778]: I0318 09:58:05.626306 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563792-4g4zq"]
Mar 18 09:58:05 crc kubenswrapper[4778]: I0318 09:58:05.636593 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563792-4g4zq"]
Mar 18 09:58:06 crc kubenswrapper[4778]: I0318 09:58:06.199923 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7095f92-8336-4c69-9c71-c3b9aa45bb82" path="/var/lib/kubelet/pods/a7095f92-8336-4c69-9c71-c3b9aa45bb82/volumes"
Mar 18 09:58:11 crc kubenswrapper[4778]: I0318 09:58:11.187483 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3"
Mar 18 09:58:11 crc kubenswrapper[4778]: I0318 09:58:11.755327 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"4a5832ad33a24fa897a57dad207c0db67df6b8bbaaa82af43426329f350e2e07"}
Mar 18 09:58:33 crc kubenswrapper[4778]: I0318 09:58:33.369741 4778 scope.go:117] "RemoveContainer" containerID="129d18099eafc9ec58cca914d6f8f45f3f345a43be2618db7b6619ab09177632"
Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.582300 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h2w8w"]
Mar 18 09:59:08 crc kubenswrapper[4778]: E0318 09:59:08.583704 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18a30920-760a-4dd3-ac4a-63b9add62521" containerName="oc"
Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.583730 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="18a30920-760a-4dd3-ac4a-63b9add62521" containerName="oc"
Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.584122 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="18a30920-760a-4dd3-ac4a-63b9add62521" containerName="oc"
Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.586859 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h2w8w"
Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.704378 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cf93b82-72dd-4fae-976b-bec6edb2e920-catalog-content\") pod \"community-operators-h2w8w\" (UID: \"0cf93b82-72dd-4fae-976b-bec6edb2e920\") " pod="openshift-marketplace/community-operators-h2w8w"
Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.704479 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mp47\" (UniqueName: \"kubernetes.io/projected/0cf93b82-72dd-4fae-976b-bec6edb2e920-kube-api-access-5mp47\") pod \"community-operators-h2w8w\" (UID: \"0cf93b82-72dd-4fae-976b-bec6edb2e920\") " pod="openshift-marketplace/community-operators-h2w8w"
Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.704558 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cf93b82-72dd-4fae-976b-bec6edb2e920-utilities\") pod \"community-operators-h2w8w\" (UID: \"0cf93b82-72dd-4fae-976b-bec6edb2e920\") " pod="openshift-marketplace/community-operators-h2w8w"
Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.806374 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cf93b82-72dd-4fae-976b-bec6edb2e920-catalog-content\") pod \"community-operators-h2w8w\" (UID: \"0cf93b82-72dd-4fae-976b-bec6edb2e920\") " pod="openshift-marketplace/community-operators-h2w8w"
Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.806434 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mp47\" (UniqueName: \"kubernetes.io/projected/0cf93b82-72dd-4fae-976b-bec6edb2e920-kube-api-access-5mp47\") pod \"community-operators-h2w8w\" (UID: \"0cf93b82-72dd-4fae-976b-bec6edb2e920\") " pod="openshift-marketplace/community-operators-h2w8w"
Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.806464 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cf93b82-72dd-4fae-976b-bec6edb2e920-utilities\") pod \"community-operators-h2w8w\" (UID: \"0cf93b82-72dd-4fae-976b-bec6edb2e920\") " pod="openshift-marketplace/community-operators-h2w8w"
Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.807174 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cf93b82-72dd-4fae-976b-bec6edb2e920-utilities\") pod \"community-operators-h2w8w\" (UID: \"0cf93b82-72dd-4fae-976b-bec6edb2e920\") " pod="openshift-marketplace/community-operators-h2w8w"
Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.807321 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cf93b82-72dd-4fae-976b-bec6edb2e920-catalog-content\") pod \"community-operators-h2w8w\" (UID: \"0cf93b82-72dd-4fae-976b-bec6edb2e920\") " pod="openshift-marketplace/community-operators-h2w8w"
Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.827354 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mp47\" (UniqueName: \"kubernetes.io/projected/0cf93b82-72dd-4fae-976b-bec6edb2e920-kube-api-access-5mp47\") pod \"community-operators-h2w8w\" (UID: \"0cf93b82-72dd-4fae-976b-bec6edb2e920\") " pod="openshift-marketplace/community-operators-h2w8w"
Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.913509 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h2w8w"]
Mar 18 09:59:08 crc kubenswrapper[4778]: I0318 09:59:08.927705 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h2w8w"
Mar 18 09:59:09 crc kubenswrapper[4778]: I0318 09:59:09.511865 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h2w8w"]
Mar 18 09:59:10 crc kubenswrapper[4778]: I0318 09:59:10.334315 4778 generic.go:334] "Generic (PLEG): container finished" podID="0cf93b82-72dd-4fae-976b-bec6edb2e920" containerID="1526a046accdfa540da1fc67f2251d5eaafa8dba8b8499a3a87d92eac748b839" exitCode=0
Mar 18 09:59:10 crc kubenswrapper[4778]: I0318 09:59:10.334402 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2w8w" event={"ID":"0cf93b82-72dd-4fae-976b-bec6edb2e920","Type":"ContainerDied","Data":"1526a046accdfa540da1fc67f2251d5eaafa8dba8b8499a3a87d92eac748b839"}
Mar 18 09:59:10 crc kubenswrapper[4778]: I0318 09:59:10.334608 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2w8w" event={"ID":"0cf93b82-72dd-4fae-976b-bec6edb2e920","Type":"ContainerStarted","Data":"77604ef018c62b9319353e20adb2f928cd62a0a17ae4608bb5ff128a91eb2c37"}
Mar 18 09:59:10 crc kubenswrapper[4778]: I0318 09:59:10.338442 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 09:59:11 crc kubenswrapper[4778]: I0318 09:59:11.344210 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2w8w" event={"ID":"0cf93b82-72dd-4fae-976b-bec6edb2e920","Type":"ContainerStarted","Data":"b80e9a111a59b403b49cb5d33424113c3e31207583defe68f62b4acd09db83fe"}
Mar 18 09:59:13 crc kubenswrapper[4778]: I0318 09:59:13.361648 4778 generic.go:334] "Generic (PLEG): container finished" podID="0cf93b82-72dd-4fae-976b-bec6edb2e920" containerID="b80e9a111a59b403b49cb5d33424113c3e31207583defe68f62b4acd09db83fe" exitCode=0
Mar 18 09:59:13 crc kubenswrapper[4778]: I0318 09:59:13.361952 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2w8w" event={"ID":"0cf93b82-72dd-4fae-976b-bec6edb2e920","Type":"ContainerDied","Data":"b80e9a111a59b403b49cb5d33424113c3e31207583defe68f62b4acd09db83fe"}
Mar 18 09:59:14 crc kubenswrapper[4778]: I0318 09:59:14.372041 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2w8w" event={"ID":"0cf93b82-72dd-4fae-976b-bec6edb2e920","Type":"ContainerStarted","Data":"26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363"}
Mar 18 09:59:14 crc kubenswrapper[4778]: I0318 09:59:14.400479 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h2w8w" podStartSLOduration=2.7813531190000003 podStartE2EDuration="6.400459343s" podCreationTimestamp="2026-03-18 09:59:08 +0000 UTC" firstStartedPulling="2026-03-18 09:59:10.338179281 +0000 UTC m=+3416.912924111" lastFinishedPulling="2026-03-18 09:59:13.957285495 +0000 UTC m=+3420.532030335" observedRunningTime="2026-03-18 09:59:14.391371207 +0000 UTC m=+3420.966116047" watchObservedRunningTime="2026-03-18 09:59:14.400459343 +0000 UTC m=+3420.975204193"
Mar 18 09:59:18 crc kubenswrapper[4778]: I0318 09:59:18.929127 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h2w8w"
Mar 18 09:59:18 crc kubenswrapper[4778]: I0318 09:59:18.929808 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h2w8w"
Mar 18 09:59:18 crc kubenswrapper[4778]: I0318 09:59:18.988797 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h2w8w"
Mar 18 09:59:19 crc kubenswrapper[4778]: I0318 09:59:19.458817 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h2w8w"
Mar 18 09:59:21 crc kubenswrapper[4778]: I0318 09:59:21.316874 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h2w8w"]
Mar 18 09:59:21 crc kubenswrapper[4778]: I0318 09:59:21.433158 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h2w8w" podUID="0cf93b82-72dd-4fae-976b-bec6edb2e920" containerName="registry-server" containerID="cri-o://26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363" gracePeriod=2
Mar 18 09:59:21 crc kubenswrapper[4778]: I0318 09:59:21.944595 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h2w8w"
Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.059494 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cf93b82-72dd-4fae-976b-bec6edb2e920-utilities\") pod \"0cf93b82-72dd-4fae-976b-bec6edb2e920\" (UID: \"0cf93b82-72dd-4fae-976b-bec6edb2e920\") "
Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.059601 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mp47\" (UniqueName: \"kubernetes.io/projected/0cf93b82-72dd-4fae-976b-bec6edb2e920-kube-api-access-5mp47\") pod \"0cf93b82-72dd-4fae-976b-bec6edb2e920\" (UID: \"0cf93b82-72dd-4fae-976b-bec6edb2e920\") "
Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.059646 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cf93b82-72dd-4fae-976b-bec6edb2e920-catalog-content\") pod \"0cf93b82-72dd-4fae-976b-bec6edb2e920\" (UID: \"0cf93b82-72dd-4fae-976b-bec6edb2e920\") "
Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.060555 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cf93b82-72dd-4fae-976b-bec6edb2e920-utilities" (OuterVolumeSpecName: "utilities") pod "0cf93b82-72dd-4fae-976b-bec6edb2e920" (UID: "0cf93b82-72dd-4fae-976b-bec6edb2e920"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.066184 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cf93b82-72dd-4fae-976b-bec6edb2e920-kube-api-access-5mp47" (OuterVolumeSpecName: "kube-api-access-5mp47") pod "0cf93b82-72dd-4fae-976b-bec6edb2e920" (UID: "0cf93b82-72dd-4fae-976b-bec6edb2e920"). InnerVolumeSpecName "kube-api-access-5mp47". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.124835 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cf93b82-72dd-4fae-976b-bec6edb2e920-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0cf93b82-72dd-4fae-976b-bec6edb2e920" (UID: "0cf93b82-72dd-4fae-976b-bec6edb2e920"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.161682 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cf93b82-72dd-4fae-976b-bec6edb2e920-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.161711 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mp47\" (UniqueName: \"kubernetes.io/projected/0cf93b82-72dd-4fae-976b-bec6edb2e920-kube-api-access-5mp47\") on node \"crc\" DevicePath \"\""
Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.161723 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cf93b82-72dd-4fae-976b-bec6edb2e920-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.447069 4778 generic.go:334] "Generic (PLEG): container finished" podID="0cf93b82-72dd-4fae-976b-bec6edb2e920" containerID="26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363" exitCode=0
Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.447123 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2w8w" event={"ID":"0cf93b82-72dd-4fae-976b-bec6edb2e920","Type":"ContainerDied","Data":"26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363"}
Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.447163 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h2w8w"
Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.447217 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2w8w" event={"ID":"0cf93b82-72dd-4fae-976b-bec6edb2e920","Type":"ContainerDied","Data":"77604ef018c62b9319353e20adb2f928cd62a0a17ae4608bb5ff128a91eb2c37"}
Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.447249 4778 scope.go:117] "RemoveContainer" containerID="26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363"
Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.485715 4778 scope.go:117] "RemoveContainer" containerID="b80e9a111a59b403b49cb5d33424113c3e31207583defe68f62b4acd09db83fe"
Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.500910 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h2w8w"]
Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.510580 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h2w8w"]
Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.512928 4778 scope.go:117] "RemoveContainer" containerID="1526a046accdfa540da1fc67f2251d5eaafa8dba8b8499a3a87d92eac748b839"
Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.574177 4778 scope.go:117] "RemoveContainer" containerID="26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363"
Mar 18 09:59:22 crc kubenswrapper[4778]: E0318 09:59:22.574761 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363\": container with ID starting with 26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363 not found: ID does not exist" containerID="26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363"
Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.574823 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363"} err="failed to get container status \"26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363\": rpc error: code = NotFound desc = could not find container \"26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363\": container with ID starting with 26697002c77bac2e10e5502853fe957ecaf9ba961b13b3205faf286fff390363 not found: ID does not exist"
Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.574857 4778 scope.go:117] "RemoveContainer" containerID="b80e9a111a59b403b49cb5d33424113c3e31207583defe68f62b4acd09db83fe"
Mar 18 09:59:22 crc kubenswrapper[4778]: E0318 09:59:22.575402 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b80e9a111a59b403b49cb5d33424113c3e31207583defe68f62b4acd09db83fe\": container with ID starting with b80e9a111a59b403b49cb5d33424113c3e31207583defe68f62b4acd09db83fe not found: ID does not exist" containerID="b80e9a111a59b403b49cb5d33424113c3e31207583defe68f62b4acd09db83fe"
Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.575444 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b80e9a111a59b403b49cb5d33424113c3e31207583defe68f62b4acd09db83fe"} err="failed to get container status \"b80e9a111a59b403b49cb5d33424113c3e31207583defe68f62b4acd09db83fe\": rpc error: code = NotFound desc = could not find container \"b80e9a111a59b403b49cb5d33424113c3e31207583defe68f62b4acd09db83fe\": container with ID starting with b80e9a111a59b403b49cb5d33424113c3e31207583defe68f62b4acd09db83fe not found: ID does not exist"
Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.575474 4778 scope.go:117] "RemoveContainer" containerID="1526a046accdfa540da1fc67f2251d5eaafa8dba8b8499a3a87d92eac748b839"
Mar 18 09:59:22 crc kubenswrapper[4778]: E0318 09:59:22.575776 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1526a046accdfa540da1fc67f2251d5eaafa8dba8b8499a3a87d92eac748b839\": container with ID starting with 1526a046accdfa540da1fc67f2251d5eaafa8dba8b8499a3a87d92eac748b839 not found: ID does not exist" containerID="1526a046accdfa540da1fc67f2251d5eaafa8dba8b8499a3a87d92eac748b839"
Mar 18 09:59:22 crc kubenswrapper[4778]: I0318 09:59:22.575805 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1526a046accdfa540da1fc67f2251d5eaafa8dba8b8499a3a87d92eac748b839"} err="failed to get container status \"1526a046accdfa540da1fc67f2251d5eaafa8dba8b8499a3a87d92eac748b839\": rpc error: code = NotFound desc = could not find container \"1526a046accdfa540da1fc67f2251d5eaafa8dba8b8499a3a87d92eac748b839\": container with ID starting with 1526a046accdfa540da1fc67f2251d5eaafa8dba8b8499a3a87d92eac748b839 not found: ID does not exist"
Mar 18 09:59:24 crc kubenswrapper[4778]: I0318 09:59:24.201938 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cf93b82-72dd-4fae-976b-bec6edb2e920" path="/var/lib/kubelet/pods/0cf93b82-72dd-4fae-976b-bec6edb2e920/volumes"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.149279 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563800-8grkw"]
Mar 18 10:00:00 crc kubenswrapper[4778]: E0318 10:00:00.150426 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cf93b82-72dd-4fae-976b-bec6edb2e920" containerName="registry-server"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.150446 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cf93b82-72dd-4fae-976b-bec6edb2e920" containerName="registry-server"
Mar 18 10:00:00 crc kubenswrapper[4778]: E0318 10:00:00.150464 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cf93b82-72dd-4fae-976b-bec6edb2e920" containerName="extract-content"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.150471 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cf93b82-72dd-4fae-976b-bec6edb2e920" containerName="extract-content"
Mar 18 10:00:00 crc kubenswrapper[4778]: E0318 10:00:00.150495 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cf93b82-72dd-4fae-976b-bec6edb2e920" containerName="extract-utilities"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.150503 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cf93b82-72dd-4fae-976b-bec6edb2e920" containerName="extract-utilities"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.150744 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cf93b82-72dd-4fae-976b-bec6edb2e920" containerName="registry-server"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.151585 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563800-8grkw"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.155456 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.155719 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.156733 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.159507 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl"]
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.161156 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.163146 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.165910 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.182248 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563800-8grkw"]
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.247401 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl"]
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.254154 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca9f1133-0fec-4eeb-8b9b-39148a035a92-config-volume\") pod \"collect-profiles-29563800-2v8nl\" (UID: \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.254356 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4hks\" (UniqueName: \"kubernetes.io/projected/ca9f1133-0fec-4eeb-8b9b-39148a035a92-kube-api-access-m4hks\") pod \"collect-profiles-29563800-2v8nl\" (UID: \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.254431 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbp4c\" (UniqueName: \"kubernetes.io/projected/b7196caa-da0c-4933-b2d0-81c472bed9a9-kube-api-access-hbp4c\") pod \"auto-csr-approver-29563800-8grkw\" (UID: \"b7196caa-da0c-4933-b2d0-81c472bed9a9\") " pod="openshift-infra/auto-csr-approver-29563800-8grkw"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.254472 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ca9f1133-0fec-4eeb-8b9b-39148a035a92-secret-volume\") pod \"collect-profiles-29563800-2v8nl\" (UID: \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.357075 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca9f1133-0fec-4eeb-8b9b-39148a035a92-config-volume\") pod \"collect-profiles-29563800-2v8nl\" (UID: \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.357301 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4hks\" (UniqueName: \"kubernetes.io/projected/ca9f1133-0fec-4eeb-8b9b-39148a035a92-kube-api-access-m4hks\") pod \"collect-profiles-29563800-2v8nl\" (UID: \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.358064 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca9f1133-0fec-4eeb-8b9b-39148a035a92-config-volume\") pod \"collect-profiles-29563800-2v8nl\" (UID: \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.358252 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbp4c\" (UniqueName: \"kubernetes.io/projected/b7196caa-da0c-4933-b2d0-81c472bed9a9-kube-api-access-hbp4c\") pod \"auto-csr-approver-29563800-8grkw\" (UID: \"b7196caa-da0c-4933-b2d0-81c472bed9a9\") " pod="openshift-infra/auto-csr-approver-29563800-8grkw"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.358336 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ca9f1133-0fec-4eeb-8b9b-39148a035a92-secret-volume\") pod \"collect-profiles-29563800-2v8nl\" (UID: \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.372935 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ca9f1133-0fec-4eeb-8b9b-39148a035a92-secret-volume\") pod \"collect-profiles-29563800-2v8nl\" (UID: \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.378967 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbp4c\" (UniqueName: \"kubernetes.io/projected/b7196caa-da0c-4933-b2d0-81c472bed9a9-kube-api-access-hbp4c\") pod \"auto-csr-approver-29563800-8grkw\" (UID: \"b7196caa-da0c-4933-b2d0-81c472bed9a9\") " pod="openshift-infra/auto-csr-approver-29563800-8grkw"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.382683 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4hks\" (UniqueName: \"kubernetes.io/projected/ca9f1133-0fec-4eeb-8b9b-39148a035a92-kube-api-access-m4hks\") pod \"collect-profiles-29563800-2v8nl\" (UID: \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.498603 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl"
Mar 18 10:00:00 crc kubenswrapper[4778]: I0318 10:00:00.503884 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563800-8grkw"
Mar 18 10:00:01 crc kubenswrapper[4778]: I0318 10:00:01.205993 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl"]
Mar 18 10:00:01 crc kubenswrapper[4778]: I0318 10:00:01.216471 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563800-8grkw"]
Mar 18 10:00:01 crc kubenswrapper[4778]: I0318 10:00:01.875877 4778 generic.go:334] "Generic (PLEG): container finished" podID="ca9f1133-0fec-4eeb-8b9b-39148a035a92" containerID="20eba30be4d8526eb64b11cc9e3c58803630e3554035c19c9650d8cecb2ebf82" exitCode=0
Mar 18 10:00:01 crc kubenswrapper[4778]: I0318 10:00:01.876033 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl" event={"ID":"ca9f1133-0fec-4eeb-8b9b-39148a035a92","Type":"ContainerDied","Data":"20eba30be4d8526eb64b11cc9e3c58803630e3554035c19c9650d8cecb2ebf82"}
Mar 18 10:00:01 crc kubenswrapper[4778]: I0318 10:00:01.876179 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl" event={"ID":"ca9f1133-0fec-4eeb-8b9b-39148a035a92","Type":"ContainerStarted","Data":"14a0de78c81ed2b08ab92fa03668d411cbfd11c81944dba852d6d77cb068f7d4"}
Mar 18 10:00:01 crc kubenswrapper[4778]: I0318 10:00:01.877887 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563800-8grkw" event={"ID":"b7196caa-da0c-4933-b2d0-81c472bed9a9","Type":"ContainerStarted","Data":"3294134c32825ebc6339fed6c44cda6f1e69b2d7521a0e5af1bdc7fae18305a0"}
Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.266306 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl"
Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.433399 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4hks\" (UniqueName: \"kubernetes.io/projected/ca9f1133-0fec-4eeb-8b9b-39148a035a92-kube-api-access-m4hks\") pod \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\" (UID: \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\") "
Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.433618 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ca9f1133-0fec-4eeb-8b9b-39148a035a92-secret-volume\") pod \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\" (UID: \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\") "
Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.433760 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca9f1133-0fec-4eeb-8b9b-39148a035a92-config-volume\") pod \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\" (UID: \"ca9f1133-0fec-4eeb-8b9b-39148a035a92\") "
Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.434753 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca9f1133-0fec-4eeb-8b9b-39148a035a92-config-volume" (OuterVolumeSpecName: "config-volume") pod "ca9f1133-0fec-4eeb-8b9b-39148a035a92" (UID: "ca9f1133-0fec-4eeb-8b9b-39148a035a92"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.439630 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca9f1133-0fec-4eeb-8b9b-39148a035a92-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ca9f1133-0fec-4eeb-8b9b-39148a035a92" (UID: "ca9f1133-0fec-4eeb-8b9b-39148a035a92"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.441011 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca9f1133-0fec-4eeb-8b9b-39148a035a92-kube-api-access-m4hks" (OuterVolumeSpecName: "kube-api-access-m4hks") pod "ca9f1133-0fec-4eeb-8b9b-39148a035a92" (UID: "ca9f1133-0fec-4eeb-8b9b-39148a035a92"). InnerVolumeSpecName "kube-api-access-m4hks". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.536576 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ca9f1133-0fec-4eeb-8b9b-39148a035a92-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.536624 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca9f1133-0fec-4eeb-8b9b-39148a035a92-config-volume\") on node \"crc\" DevicePath \"\""
Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.536634 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4hks\" (UniqueName: \"kubernetes.io/projected/ca9f1133-0fec-4eeb-8b9b-39148a035a92-kube-api-access-m4hks\") on node \"crc\" DevicePath \"\""
Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.893321 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl" event={"ID":"ca9f1133-0fec-4eeb-8b9b-39148a035a92","Type":"ContainerDied","Data":"14a0de78c81ed2b08ab92fa03668d411cbfd11c81944dba852d6d77cb068f7d4"}
Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.893353 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14a0de78c81ed2b08ab92fa03668d411cbfd11c81944dba852d6d77cb068f7d4"
Mar 18 10:00:03 crc kubenswrapper[4778]: I0318 10:00:03.893372 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl"
Mar 18 10:00:04 crc kubenswrapper[4778]: I0318 10:00:04.357576 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6"]
Mar 18 10:00:04 crc kubenswrapper[4778]: I0318 10:00:04.373100 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563755-zdrp6"]
Mar 18 10:00:06 crc kubenswrapper[4778]: I0318 10:00:06.201379 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bea72845-4b27-4381-b08b-e0570c67bddb" path="/var/lib/kubelet/pods/bea72845-4b27-4381-b08b-e0570c67bddb/volumes"
Mar 18 10:00:21 crc kubenswrapper[4778]: I0318 10:00:21.045110 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563800-8grkw" event={"ID":"b7196caa-da0c-4933-b2d0-81c472bed9a9","Type":"ContainerStarted","Data":"52b2bf061001e2a8dfe4b355dbc94585c1d208b337020ae42eb7ee2f487a7b0c"}
Mar 18 10:00:21 crc kubenswrapper[4778]: I0318 10:00:21.070532 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563800-8grkw" podStartSLOduration=1.716889178 podStartE2EDuration="21.070508635s" podCreationTimestamp="2026-03-18 10:00:00 +0000 UTC" firstStartedPulling="2026-03-18 10:00:01.211945514 +0000 UTC m=+3467.786690354" lastFinishedPulling="2026-03-18 10:00:20.565564971 +0000
UTC m=+3487.140309811" observedRunningTime="2026-03-18 10:00:21.058443137 +0000 UTC m=+3487.633188007" watchObservedRunningTime="2026-03-18 10:00:21.070508635 +0000 UTC m=+3487.645253475" Mar 18 10:00:22 crc kubenswrapper[4778]: I0318 10:00:22.055725 4778 generic.go:334] "Generic (PLEG): container finished" podID="b7196caa-da0c-4933-b2d0-81c472bed9a9" containerID="52b2bf061001e2a8dfe4b355dbc94585c1d208b337020ae42eb7ee2f487a7b0c" exitCode=0 Mar 18 10:00:22 crc kubenswrapper[4778]: I0318 10:00:22.056217 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563800-8grkw" event={"ID":"b7196caa-da0c-4933-b2d0-81c472bed9a9","Type":"ContainerDied","Data":"52b2bf061001e2a8dfe4b355dbc94585c1d208b337020ae42eb7ee2f487a7b0c"} Mar 18 10:00:23 crc kubenswrapper[4778]: I0318 10:00:23.471145 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563800-8grkw" Mar 18 10:00:23 crc kubenswrapper[4778]: I0318 10:00:23.534056 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbp4c\" (UniqueName: \"kubernetes.io/projected/b7196caa-da0c-4933-b2d0-81c472bed9a9-kube-api-access-hbp4c\") pod \"b7196caa-da0c-4933-b2d0-81c472bed9a9\" (UID: \"b7196caa-da0c-4933-b2d0-81c472bed9a9\") " Mar 18 10:00:23 crc kubenswrapper[4778]: I0318 10:00:23.541618 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7196caa-da0c-4933-b2d0-81c472bed9a9-kube-api-access-hbp4c" (OuterVolumeSpecName: "kube-api-access-hbp4c") pod "b7196caa-da0c-4933-b2d0-81c472bed9a9" (UID: "b7196caa-da0c-4933-b2d0-81c472bed9a9"). InnerVolumeSpecName "kube-api-access-hbp4c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:00:23 crc kubenswrapper[4778]: I0318 10:00:23.635774 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbp4c\" (UniqueName: \"kubernetes.io/projected/b7196caa-da0c-4933-b2d0-81c472bed9a9-kube-api-access-hbp4c\") on node \"crc\" DevicePath \"\"" Mar 18 10:00:24 crc kubenswrapper[4778]: I0318 10:00:24.085394 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563800-8grkw" event={"ID":"b7196caa-da0c-4933-b2d0-81c472bed9a9","Type":"ContainerDied","Data":"3294134c32825ebc6339fed6c44cda6f1e69b2d7521a0e5af1bdc7fae18305a0"} Mar 18 10:00:24 crc kubenswrapper[4778]: I0318 10:00:24.085773 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3294134c32825ebc6339fed6c44cda6f1e69b2d7521a0e5af1bdc7fae18305a0" Mar 18 10:00:24 crc kubenswrapper[4778]: I0318 10:00:24.085481 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563800-8grkw" Mar 18 10:00:24 crc kubenswrapper[4778]: I0318 10:00:24.166321 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563794-586xm"] Mar 18 10:00:24 crc kubenswrapper[4778]: I0318 10:00:24.173999 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563794-586xm"] Mar 18 10:00:24 crc kubenswrapper[4778]: I0318 10:00:24.199799 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c681ab9e-5bfe-4e10-9154-41ff0c5d76a3" path="/var/lib/kubelet/pods/c681ab9e-5bfe-4e10-9154-41ff0c5d76a3/volumes" Mar 18 10:00:30 crc kubenswrapper[4778]: I0318 10:00:30.147764 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 18 10:00:30 crc kubenswrapper[4778]: I0318 10:00:30.148347 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:00:33 crc kubenswrapper[4778]: I0318 10:00:33.498236 4778 scope.go:117] "RemoveContainer" containerID="1887b38177a6dc3b69f09e9dc6a6dd26a61cf63c5f532cbb5b0e04e1fb5a3d8b" Mar 18 10:00:33 crc kubenswrapper[4778]: I0318 10:00:33.562890 4778 scope.go:117] "RemoveContainer" containerID="eaf004108fc735124a6750b445bf1e3f7676efb1a3da3a71036d9a0909c64710" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.147539 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.148080 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.148096 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29563801-nctwn"] Mar 18 10:01:00 crc kubenswrapper[4778]: E0318 10:01:00.150577 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca9f1133-0fec-4eeb-8b9b-39148a035a92" containerName="collect-profiles" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.150607 4778 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="ca9f1133-0fec-4eeb-8b9b-39148a035a92" containerName="collect-profiles" Mar 18 10:01:00 crc kubenswrapper[4778]: E0318 10:01:00.150639 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7196caa-da0c-4933-b2d0-81c472bed9a9" containerName="oc" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.150649 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7196caa-da0c-4933-b2d0-81c472bed9a9" containerName="oc" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.150956 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7196caa-da0c-4933-b2d0-81c472bed9a9" containerName="oc" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.150983 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca9f1133-0fec-4eeb-8b9b-39148a035a92" containerName="collect-profiles" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.151802 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.181788 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29563801-nctwn"] Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.226794 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-fernet-keys\") pod \"keystone-cron-29563801-nctwn\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.227167 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-combined-ca-bundle\") pod \"keystone-cron-29563801-nctwn\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " 
pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.227349 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsszw\" (UniqueName: \"kubernetes.io/projected/8ace9f11-f4d8-4801-afa2-5b723d52d41e-kube-api-access-bsszw\") pod \"keystone-cron-29563801-nctwn\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.227594 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-config-data\") pod \"keystone-cron-29563801-nctwn\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.330101 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-combined-ca-bundle\") pod \"keystone-cron-29563801-nctwn\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.330222 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsszw\" (UniqueName: \"kubernetes.io/projected/8ace9f11-f4d8-4801-afa2-5b723d52d41e-kube-api-access-bsszw\") pod \"keystone-cron-29563801-nctwn\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.330319 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-config-data\") pod \"keystone-cron-29563801-nctwn\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " 
pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.330408 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-fernet-keys\") pod \"keystone-cron-29563801-nctwn\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.336543 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-combined-ca-bundle\") pod \"keystone-cron-29563801-nctwn\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.336733 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-fernet-keys\") pod \"keystone-cron-29563801-nctwn\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.339055 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-config-data\") pod \"keystone-cron-29563801-nctwn\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.347866 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsszw\" (UniqueName: \"kubernetes.io/projected/8ace9f11-f4d8-4801-afa2-5b723d52d41e-kube-api-access-bsszw\") pod \"keystone-cron-29563801-nctwn\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 
10:01:00.475236 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:00 crc kubenswrapper[4778]: I0318 10:01:00.930256 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29563801-nctwn"] Mar 18 10:01:01 crc kubenswrapper[4778]: I0318 10:01:01.414894 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563801-nctwn" event={"ID":"8ace9f11-f4d8-4801-afa2-5b723d52d41e","Type":"ContainerStarted","Data":"28d152bac9f17e0efab4925b14ece7afdb8366b297f19851fea386af5ff7041d"} Mar 18 10:01:01 crc kubenswrapper[4778]: I0318 10:01:01.415277 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563801-nctwn" event={"ID":"8ace9f11-f4d8-4801-afa2-5b723d52d41e","Type":"ContainerStarted","Data":"6861f7ab8ee5902e26632016dd16afe79b26f8e149799abe01220689f2fa3927"} Mar 18 10:01:01 crc kubenswrapper[4778]: I0318 10:01:01.433732 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29563801-nctwn" podStartSLOduration=1.433713488 podStartE2EDuration="1.433713488s" podCreationTimestamp="2026-03-18 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:01:01.429804193 +0000 UTC m=+3528.004549053" watchObservedRunningTime="2026-03-18 10:01:01.433713488 +0000 UTC m=+3528.008458328" Mar 18 10:01:04 crc kubenswrapper[4778]: I0318 10:01:04.440854 4778 generic.go:334] "Generic (PLEG): container finished" podID="8ace9f11-f4d8-4801-afa2-5b723d52d41e" containerID="28d152bac9f17e0efab4925b14ece7afdb8366b297f19851fea386af5ff7041d" exitCode=0 Mar 18 10:01:04 crc kubenswrapper[4778]: I0318 10:01:04.440926 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563801-nctwn" 
event={"ID":"8ace9f11-f4d8-4801-afa2-5b723d52d41e","Type":"ContainerDied","Data":"28d152bac9f17e0efab4925b14ece7afdb8366b297f19851fea386af5ff7041d"} Mar 18 10:01:05 crc kubenswrapper[4778]: I0318 10:01:05.933806 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.244354 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-fernet-keys\") pod \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.244471 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-config-data\") pod \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.244560 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsszw\" (UniqueName: \"kubernetes.io/projected/8ace9f11-f4d8-4801-afa2-5b723d52d41e-kube-api-access-bsszw\") pod \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.244733 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-combined-ca-bundle\") pod \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.253654 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "8ace9f11-f4d8-4801-afa2-5b723d52d41e" (UID: "8ace9f11-f4d8-4801-afa2-5b723d52d41e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.258548 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ace9f11-f4d8-4801-afa2-5b723d52d41e-kube-api-access-bsszw" (OuterVolumeSpecName: "kube-api-access-bsszw") pod "8ace9f11-f4d8-4801-afa2-5b723d52d41e" (UID: "8ace9f11-f4d8-4801-afa2-5b723d52d41e"). InnerVolumeSpecName "kube-api-access-bsszw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.330609 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ace9f11-f4d8-4801-afa2-5b723d52d41e" (UID: "8ace9f11-f4d8-4801-afa2-5b723d52d41e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.346499 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-config-data" (OuterVolumeSpecName: "config-data") pod "8ace9f11-f4d8-4801-afa2-5b723d52d41e" (UID: "8ace9f11-f4d8-4801-afa2-5b723d52d41e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.346782 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-config-data\") pod \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\" (UID: \"8ace9f11-f4d8-4801-afa2-5b723d52d41e\") " Mar 18 10:01:06 crc kubenswrapper[4778]: W0318 10:01:06.346915 4778 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/8ace9f11-f4d8-4801-afa2-5b723d52d41e/volumes/kubernetes.io~secret/config-data Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.346937 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-config-data" (OuterVolumeSpecName: "config-data") pod "8ace9f11-f4d8-4801-afa2-5b723d52d41e" (UID: "8ace9f11-f4d8-4801-afa2-5b723d52d41e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.347586 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.347619 4778 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.347633 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ace9f11-f4d8-4801-afa2-5b723d52d41e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.347646 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsszw\" (UniqueName: \"kubernetes.io/projected/8ace9f11-f4d8-4801-afa2-5b723d52d41e-kube-api-access-bsszw\") on node \"crc\" DevicePath \"\"" Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.456339 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563801-nctwn" event={"ID":"8ace9f11-f4d8-4801-afa2-5b723d52d41e","Type":"ContainerDied","Data":"6861f7ab8ee5902e26632016dd16afe79b26f8e149799abe01220689f2fa3927"} Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.456382 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6861f7ab8ee5902e26632016dd16afe79b26f8e149799abe01220689f2fa3927" Mar 18 10:01:06 crc kubenswrapper[4778]: I0318 10:01:06.456392 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29563801-nctwn" Mar 18 10:01:06 crc kubenswrapper[4778]: E0318 10:01:06.640659 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ace9f11_f4d8_4801_afa2_5b723d52d41e.slice/crio-6861f7ab8ee5902e26632016dd16afe79b26f8e149799abe01220689f2fa3927\": RecentStats: unable to find data in memory cache]" Mar 18 10:01:30 crc kubenswrapper[4778]: I0318 10:01:30.147253 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:01:30 crc kubenswrapper[4778]: I0318 10:01:30.147892 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:01:30 crc kubenswrapper[4778]: I0318 10:01:30.147943 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 10:01:30 crc kubenswrapper[4778]: I0318 10:01:30.149075 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4a5832ad33a24fa897a57dad207c0db67df6b8bbaaa82af43426329f350e2e07"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 10:01:30 crc kubenswrapper[4778]: I0318 10:01:30.149134 4778 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://4a5832ad33a24fa897a57dad207c0db67df6b8bbaaa82af43426329f350e2e07" gracePeriod=600 Mar 18 10:01:30 crc kubenswrapper[4778]: I0318 10:01:30.678144 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="4a5832ad33a24fa897a57dad207c0db67df6b8bbaaa82af43426329f350e2e07" exitCode=0 Mar 18 10:01:30 crc kubenswrapper[4778]: I0318 10:01:30.678186 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"4a5832ad33a24fa897a57dad207c0db67df6b8bbaaa82af43426329f350e2e07"} Mar 18 10:01:30 crc kubenswrapper[4778]: I0318 10:01:30.678915 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190"} Mar 18 10:01:30 crc kubenswrapper[4778]: I0318 10:01:30.678959 4778 scope.go:117] "RemoveContainer" containerID="f3eb3a562c8d07b936b8c887ab8ed99964c9691f8cc0d3592f72fbbd888ff3f3" Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.151890 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563802-2blvs"] Mar 18 10:02:00 crc kubenswrapper[4778]: E0318 10:02:00.154096 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ace9f11-f4d8-4801-afa2-5b723d52d41e" containerName="keystone-cron" Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.154241 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ace9f11-f4d8-4801-afa2-5b723d52d41e" containerName="keystone-cron" Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.154655 
4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ace9f11-f4d8-4801-afa2-5b723d52d41e" containerName="keystone-cron" Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.155649 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563802-2blvs" Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.158991 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.159020 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.161613 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563802-2blvs"] Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.164505 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.190967 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnb56\" (UniqueName: \"kubernetes.io/projected/02a9f934-8e78-4c0c-b0cc-59cd49030b5c-kube-api-access-vnb56\") pod \"auto-csr-approver-29563802-2blvs\" (UID: \"02a9f934-8e78-4c0c-b0cc-59cd49030b5c\") " pod="openshift-infra/auto-csr-approver-29563802-2blvs" Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.293157 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnb56\" (UniqueName: \"kubernetes.io/projected/02a9f934-8e78-4c0c-b0cc-59cd49030b5c-kube-api-access-vnb56\") pod \"auto-csr-approver-29563802-2blvs\" (UID: \"02a9f934-8e78-4c0c-b0cc-59cd49030b5c\") " pod="openshift-infra/auto-csr-approver-29563802-2blvs" Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.320113 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vnb56\" (UniqueName: \"kubernetes.io/projected/02a9f934-8e78-4c0c-b0cc-59cd49030b5c-kube-api-access-vnb56\") pod \"auto-csr-approver-29563802-2blvs\" (UID: \"02a9f934-8e78-4c0c-b0cc-59cd49030b5c\") " pod="openshift-infra/auto-csr-approver-29563802-2blvs" Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.496066 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563802-2blvs" Mar 18 10:02:00 crc kubenswrapper[4778]: I0318 10:02:00.985681 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563802-2blvs"] Mar 18 10:02:01 crc kubenswrapper[4778]: I0318 10:02:01.965489 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563802-2blvs" event={"ID":"02a9f934-8e78-4c0c-b0cc-59cd49030b5c","Type":"ContainerStarted","Data":"475b2926bb7d3ae12ec8845273a247fb8f09834a4a363bd68973f3e3039f20d1"} Mar 18 10:02:02 crc kubenswrapper[4778]: I0318 10:02:02.977262 4778 generic.go:334] "Generic (PLEG): container finished" podID="02a9f934-8e78-4c0c-b0cc-59cd49030b5c" containerID="048eac64aca1190d343bcd6e5968c051bfb3de1baa3c94a83313a7e1b9b996de" exitCode=0 Mar 18 10:02:02 crc kubenswrapper[4778]: I0318 10:02:02.977345 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563802-2blvs" event={"ID":"02a9f934-8e78-4c0c-b0cc-59cd49030b5c","Type":"ContainerDied","Data":"048eac64aca1190d343bcd6e5968c051bfb3de1baa3c94a83313a7e1b9b996de"} Mar 18 10:02:04 crc kubenswrapper[4778]: I0318 10:02:04.527624 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563802-2blvs" Mar 18 10:02:04 crc kubenswrapper[4778]: I0318 10:02:04.676691 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnb56\" (UniqueName: \"kubernetes.io/projected/02a9f934-8e78-4c0c-b0cc-59cd49030b5c-kube-api-access-vnb56\") pod \"02a9f934-8e78-4c0c-b0cc-59cd49030b5c\" (UID: \"02a9f934-8e78-4c0c-b0cc-59cd49030b5c\") " Mar 18 10:02:04 crc kubenswrapper[4778]: I0318 10:02:04.695444 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02a9f934-8e78-4c0c-b0cc-59cd49030b5c-kube-api-access-vnb56" (OuterVolumeSpecName: "kube-api-access-vnb56") pod "02a9f934-8e78-4c0c-b0cc-59cd49030b5c" (UID: "02a9f934-8e78-4c0c-b0cc-59cd49030b5c"). InnerVolumeSpecName "kube-api-access-vnb56". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:02:04 crc kubenswrapper[4778]: I0318 10:02:04.779470 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnb56\" (UniqueName: \"kubernetes.io/projected/02a9f934-8e78-4c0c-b0cc-59cd49030b5c-kube-api-access-vnb56\") on node \"crc\" DevicePath \"\"" Mar 18 10:02:04 crc kubenswrapper[4778]: I0318 10:02:04.996492 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563802-2blvs" event={"ID":"02a9f934-8e78-4c0c-b0cc-59cd49030b5c","Type":"ContainerDied","Data":"475b2926bb7d3ae12ec8845273a247fb8f09834a4a363bd68973f3e3039f20d1"} Mar 18 10:02:04 crc kubenswrapper[4778]: I0318 10:02:04.996548 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="475b2926bb7d3ae12ec8845273a247fb8f09834a4a363bd68973f3e3039f20d1" Mar 18 10:02:04 crc kubenswrapper[4778]: I0318 10:02:04.996624 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563802-2blvs" Mar 18 10:02:05 crc kubenswrapper[4778]: I0318 10:02:05.601461 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563796-r7bfh"] Mar 18 10:02:05 crc kubenswrapper[4778]: I0318 10:02:05.610433 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563796-r7bfh"] Mar 18 10:02:06 crc kubenswrapper[4778]: I0318 10:02:06.198424 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="952d8866-a2f9-46d0-aa7b-5e578dd2f3c7" path="/var/lib/kubelet/pods/952d8866-a2f9-46d0-aa7b-5e578dd2f3c7/volumes" Mar 18 10:02:30 crc kubenswrapper[4778]: I0318 10:02:30.858822 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qb8tg"] Mar 18 10:02:30 crc kubenswrapper[4778]: E0318 10:02:30.868687 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a9f934-8e78-4c0c-b0cc-59cd49030b5c" containerName="oc" Mar 18 10:02:30 crc kubenswrapper[4778]: I0318 10:02:30.868946 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a9f934-8e78-4c0c-b0cc-59cd49030b5c" containerName="oc" Mar 18 10:02:30 crc kubenswrapper[4778]: I0318 10:02:30.869757 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a9f934-8e78-4c0c-b0cc-59cd49030b5c" containerName="oc" Mar 18 10:02:30 crc kubenswrapper[4778]: I0318 10:02:30.872912 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:30 crc kubenswrapper[4778]: I0318 10:02:30.902920 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb8tg"] Mar 18 10:02:30 crc kubenswrapper[4778]: I0318 10:02:30.952066 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c2e606c-94aa-4c97-aef4-741fc7402bac-catalog-content\") pod \"redhat-marketplace-qb8tg\" (UID: \"0c2e606c-94aa-4c97-aef4-741fc7402bac\") " pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:30 crc kubenswrapper[4778]: I0318 10:02:30.952477 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c2e606c-94aa-4c97-aef4-741fc7402bac-utilities\") pod \"redhat-marketplace-qb8tg\" (UID: \"0c2e606c-94aa-4c97-aef4-741fc7402bac\") " pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:30 crc kubenswrapper[4778]: I0318 10:02:30.952617 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsvsd\" (UniqueName: \"kubernetes.io/projected/0c2e606c-94aa-4c97-aef4-741fc7402bac-kube-api-access-qsvsd\") pod \"redhat-marketplace-qb8tg\" (UID: \"0c2e606c-94aa-4c97-aef4-741fc7402bac\") " pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:31 crc kubenswrapper[4778]: I0318 10:02:31.055072 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c2e606c-94aa-4c97-aef4-741fc7402bac-utilities\") pod \"redhat-marketplace-qb8tg\" (UID: \"0c2e606c-94aa-4c97-aef4-741fc7402bac\") " pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:31 crc kubenswrapper[4778]: I0318 10:02:31.055147 4778 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-qsvsd\" (UniqueName: \"kubernetes.io/projected/0c2e606c-94aa-4c97-aef4-741fc7402bac-kube-api-access-qsvsd\") pod \"redhat-marketplace-qb8tg\" (UID: \"0c2e606c-94aa-4c97-aef4-741fc7402bac\") " pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:31 crc kubenswrapper[4778]: I0318 10:02:31.055288 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c2e606c-94aa-4c97-aef4-741fc7402bac-catalog-content\") pod \"redhat-marketplace-qb8tg\" (UID: \"0c2e606c-94aa-4c97-aef4-741fc7402bac\") " pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:31 crc kubenswrapper[4778]: I0318 10:02:31.056224 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c2e606c-94aa-4c97-aef4-741fc7402bac-utilities\") pod \"redhat-marketplace-qb8tg\" (UID: \"0c2e606c-94aa-4c97-aef4-741fc7402bac\") " pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:31 crc kubenswrapper[4778]: I0318 10:02:31.056334 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c2e606c-94aa-4c97-aef4-741fc7402bac-catalog-content\") pod \"redhat-marketplace-qb8tg\" (UID: \"0c2e606c-94aa-4c97-aef4-741fc7402bac\") " pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:31 crc kubenswrapper[4778]: I0318 10:02:31.075509 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsvsd\" (UniqueName: \"kubernetes.io/projected/0c2e606c-94aa-4c97-aef4-741fc7402bac-kube-api-access-qsvsd\") pod \"redhat-marketplace-qb8tg\" (UID: \"0c2e606c-94aa-4c97-aef4-741fc7402bac\") " pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:31 crc kubenswrapper[4778]: I0318 10:02:31.210177 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:31 crc kubenswrapper[4778]: I0318 10:02:31.728224 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb8tg"] Mar 18 10:02:32 crc kubenswrapper[4778]: I0318 10:02:32.244690 4778 generic.go:334] "Generic (PLEG): container finished" podID="0c2e606c-94aa-4c97-aef4-741fc7402bac" containerID="c216d84d647f197c8e2d0d69bb1f532f6b84bf1e59822a0452e55a0d50a41a13" exitCode=0 Mar 18 10:02:32 crc kubenswrapper[4778]: I0318 10:02:32.245057 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb8tg" event={"ID":"0c2e606c-94aa-4c97-aef4-741fc7402bac","Type":"ContainerDied","Data":"c216d84d647f197c8e2d0d69bb1f532f6b84bf1e59822a0452e55a0d50a41a13"} Mar 18 10:02:32 crc kubenswrapper[4778]: I0318 10:02:32.245136 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb8tg" event={"ID":"0c2e606c-94aa-4c97-aef4-741fc7402bac","Type":"ContainerStarted","Data":"a55e38478b117edd30ba7fd52a6bbb758ea162a56775c1ebaade4a1d8fc531b7"} Mar 18 10:02:33 crc kubenswrapper[4778]: I0318 10:02:33.255141 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb8tg" event={"ID":"0c2e606c-94aa-4c97-aef4-741fc7402bac","Type":"ContainerStarted","Data":"611f05ef4423a106409e2c85d8c11e7467db2d5e6d2dc1b34f272f14b2be2e51"} Mar 18 10:02:33 crc kubenswrapper[4778]: I0318 10:02:33.655585 4778 scope.go:117] "RemoveContainer" containerID="e43c5819f1ce670a94fb282c0d10a53cccd0c70528dde0f82e60b4168a0b1dd9" Mar 18 10:02:34 crc kubenswrapper[4778]: I0318 10:02:34.266095 4778 generic.go:334] "Generic (PLEG): container finished" podID="0c2e606c-94aa-4c97-aef4-741fc7402bac" containerID="611f05ef4423a106409e2c85d8c11e7467db2d5e6d2dc1b34f272f14b2be2e51" exitCode=0 Mar 18 10:02:34 crc kubenswrapper[4778]: I0318 10:02:34.266268 4778 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb8tg" event={"ID":"0c2e606c-94aa-4c97-aef4-741fc7402bac","Type":"ContainerDied","Data":"611f05ef4423a106409e2c85d8c11e7467db2d5e6d2dc1b34f272f14b2be2e51"} Mar 18 10:02:35 crc kubenswrapper[4778]: I0318 10:02:35.278228 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb8tg" event={"ID":"0c2e606c-94aa-4c97-aef4-741fc7402bac","Type":"ContainerStarted","Data":"db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020"} Mar 18 10:02:35 crc kubenswrapper[4778]: I0318 10:02:35.308895 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qb8tg" podStartSLOduration=2.871759668 podStartE2EDuration="5.308867688s" podCreationTimestamp="2026-03-18 10:02:30 +0000 UTC" firstStartedPulling="2026-03-18 10:02:32.247843609 +0000 UTC m=+3618.822588449" lastFinishedPulling="2026-03-18 10:02:34.684951629 +0000 UTC m=+3621.259696469" observedRunningTime="2026-03-18 10:02:35.294473717 +0000 UTC m=+3621.869218557" watchObservedRunningTime="2026-03-18 10:02:35.308867688 +0000 UTC m=+3621.883612528" Mar 18 10:02:41 crc kubenswrapper[4778]: I0318 10:02:41.211256 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:41 crc kubenswrapper[4778]: I0318 10:02:41.212096 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:41 crc kubenswrapper[4778]: I0318 10:02:41.256965 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:41 crc kubenswrapper[4778]: I0318 10:02:41.377005 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:41 crc kubenswrapper[4778]: I0318 
10:02:41.498433 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb8tg"] Mar 18 10:02:43 crc kubenswrapper[4778]: I0318 10:02:43.347585 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qb8tg" podUID="0c2e606c-94aa-4c97-aef4-741fc7402bac" containerName="registry-server" containerID="cri-o://db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020" gracePeriod=2 Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.036600 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.128937 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c2e606c-94aa-4c97-aef4-741fc7402bac-catalog-content\") pod \"0c2e606c-94aa-4c97-aef4-741fc7402bac\" (UID: \"0c2e606c-94aa-4c97-aef4-741fc7402bac\") " Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.129045 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c2e606c-94aa-4c97-aef4-741fc7402bac-utilities\") pod \"0c2e606c-94aa-4c97-aef4-741fc7402bac\" (UID: \"0c2e606c-94aa-4c97-aef4-741fc7402bac\") " Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.129181 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsvsd\" (UniqueName: \"kubernetes.io/projected/0c2e606c-94aa-4c97-aef4-741fc7402bac-kube-api-access-qsvsd\") pod \"0c2e606c-94aa-4c97-aef4-741fc7402bac\" (UID: \"0c2e606c-94aa-4c97-aef4-741fc7402bac\") " Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.130736 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c2e606c-94aa-4c97-aef4-741fc7402bac-utilities" (OuterVolumeSpecName: 
"utilities") pod "0c2e606c-94aa-4c97-aef4-741fc7402bac" (UID: "0c2e606c-94aa-4c97-aef4-741fc7402bac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.136743 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c2e606c-94aa-4c97-aef4-741fc7402bac-kube-api-access-qsvsd" (OuterVolumeSpecName: "kube-api-access-qsvsd") pod "0c2e606c-94aa-4c97-aef4-741fc7402bac" (UID: "0c2e606c-94aa-4c97-aef4-741fc7402bac"). InnerVolumeSpecName "kube-api-access-qsvsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.155884 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c2e606c-94aa-4c97-aef4-741fc7402bac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c2e606c-94aa-4c97-aef4-741fc7402bac" (UID: "0c2e606c-94aa-4c97-aef4-741fc7402bac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.234163 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsvsd\" (UniqueName: \"kubernetes.io/projected/0c2e606c-94aa-4c97-aef4-741fc7402bac-kube-api-access-qsvsd\") on node \"crc\" DevicePath \"\"" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.234527 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c2e606c-94aa-4c97-aef4-741fc7402bac-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.234544 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c2e606c-94aa-4c97-aef4-741fc7402bac-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.358279 4778 generic.go:334] "Generic (PLEG): container finished" podID="0c2e606c-94aa-4c97-aef4-741fc7402bac" containerID="db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020" exitCode=0 Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.358322 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb8tg" event={"ID":"0c2e606c-94aa-4c97-aef4-741fc7402bac","Type":"ContainerDied","Data":"db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020"} Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.358352 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb8tg" event={"ID":"0c2e606c-94aa-4c97-aef4-741fc7402bac","Type":"ContainerDied","Data":"a55e38478b117edd30ba7fd52a6bbb758ea162a56775c1ebaade4a1d8fc531b7"} Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.358377 4778 scope.go:117] "RemoveContainer" containerID="db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 
10:02:44.358371 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qb8tg" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.383506 4778 scope.go:117] "RemoveContainer" containerID="611f05ef4423a106409e2c85d8c11e7467db2d5e6d2dc1b34f272f14b2be2e51" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.386363 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb8tg"] Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.399804 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb8tg"] Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.407040 4778 scope.go:117] "RemoveContainer" containerID="c216d84d647f197c8e2d0d69bb1f532f6b84bf1e59822a0452e55a0d50a41a13" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.458901 4778 scope.go:117] "RemoveContainer" containerID="db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020" Mar 18 10:02:44 crc kubenswrapper[4778]: E0318 10:02:44.459513 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020\": container with ID starting with db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020 not found: ID does not exist" containerID="db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.459566 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020"} err="failed to get container status \"db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020\": rpc error: code = NotFound desc = could not find container \"db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020\": container with ID starting with 
db0e75076e51b97bd228f5848bfb10c96618700959bd6d9cfa574cea614a8020 not found: ID does not exist" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.459600 4778 scope.go:117] "RemoveContainer" containerID="611f05ef4423a106409e2c85d8c11e7467db2d5e6d2dc1b34f272f14b2be2e51" Mar 18 10:02:44 crc kubenswrapper[4778]: E0318 10:02:44.460004 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"611f05ef4423a106409e2c85d8c11e7467db2d5e6d2dc1b34f272f14b2be2e51\": container with ID starting with 611f05ef4423a106409e2c85d8c11e7467db2d5e6d2dc1b34f272f14b2be2e51 not found: ID does not exist" containerID="611f05ef4423a106409e2c85d8c11e7467db2d5e6d2dc1b34f272f14b2be2e51" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.460045 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"611f05ef4423a106409e2c85d8c11e7467db2d5e6d2dc1b34f272f14b2be2e51"} err="failed to get container status \"611f05ef4423a106409e2c85d8c11e7467db2d5e6d2dc1b34f272f14b2be2e51\": rpc error: code = NotFound desc = could not find container \"611f05ef4423a106409e2c85d8c11e7467db2d5e6d2dc1b34f272f14b2be2e51\": container with ID starting with 611f05ef4423a106409e2c85d8c11e7467db2d5e6d2dc1b34f272f14b2be2e51 not found: ID does not exist" Mar 18 10:02:44 crc kubenswrapper[4778]: I0318 10:02:44.460073 4778 scope.go:117] "RemoveContainer" containerID="c216d84d647f197c8e2d0d69bb1f532f6b84bf1e59822a0452e55a0d50a41a13" Mar 18 10:02:44 crc kubenswrapper[4778]: E0318 10:02:44.460432 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c216d84d647f197c8e2d0d69bb1f532f6b84bf1e59822a0452e55a0d50a41a13\": container with ID starting with c216d84d647f197c8e2d0d69bb1f532f6b84bf1e59822a0452e55a0d50a41a13 not found: ID does not exist" containerID="c216d84d647f197c8e2d0d69bb1f532f6b84bf1e59822a0452e55a0d50a41a13" Mar 18 10:02:44 crc 
kubenswrapper[4778]: I0318 10:02:44.460472 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c216d84d647f197c8e2d0d69bb1f532f6b84bf1e59822a0452e55a0d50a41a13"} err="failed to get container status \"c216d84d647f197c8e2d0d69bb1f532f6b84bf1e59822a0452e55a0d50a41a13\": rpc error: code = NotFound desc = could not find container \"c216d84d647f197c8e2d0d69bb1f532f6b84bf1e59822a0452e55a0d50a41a13\": container with ID starting with c216d84d647f197c8e2d0d69bb1f532f6b84bf1e59822a0452e55a0d50a41a13 not found: ID does not exist" Mar 18 10:02:46 crc kubenswrapper[4778]: I0318 10:02:46.199262 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c2e606c-94aa-4c97-aef4-741fc7402bac" path="/var/lib/kubelet/pods/0c2e606c-94aa-4c97-aef4-741fc7402bac/volumes" Mar 18 10:03:19 crc kubenswrapper[4778]: I0318 10:03:19.037074 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-91af-account-create-update-cc4d5"] Mar 18 10:03:19 crc kubenswrapper[4778]: I0318 10:03:19.045828 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-j5mf6"] Mar 18 10:03:19 crc kubenswrapper[4778]: I0318 10:03:19.054596 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-91af-account-create-update-cc4d5"] Mar 18 10:03:19 crc kubenswrapper[4778]: I0318 10:03:19.063999 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-j5mf6"] Mar 18 10:03:20 crc kubenswrapper[4778]: I0318 10:03:20.196846 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57dd6190-5149-44a9-8a75-7e3d9077a43c" path="/var/lib/kubelet/pods/57dd6190-5149-44a9-8a75-7e3d9077a43c/volumes" Mar 18 10:03:20 crc kubenswrapper[4778]: I0318 10:03:20.198115 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9f651dd-ff4a-46c9-bd8c-0155be07f0a0" path="/var/lib/kubelet/pods/c9f651dd-ff4a-46c9-bd8c-0155be07f0a0/volumes" Mar 18 
10:03:30 crc kubenswrapper[4778]: I0318 10:03:30.147698 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:03:30 crc kubenswrapper[4778]: I0318 10:03:30.148358 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:03:33 crc kubenswrapper[4778]: I0318 10:03:33.844834 4778 scope.go:117] "RemoveContainer" containerID="fc0fef996b4b9a5437de59c8b2ac8a5e7d95ba6ac33a74f54e7f79985c001e66" Mar 18 10:03:33 crc kubenswrapper[4778]: I0318 10:03:33.872496 4778 scope.go:117] "RemoveContainer" containerID="a03567ec27a1b447b64f31d017f135a530faf96727b9cdb25f25df3c2b11ab27" Mar 18 10:03:38 crc kubenswrapper[4778]: I0318 10:03:38.035319 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-fnlvs"] Mar 18 10:03:38 crc kubenswrapper[4778]: I0318 10:03:38.044361 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-fnlvs"] Mar 18 10:03:38 crc kubenswrapper[4778]: I0318 10:03:38.198253 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86f9ef2c-6a05-438a-a701-92c9ef84d46d" path="/var/lib/kubelet/pods/86f9ef2c-6a05-438a-a701-92c9ef84d46d/volumes" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.147053 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563804-fx6ns"] Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.147477 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.147999 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:04:00 crc kubenswrapper[4778]: E0318 10:04:00.148299 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c2e606c-94aa-4c97-aef4-741fc7402bac" containerName="registry-server" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.148318 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c2e606c-94aa-4c97-aef4-741fc7402bac" containerName="registry-server" Mar 18 10:04:00 crc kubenswrapper[4778]: E0318 10:04:00.148332 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c2e606c-94aa-4c97-aef4-741fc7402bac" containerName="extract-utilities" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.148341 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c2e606c-94aa-4c97-aef4-741fc7402bac" containerName="extract-utilities" Mar 18 10:04:00 crc kubenswrapper[4778]: E0318 10:04:00.148357 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c2e606c-94aa-4c97-aef4-741fc7402bac" containerName="extract-content" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.148367 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c2e606c-94aa-4c97-aef4-741fc7402bac" containerName="extract-content" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.148620 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c2e606c-94aa-4c97-aef4-741fc7402bac" containerName="registry-server" Mar 18 10:04:00 
crc kubenswrapper[4778]: I0318 10:04:00.149508 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563804-fx6ns" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.151388 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.152538 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.152683 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.162828 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563804-fx6ns"] Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.248924 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-799vx\" (UniqueName: \"kubernetes.io/projected/5479124a-5b9d-403a-baf8-0e03ec15c707-kube-api-access-799vx\") pod \"auto-csr-approver-29563804-fx6ns\" (UID: \"5479124a-5b9d-403a-baf8-0e03ec15c707\") " pod="openshift-infra/auto-csr-approver-29563804-fx6ns" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.352999 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-799vx\" (UniqueName: \"kubernetes.io/projected/5479124a-5b9d-403a-baf8-0e03ec15c707-kube-api-access-799vx\") pod \"auto-csr-approver-29563804-fx6ns\" (UID: \"5479124a-5b9d-403a-baf8-0e03ec15c707\") " pod="openshift-infra/auto-csr-approver-29563804-fx6ns" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.381042 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-799vx\" (UniqueName: \"kubernetes.io/projected/5479124a-5b9d-403a-baf8-0e03ec15c707-kube-api-access-799vx\") 
pod \"auto-csr-approver-29563804-fx6ns\" (UID: \"5479124a-5b9d-403a-baf8-0e03ec15c707\") " pod="openshift-infra/auto-csr-approver-29563804-fx6ns" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.473607 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563804-fx6ns" Mar 18 10:04:00 crc kubenswrapper[4778]: I0318 10:04:00.988119 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563804-fx6ns"] Mar 18 10:04:01 crc kubenswrapper[4778]: I0318 10:04:01.042766 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563804-fx6ns" event={"ID":"5479124a-5b9d-403a-baf8-0e03ec15c707","Type":"ContainerStarted","Data":"53a513fbd91a8f5fce5776158592c3922ba4bd0582fe66a2b01c5a8c1dbc7219"} Mar 18 10:04:03 crc kubenswrapper[4778]: I0318 10:04:03.068994 4778 generic.go:334] "Generic (PLEG): container finished" podID="5479124a-5b9d-403a-baf8-0e03ec15c707" containerID="9989303583a93d583be4ddef72e12372e2db44e67cf888c7373ff47c4f5bfee9" exitCode=0 Mar 18 10:04:03 crc kubenswrapper[4778]: I0318 10:04:03.069071 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563804-fx6ns" event={"ID":"5479124a-5b9d-403a-baf8-0e03ec15c707","Type":"ContainerDied","Data":"9989303583a93d583be4ddef72e12372e2db44e67cf888c7373ff47c4f5bfee9"} Mar 18 10:04:04 crc kubenswrapper[4778]: I0318 10:04:04.645104 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563804-fx6ns" Mar 18 10:04:04 crc kubenswrapper[4778]: I0318 10:04:04.745019 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-799vx\" (UniqueName: \"kubernetes.io/projected/5479124a-5b9d-403a-baf8-0e03ec15c707-kube-api-access-799vx\") pod \"5479124a-5b9d-403a-baf8-0e03ec15c707\" (UID: \"5479124a-5b9d-403a-baf8-0e03ec15c707\") " Mar 18 10:04:04 crc kubenswrapper[4778]: I0318 10:04:04.751594 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5479124a-5b9d-403a-baf8-0e03ec15c707-kube-api-access-799vx" (OuterVolumeSpecName: "kube-api-access-799vx") pod "5479124a-5b9d-403a-baf8-0e03ec15c707" (UID: "5479124a-5b9d-403a-baf8-0e03ec15c707"). InnerVolumeSpecName "kube-api-access-799vx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:04:04 crc kubenswrapper[4778]: I0318 10:04:04.847896 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-799vx\" (UniqueName: \"kubernetes.io/projected/5479124a-5b9d-403a-baf8-0e03ec15c707-kube-api-access-799vx\") on node \"crc\" DevicePath \"\"" Mar 18 10:04:05 crc kubenswrapper[4778]: I0318 10:04:05.088670 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563804-fx6ns" event={"ID":"5479124a-5b9d-403a-baf8-0e03ec15c707","Type":"ContainerDied","Data":"53a513fbd91a8f5fce5776158592c3922ba4bd0582fe66a2b01c5a8c1dbc7219"} Mar 18 10:04:05 crc kubenswrapper[4778]: I0318 10:04:05.088981 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53a513fbd91a8f5fce5776158592c3922ba4bd0582fe66a2b01c5a8c1dbc7219" Mar 18 10:04:05 crc kubenswrapper[4778]: I0318 10:04:05.088705 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563804-fx6ns" Mar 18 10:04:05 crc kubenswrapper[4778]: I0318 10:04:05.712972 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563798-7x8l8"] Mar 18 10:04:05 crc kubenswrapper[4778]: I0318 10:04:05.728023 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563798-7x8l8"] Mar 18 10:04:06 crc kubenswrapper[4778]: I0318 10:04:06.197498 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18a30920-760a-4dd3-ac4a-63b9add62521" path="/var/lib/kubelet/pods/18a30920-760a-4dd3-ac4a-63b9add62521/volumes" Mar 18 10:04:30 crc kubenswrapper[4778]: I0318 10:04:30.148064 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:04:30 crc kubenswrapper[4778]: I0318 10:04:30.148788 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:04:30 crc kubenswrapper[4778]: I0318 10:04:30.148855 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 10:04:30 crc kubenswrapper[4778]: I0318 10:04:30.150043 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 10:04:30 crc kubenswrapper[4778]: I0318 10:04:30.150150 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" gracePeriod=600 Mar 18 10:04:30 crc kubenswrapper[4778]: E0318 10:04:30.275019 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:04:30 crc kubenswrapper[4778]: I0318 10:04:30.306861 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" exitCode=0 Mar 18 10:04:30 crc kubenswrapper[4778]: I0318 10:04:30.306924 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190"} Mar 18 10:04:30 crc kubenswrapper[4778]: I0318 10:04:30.306980 4778 scope.go:117] "RemoveContainer" containerID="4a5832ad33a24fa897a57dad207c0db67df6b8bbaaa82af43426329f350e2e07" Mar 18 10:04:30 crc kubenswrapper[4778]: I0318 10:04:30.308279 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:04:30 crc kubenswrapper[4778]: E0318 10:04:30.308676 4778 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:04:33 crc kubenswrapper[4778]: I0318 10:04:33.982563 4778 scope.go:117] "RemoveContainer" containerID="c09837191b6ba16c2c4c1ba6934469f4e373f1877eaa0a419ae86d94a526194d" Mar 18 10:04:34 crc kubenswrapper[4778]: I0318 10:04:34.017691 4778 scope.go:117] "RemoveContainer" containerID="d0bc455b5828f2dc8d018076f6b57f0e3f54f0daa7e1a53584affe8f4dab5285" Mar 18 10:04:41 crc kubenswrapper[4778]: I0318 10:04:41.186699 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:04:41 crc kubenswrapper[4778]: E0318 10:04:41.187677 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:04:52 crc kubenswrapper[4778]: I0318 10:04:52.187041 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:04:52 crc kubenswrapper[4778]: E0318 10:04:52.187898 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:05:04 crc kubenswrapper[4778]: I0318 10:05:04.194278 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:05:04 crc kubenswrapper[4778]: E0318 10:05:04.195573 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:05:17 crc kubenswrapper[4778]: I0318 10:05:17.188392 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:05:17 crc kubenswrapper[4778]: E0318 10:05:17.189137 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:05:32 crc kubenswrapper[4778]: I0318 10:05:32.187848 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:05:32 crc kubenswrapper[4778]: E0318 10:05:32.188950 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.042998 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w6ds2"] Mar 18 10:05:42 crc kubenswrapper[4778]: E0318 10:05:42.044567 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5479124a-5b9d-403a-baf8-0e03ec15c707" containerName="oc" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.044600 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5479124a-5b9d-403a-baf8-0e03ec15c707" containerName="oc" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.045067 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5479124a-5b9d-403a-baf8-0e03ec15c707" containerName="oc" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.047620 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.060148 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w6ds2"] Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.191911 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qnh6\" (UniqueName: \"kubernetes.io/projected/3985ebd1-17ce-47b8-b029-b521e40d6bb2-kube-api-access-8qnh6\") pod \"redhat-operators-w6ds2\" (UID: \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\") " pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.191999 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3985ebd1-17ce-47b8-b029-b521e40d6bb2-utilities\") pod \"redhat-operators-w6ds2\" (UID: \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\") " pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.192067 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3985ebd1-17ce-47b8-b029-b521e40d6bb2-catalog-content\") pod \"redhat-operators-w6ds2\" (UID: \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\") " pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.293640 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qnh6\" (UniqueName: \"kubernetes.io/projected/3985ebd1-17ce-47b8-b029-b521e40d6bb2-kube-api-access-8qnh6\") pod \"redhat-operators-w6ds2\" (UID: \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\") " pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.293792 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3985ebd1-17ce-47b8-b029-b521e40d6bb2-utilities\") pod \"redhat-operators-w6ds2\" (UID: \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\") " pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.293876 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3985ebd1-17ce-47b8-b029-b521e40d6bb2-catalog-content\") pod \"redhat-operators-w6ds2\" (UID: \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\") " pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.296503 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3985ebd1-17ce-47b8-b029-b521e40d6bb2-utilities\") pod \"redhat-operators-w6ds2\" (UID: \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\") " pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.296884 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3985ebd1-17ce-47b8-b029-b521e40d6bb2-catalog-content\") pod \"redhat-operators-w6ds2\" (UID: \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\") " pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.322092 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qnh6\" (UniqueName: \"kubernetes.io/projected/3985ebd1-17ce-47b8-b029-b521e40d6bb2-kube-api-access-8qnh6\") pod \"redhat-operators-w6ds2\" (UID: \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\") " pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.391277 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:42 crc kubenswrapper[4778]: I0318 10:05:42.971146 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w6ds2"] Mar 18 10:05:43 crc kubenswrapper[4778]: I0318 10:05:43.187600 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:05:43 crc kubenswrapper[4778]: E0318 10:05:43.187928 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:05:43 crc kubenswrapper[4778]: I0318 10:05:43.936927 4778 generic.go:334] "Generic (PLEG): container finished" podID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" containerID="a1d371aeb3c2daa9b257ba28144c2564882478d29826e8a52354c4353dbbc156" exitCode=0 Mar 18 10:05:43 crc kubenswrapper[4778]: I0318 10:05:43.937281 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6ds2" event={"ID":"3985ebd1-17ce-47b8-b029-b521e40d6bb2","Type":"ContainerDied","Data":"a1d371aeb3c2daa9b257ba28144c2564882478d29826e8a52354c4353dbbc156"} Mar 18 10:05:43 crc kubenswrapper[4778]: I0318 10:05:43.938241 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6ds2" event={"ID":"3985ebd1-17ce-47b8-b029-b521e40d6bb2","Type":"ContainerStarted","Data":"ff1791fd6dc7166759becaa22e236a584f49d5a34e46f1ae3cd70fe242aa8182"} Mar 18 10:05:43 crc kubenswrapper[4778]: I0318 10:05:43.939816 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 10:05:44 crc 
kubenswrapper[4778]: I0318 10:05:44.225767 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k9mc4"] Mar 18 10:05:44 crc kubenswrapper[4778]: I0318 10:05:44.228549 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:44 crc kubenswrapper[4778]: I0318 10:05:44.254011 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k9mc4"] Mar 18 10:05:44 crc kubenswrapper[4778]: I0318 10:05:44.335679 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-utilities\") pod \"certified-operators-k9mc4\" (UID: \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\") " pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:44 crc kubenswrapper[4778]: I0318 10:05:44.336649 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-catalog-content\") pod \"certified-operators-k9mc4\" (UID: \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\") " pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:44 crc kubenswrapper[4778]: I0318 10:05:44.336756 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgqzh\" (UniqueName: \"kubernetes.io/projected/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-kube-api-access-fgqzh\") pod \"certified-operators-k9mc4\" (UID: \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\") " pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:44 crc kubenswrapper[4778]: I0318 10:05:44.438047 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-catalog-content\") pod \"certified-operators-k9mc4\" (UID: \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\") " pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:44 crc kubenswrapper[4778]: I0318 10:05:44.438106 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgqzh\" (UniqueName: \"kubernetes.io/projected/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-kube-api-access-fgqzh\") pod \"certified-operators-k9mc4\" (UID: \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\") " pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:44 crc kubenswrapper[4778]: I0318 10:05:44.438254 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-utilities\") pod \"certified-operators-k9mc4\" (UID: \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\") " pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:44 crc kubenswrapper[4778]: I0318 10:05:44.438699 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-utilities\") pod \"certified-operators-k9mc4\" (UID: \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\") " pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:44 crc kubenswrapper[4778]: I0318 10:05:44.438919 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-catalog-content\") pod \"certified-operators-k9mc4\" (UID: \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\") " pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:44 crc kubenswrapper[4778]: I0318 10:05:44.460642 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgqzh\" (UniqueName: 
\"kubernetes.io/projected/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-kube-api-access-fgqzh\") pod \"certified-operators-k9mc4\" (UID: \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\") " pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:44 crc kubenswrapper[4778]: I0318 10:05:44.553542 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:45 crc kubenswrapper[4778]: I0318 10:05:45.154055 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k9mc4"] Mar 18 10:05:45 crc kubenswrapper[4778]: I0318 10:05:45.972384 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6ds2" event={"ID":"3985ebd1-17ce-47b8-b029-b521e40d6bb2","Type":"ContainerStarted","Data":"025f161e43851b11f29cac7d50027307d4acb5902fe9fd02060fded046bef942"} Mar 18 10:05:45 crc kubenswrapper[4778]: I0318 10:05:45.974725 4778 generic.go:334] "Generic (PLEG): container finished" podID="3ceaceba-a6ca-4b0f-8964-8079b9dbb102" containerID="71c6b267eb357130fd3f237b4be7977f618ffb7d48114d8994ae1365bb6886db" exitCode=0 Mar 18 10:05:45 crc kubenswrapper[4778]: I0318 10:05:45.974770 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9mc4" event={"ID":"3ceaceba-a6ca-4b0f-8964-8079b9dbb102","Type":"ContainerDied","Data":"71c6b267eb357130fd3f237b4be7977f618ffb7d48114d8994ae1365bb6886db"} Mar 18 10:05:45 crc kubenswrapper[4778]: I0318 10:05:45.974801 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9mc4" event={"ID":"3ceaceba-a6ca-4b0f-8964-8079b9dbb102","Type":"ContainerStarted","Data":"76f017bfaa7bd549bf50c3e81af429cb8139358679676c8625b5bb5e1d24e86b"} Mar 18 10:05:47 crc kubenswrapper[4778]: I0318 10:05:47.997618 4778 generic.go:334] "Generic (PLEG): container finished" podID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" 
containerID="025f161e43851b11f29cac7d50027307d4acb5902fe9fd02060fded046bef942" exitCode=0 Mar 18 10:05:47 crc kubenswrapper[4778]: I0318 10:05:47.998010 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6ds2" event={"ID":"3985ebd1-17ce-47b8-b029-b521e40d6bb2","Type":"ContainerDied","Data":"025f161e43851b11f29cac7d50027307d4acb5902fe9fd02060fded046bef942"} Mar 18 10:05:48 crc kubenswrapper[4778]: I0318 10:05:48.007156 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9mc4" event={"ID":"3ceaceba-a6ca-4b0f-8964-8079b9dbb102","Type":"ContainerStarted","Data":"4503f57a6a619b03ada2681f3329bde90e4b1b7d27de3ea2c44309051d91a4a1"} Mar 18 10:05:49 crc kubenswrapper[4778]: I0318 10:05:49.018727 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6ds2" event={"ID":"3985ebd1-17ce-47b8-b029-b521e40d6bb2","Type":"ContainerStarted","Data":"aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792"} Mar 18 10:05:49 crc kubenswrapper[4778]: I0318 10:05:49.021564 4778 generic.go:334] "Generic (PLEG): container finished" podID="3ceaceba-a6ca-4b0f-8964-8079b9dbb102" containerID="4503f57a6a619b03ada2681f3329bde90e4b1b7d27de3ea2c44309051d91a4a1" exitCode=0 Mar 18 10:05:49 crc kubenswrapper[4778]: I0318 10:05:49.021653 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9mc4" event={"ID":"3ceaceba-a6ca-4b0f-8964-8079b9dbb102","Type":"ContainerDied","Data":"4503f57a6a619b03ada2681f3329bde90e4b1b7d27de3ea2c44309051d91a4a1"} Mar 18 10:05:49 crc kubenswrapper[4778]: I0318 10:05:49.050283 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w6ds2" podStartSLOduration=2.376242895 podStartE2EDuration="7.050265869s" podCreationTimestamp="2026-03-18 10:05:42 +0000 UTC" firstStartedPulling="2026-03-18 10:05:43.939532672 +0000 UTC 
m=+3810.514277512" lastFinishedPulling="2026-03-18 10:05:48.613555646 +0000 UTC m=+3815.188300486" observedRunningTime="2026-03-18 10:05:49.049646672 +0000 UTC m=+3815.624391532" watchObservedRunningTime="2026-03-18 10:05:49.050265869 +0000 UTC m=+3815.625010709" Mar 18 10:05:50 crc kubenswrapper[4778]: I0318 10:05:50.032579 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9mc4" event={"ID":"3ceaceba-a6ca-4b0f-8964-8079b9dbb102","Type":"ContainerStarted","Data":"aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89"} Mar 18 10:05:50 crc kubenswrapper[4778]: I0318 10:05:50.053472 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k9mc4" podStartSLOduration=2.595072555 podStartE2EDuration="6.053451446s" podCreationTimestamp="2026-03-18 10:05:44 +0000 UTC" firstStartedPulling="2026-03-18 10:05:45.976980745 +0000 UTC m=+3812.551725585" lastFinishedPulling="2026-03-18 10:05:49.435359636 +0000 UTC m=+3816.010104476" observedRunningTime="2026-03-18 10:05:50.050054054 +0000 UTC m=+3816.624798904" watchObservedRunningTime="2026-03-18 10:05:50.053451446 +0000 UTC m=+3816.628196286" Mar 18 10:05:52 crc kubenswrapper[4778]: I0318 10:05:52.392102 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:52 crc kubenswrapper[4778]: I0318 10:05:52.392481 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:05:53 crc kubenswrapper[4778]: I0318 10:05:53.458305 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w6ds2" podUID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" containerName="registry-server" probeResult="failure" output=< Mar 18 10:05:53 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 10:05:53 crc 
kubenswrapper[4778]: > Mar 18 10:05:54 crc kubenswrapper[4778]: I0318 10:05:54.554836 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:54 crc kubenswrapper[4778]: I0318 10:05:54.555160 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:54 crc kubenswrapper[4778]: I0318 10:05:54.608625 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:55 crc kubenswrapper[4778]: I0318 10:05:55.116176 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:55 crc kubenswrapper[4778]: I0318 10:05:55.160666 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k9mc4"] Mar 18 10:05:55 crc kubenswrapper[4778]: I0318 10:05:55.187172 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:05:55 crc kubenswrapper[4778]: E0318 10:05:55.187656 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:05:57 crc kubenswrapper[4778]: I0318 10:05:57.091411 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k9mc4" podUID="3ceaceba-a6ca-4b0f-8964-8079b9dbb102" containerName="registry-server" containerID="cri-o://aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89" gracePeriod=2 
Mar 18 10:05:57 crc kubenswrapper[4778]: I0318 10:05:57.795566 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:57 crc kubenswrapper[4778]: I0318 10:05:57.920465 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-catalog-content\") pod \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\" (UID: \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\") " Mar 18 10:05:57 crc kubenswrapper[4778]: I0318 10:05:57.920602 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-utilities\") pod \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\" (UID: \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\") " Mar 18 10:05:57 crc kubenswrapper[4778]: I0318 10:05:57.920648 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgqzh\" (UniqueName: \"kubernetes.io/projected/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-kube-api-access-fgqzh\") pod \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\" (UID: \"3ceaceba-a6ca-4b0f-8964-8079b9dbb102\") " Mar 18 10:05:57 crc kubenswrapper[4778]: I0318 10:05:57.921444 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-utilities" (OuterVolumeSpecName: "utilities") pod "3ceaceba-a6ca-4b0f-8964-8079b9dbb102" (UID: "3ceaceba-a6ca-4b0f-8964-8079b9dbb102"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:05:57 crc kubenswrapper[4778]: I0318 10:05:57.933297 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-kube-api-access-fgqzh" (OuterVolumeSpecName: "kube-api-access-fgqzh") pod "3ceaceba-a6ca-4b0f-8964-8079b9dbb102" (UID: "3ceaceba-a6ca-4b0f-8964-8079b9dbb102"). InnerVolumeSpecName "kube-api-access-fgqzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:05:57 crc kubenswrapper[4778]: I0318 10:05:57.973936 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ceaceba-a6ca-4b0f-8964-8079b9dbb102" (UID: "3ceaceba-a6ca-4b0f-8964-8079b9dbb102"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.022781 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.022818 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgqzh\" (UniqueName: \"kubernetes.io/projected/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-kube-api-access-fgqzh\") on node \"crc\" DevicePath \"\"" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.022830 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ceaceba-a6ca-4b0f-8964-8079b9dbb102-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.103049 4778 generic.go:334] "Generic (PLEG): container finished" podID="3ceaceba-a6ca-4b0f-8964-8079b9dbb102" 
containerID="aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89" exitCode=0 Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.103095 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9mc4" event={"ID":"3ceaceba-a6ca-4b0f-8964-8079b9dbb102","Type":"ContainerDied","Data":"aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89"} Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.103129 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9mc4" event={"ID":"3ceaceba-a6ca-4b0f-8964-8079b9dbb102","Type":"ContainerDied","Data":"76f017bfaa7bd549bf50c3e81af429cb8139358679676c8625b5bb5e1d24e86b"} Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.103129 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k9mc4" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.103165 4778 scope.go:117] "RemoveContainer" containerID="aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.123703 4778 scope.go:117] "RemoveContainer" containerID="4503f57a6a619b03ada2681f3329bde90e4b1b7d27de3ea2c44309051d91a4a1" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.160530 4778 scope.go:117] "RemoveContainer" containerID="71c6b267eb357130fd3f237b4be7977f618ffb7d48114d8994ae1365bb6886db" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.167628 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k9mc4"] Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.176271 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k9mc4"] Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.206686 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ceaceba-a6ca-4b0f-8964-8079b9dbb102" 
path="/var/lib/kubelet/pods/3ceaceba-a6ca-4b0f-8964-8079b9dbb102/volumes" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.208128 4778 scope.go:117] "RemoveContainer" containerID="aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89" Mar 18 10:05:58 crc kubenswrapper[4778]: E0318 10:05:58.208770 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89\": container with ID starting with aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89 not found: ID does not exist" containerID="aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.208804 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89"} err="failed to get container status \"aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89\": rpc error: code = NotFound desc = could not find container \"aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89\": container with ID starting with aa64a4bc5d058abb1f890641f82e6aae073faaf7b35069d7b275f7e97a4baf89 not found: ID does not exist" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.208829 4778 scope.go:117] "RemoveContainer" containerID="4503f57a6a619b03ada2681f3329bde90e4b1b7d27de3ea2c44309051d91a4a1" Mar 18 10:05:58 crc kubenswrapper[4778]: E0318 10:05:58.209405 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4503f57a6a619b03ada2681f3329bde90e4b1b7d27de3ea2c44309051d91a4a1\": container with ID starting with 4503f57a6a619b03ada2681f3329bde90e4b1b7d27de3ea2c44309051d91a4a1 not found: ID does not exist" containerID="4503f57a6a619b03ada2681f3329bde90e4b1b7d27de3ea2c44309051d91a4a1" Mar 18 10:05:58 crc kubenswrapper[4778]: 
I0318 10:05:58.209486 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4503f57a6a619b03ada2681f3329bde90e4b1b7d27de3ea2c44309051d91a4a1"} err="failed to get container status \"4503f57a6a619b03ada2681f3329bde90e4b1b7d27de3ea2c44309051d91a4a1\": rpc error: code = NotFound desc = could not find container \"4503f57a6a619b03ada2681f3329bde90e4b1b7d27de3ea2c44309051d91a4a1\": container with ID starting with 4503f57a6a619b03ada2681f3329bde90e4b1b7d27de3ea2c44309051d91a4a1 not found: ID does not exist" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.209543 4778 scope.go:117] "RemoveContainer" containerID="71c6b267eb357130fd3f237b4be7977f618ffb7d48114d8994ae1365bb6886db" Mar 18 10:05:58 crc kubenswrapper[4778]: E0318 10:05:58.209989 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71c6b267eb357130fd3f237b4be7977f618ffb7d48114d8994ae1365bb6886db\": container with ID starting with 71c6b267eb357130fd3f237b4be7977f618ffb7d48114d8994ae1365bb6886db not found: ID does not exist" containerID="71c6b267eb357130fd3f237b4be7977f618ffb7d48114d8994ae1365bb6886db" Mar 18 10:05:58 crc kubenswrapper[4778]: I0318 10:05:58.210028 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71c6b267eb357130fd3f237b4be7977f618ffb7d48114d8994ae1365bb6886db"} err="failed to get container status \"71c6b267eb357130fd3f237b4be7977f618ffb7d48114d8994ae1365bb6886db\": rpc error: code = NotFound desc = could not find container \"71c6b267eb357130fd3f237b4be7977f618ffb7d48114d8994ae1365bb6886db\": container with ID starting with 71c6b267eb357130fd3f237b4be7977f618ffb7d48114d8994ae1365bb6886db not found: ID does not exist" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.152396 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563806-n2m7x"] Mar 18 10:06:00 crc 
kubenswrapper[4778]: E0318 10:06:00.152805 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ceaceba-a6ca-4b0f-8964-8079b9dbb102" containerName="extract-utilities" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.152820 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ceaceba-a6ca-4b0f-8964-8079b9dbb102" containerName="extract-utilities" Mar 18 10:06:00 crc kubenswrapper[4778]: E0318 10:06:00.152842 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ceaceba-a6ca-4b0f-8964-8079b9dbb102" containerName="registry-server" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.152850 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ceaceba-a6ca-4b0f-8964-8079b9dbb102" containerName="registry-server" Mar 18 10:06:00 crc kubenswrapper[4778]: E0318 10:06:00.152874 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ceaceba-a6ca-4b0f-8964-8079b9dbb102" containerName="extract-content" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.152882 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ceaceba-a6ca-4b0f-8964-8079b9dbb102" containerName="extract-content" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.153087 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ceaceba-a6ca-4b0f-8964-8079b9dbb102" containerName="registry-server" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.153819 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563806-n2m7x" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.157155 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.157229 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.157350 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.172274 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563806-n2m7x"] Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.268638 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwlh2\" (UniqueName: \"kubernetes.io/projected/dbb455c0-90cf-46a9-82c4-1c22d05e007d-kube-api-access-lwlh2\") pod \"auto-csr-approver-29563806-n2m7x\" (UID: \"dbb455c0-90cf-46a9-82c4-1c22d05e007d\") " pod="openshift-infra/auto-csr-approver-29563806-n2m7x" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.371753 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwlh2\" (UniqueName: \"kubernetes.io/projected/dbb455c0-90cf-46a9-82c4-1c22d05e007d-kube-api-access-lwlh2\") pod \"auto-csr-approver-29563806-n2m7x\" (UID: \"dbb455c0-90cf-46a9-82c4-1c22d05e007d\") " pod="openshift-infra/auto-csr-approver-29563806-n2m7x" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.395768 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwlh2\" (UniqueName: \"kubernetes.io/projected/dbb455c0-90cf-46a9-82c4-1c22d05e007d-kube-api-access-lwlh2\") pod \"auto-csr-approver-29563806-n2m7x\" (UID: \"dbb455c0-90cf-46a9-82c4-1c22d05e007d\") " 
pod="openshift-infra/auto-csr-approver-29563806-n2m7x" Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.475934 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563806-n2m7x" Mar 18 10:06:00 crc kubenswrapper[4778]: W0318 10:06:00.937958 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbb455c0_90cf_46a9_82c4_1c22d05e007d.slice/crio-9c8d8d17a89e5e3a0a16943424709a6a2b796859194054adb0a830bd877ec159 WatchSource:0}: Error finding container 9c8d8d17a89e5e3a0a16943424709a6a2b796859194054adb0a830bd877ec159: Status 404 returned error can't find the container with id 9c8d8d17a89e5e3a0a16943424709a6a2b796859194054adb0a830bd877ec159 Mar 18 10:06:00 crc kubenswrapper[4778]: I0318 10:06:00.945765 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563806-n2m7x"] Mar 18 10:06:01 crc kubenswrapper[4778]: I0318 10:06:01.143174 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563806-n2m7x" event={"ID":"dbb455c0-90cf-46a9-82c4-1c22d05e007d","Type":"ContainerStarted","Data":"9c8d8d17a89e5e3a0a16943424709a6a2b796859194054adb0a830bd877ec159"} Mar 18 10:06:03 crc kubenswrapper[4778]: I0318 10:06:03.165746 4778 generic.go:334] "Generic (PLEG): container finished" podID="dbb455c0-90cf-46a9-82c4-1c22d05e007d" containerID="6b16a1172e80d155110d49b222a6dc20e7b21d0a6a9927e8e5966e673f37ac10" exitCode=0 Mar 18 10:06:03 crc kubenswrapper[4778]: I0318 10:06:03.165816 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563806-n2m7x" event={"ID":"dbb455c0-90cf-46a9-82c4-1c22d05e007d","Type":"ContainerDied","Data":"6b16a1172e80d155110d49b222a6dc20e7b21d0a6a9927e8e5966e673f37ac10"} Mar 18 10:06:03 crc kubenswrapper[4778]: I0318 10:06:03.446810 4778 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-w6ds2" podUID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" containerName="registry-server" probeResult="failure" output=< Mar 18 10:06:03 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 10:06:03 crc kubenswrapper[4778]: > Mar 18 10:06:04 crc kubenswrapper[4778]: I0318 10:06:04.803643 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563806-n2m7x" Mar 18 10:06:04 crc kubenswrapper[4778]: I0318 10:06:04.865893 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwlh2\" (UniqueName: \"kubernetes.io/projected/dbb455c0-90cf-46a9-82c4-1c22d05e007d-kube-api-access-lwlh2\") pod \"dbb455c0-90cf-46a9-82c4-1c22d05e007d\" (UID: \"dbb455c0-90cf-46a9-82c4-1c22d05e007d\") " Mar 18 10:06:04 crc kubenswrapper[4778]: I0318 10:06:04.872680 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbb455c0-90cf-46a9-82c4-1c22d05e007d-kube-api-access-lwlh2" (OuterVolumeSpecName: "kube-api-access-lwlh2") pod "dbb455c0-90cf-46a9-82c4-1c22d05e007d" (UID: "dbb455c0-90cf-46a9-82c4-1c22d05e007d"). InnerVolumeSpecName "kube-api-access-lwlh2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:06:04 crc kubenswrapper[4778]: I0318 10:06:04.968527 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwlh2\" (UniqueName: \"kubernetes.io/projected/dbb455c0-90cf-46a9-82c4-1c22d05e007d-kube-api-access-lwlh2\") on node \"crc\" DevicePath \"\"" Mar 18 10:06:05 crc kubenswrapper[4778]: I0318 10:06:05.183431 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563806-n2m7x" event={"ID":"dbb455c0-90cf-46a9-82c4-1c22d05e007d","Type":"ContainerDied","Data":"9c8d8d17a89e5e3a0a16943424709a6a2b796859194054adb0a830bd877ec159"} Mar 18 10:06:05 crc kubenswrapper[4778]: I0318 10:06:05.183470 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c8d8d17a89e5e3a0a16943424709a6a2b796859194054adb0a830bd877ec159" Mar 18 10:06:05 crc kubenswrapper[4778]: I0318 10:06:05.183474 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563806-n2m7x" Mar 18 10:06:05 crc kubenswrapper[4778]: I0318 10:06:05.882277 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563800-8grkw"] Mar 18 10:06:05 crc kubenswrapper[4778]: I0318 10:06:05.890822 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563800-8grkw"] Mar 18 10:06:06 crc kubenswrapper[4778]: I0318 10:06:06.188796 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:06:06 crc kubenswrapper[4778]: E0318 10:06:06.189165 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:06:06 crc kubenswrapper[4778]: I0318 10:06:06.199589 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7196caa-da0c-4933-b2d0-81c472bed9a9" path="/var/lib/kubelet/pods/b7196caa-da0c-4933-b2d0-81c472bed9a9/volumes" Mar 18 10:06:12 crc kubenswrapper[4778]: I0318 10:06:12.441672 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:06:12 crc kubenswrapper[4778]: I0318 10:06:12.490008 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:06:13 crc kubenswrapper[4778]: I0318 10:06:13.232017 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w6ds2"] Mar 18 10:06:14 crc kubenswrapper[4778]: I0318 10:06:14.279283 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w6ds2" podUID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" containerName="registry-server" containerID="cri-o://aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792" gracePeriod=2 Mar 18 10:06:14 crc kubenswrapper[4778]: I0318 10:06:14.921578 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.070623 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qnh6\" (UniqueName: \"kubernetes.io/projected/3985ebd1-17ce-47b8-b029-b521e40d6bb2-kube-api-access-8qnh6\") pod \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\" (UID: \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\") " Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.070795 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3985ebd1-17ce-47b8-b029-b521e40d6bb2-utilities\") pod \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\" (UID: \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\") " Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.070844 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3985ebd1-17ce-47b8-b029-b521e40d6bb2-catalog-content\") pod \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\" (UID: \"3985ebd1-17ce-47b8-b029-b521e40d6bb2\") " Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.072065 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3985ebd1-17ce-47b8-b029-b521e40d6bb2-utilities" (OuterVolumeSpecName: "utilities") pod "3985ebd1-17ce-47b8-b029-b521e40d6bb2" (UID: "3985ebd1-17ce-47b8-b029-b521e40d6bb2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.082460 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3985ebd1-17ce-47b8-b029-b521e40d6bb2-kube-api-access-8qnh6" (OuterVolumeSpecName: "kube-api-access-8qnh6") pod "3985ebd1-17ce-47b8-b029-b521e40d6bb2" (UID: "3985ebd1-17ce-47b8-b029-b521e40d6bb2"). InnerVolumeSpecName "kube-api-access-8qnh6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.173116 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3985ebd1-17ce-47b8-b029-b521e40d6bb2-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.173159 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qnh6\" (UniqueName: \"kubernetes.io/projected/3985ebd1-17ce-47b8-b029-b521e40d6bb2-kube-api-access-8qnh6\") on node \"crc\" DevicePath \"\"" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.228922 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3985ebd1-17ce-47b8-b029-b521e40d6bb2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3985ebd1-17ce-47b8-b029-b521e40d6bb2" (UID: "3985ebd1-17ce-47b8-b029-b521e40d6bb2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.275739 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3985ebd1-17ce-47b8-b029-b521e40d6bb2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.291425 4778 generic.go:334] "Generic (PLEG): container finished" podID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" containerID="aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792" exitCode=0 Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.291476 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6ds2" event={"ID":"3985ebd1-17ce-47b8-b029-b521e40d6bb2","Type":"ContainerDied","Data":"aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792"} Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.291512 4778 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-w6ds2" event={"ID":"3985ebd1-17ce-47b8-b029-b521e40d6bb2","Type":"ContainerDied","Data":"ff1791fd6dc7166759becaa22e236a584f49d5a34e46f1ae3cd70fe242aa8182"} Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.291534 4778 scope.go:117] "RemoveContainer" containerID="aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.292462 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w6ds2" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.320654 4778 scope.go:117] "RemoveContainer" containerID="025f161e43851b11f29cac7d50027307d4acb5902fe9fd02060fded046bef942" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.335412 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w6ds2"] Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.343030 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w6ds2"] Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.345990 4778 scope.go:117] "RemoveContainer" containerID="a1d371aeb3c2daa9b257ba28144c2564882478d29826e8a52354c4353dbbc156" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.412940 4778 scope.go:117] "RemoveContainer" containerID="aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792" Mar 18 10:06:15 crc kubenswrapper[4778]: E0318 10:06:15.415475 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792\": container with ID starting with aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792 not found: ID does not exist" containerID="aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.415519 4778 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792"} err="failed to get container status \"aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792\": rpc error: code = NotFound desc = could not find container \"aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792\": container with ID starting with aa60574499c57c6f355101b0cf19aa0310564a6b73cd9704fc3d8c70045cd792 not found: ID does not exist" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.415545 4778 scope.go:117] "RemoveContainer" containerID="025f161e43851b11f29cac7d50027307d4acb5902fe9fd02060fded046bef942" Mar 18 10:06:15 crc kubenswrapper[4778]: E0318 10:06:15.416024 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"025f161e43851b11f29cac7d50027307d4acb5902fe9fd02060fded046bef942\": container with ID starting with 025f161e43851b11f29cac7d50027307d4acb5902fe9fd02060fded046bef942 not found: ID does not exist" containerID="025f161e43851b11f29cac7d50027307d4acb5902fe9fd02060fded046bef942" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.416073 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"025f161e43851b11f29cac7d50027307d4acb5902fe9fd02060fded046bef942"} err="failed to get container status \"025f161e43851b11f29cac7d50027307d4acb5902fe9fd02060fded046bef942\": rpc error: code = NotFound desc = could not find container \"025f161e43851b11f29cac7d50027307d4acb5902fe9fd02060fded046bef942\": container with ID starting with 025f161e43851b11f29cac7d50027307d4acb5902fe9fd02060fded046bef942 not found: ID does not exist" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.416100 4778 scope.go:117] "RemoveContainer" containerID="a1d371aeb3c2daa9b257ba28144c2564882478d29826e8a52354c4353dbbc156" Mar 18 10:06:15 crc kubenswrapper[4778]: E0318 
10:06:15.416446 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1d371aeb3c2daa9b257ba28144c2564882478d29826e8a52354c4353dbbc156\": container with ID starting with a1d371aeb3c2daa9b257ba28144c2564882478d29826e8a52354c4353dbbc156 not found: ID does not exist" containerID="a1d371aeb3c2daa9b257ba28144c2564882478d29826e8a52354c4353dbbc156" Mar 18 10:06:15 crc kubenswrapper[4778]: I0318 10:06:15.416472 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1d371aeb3c2daa9b257ba28144c2564882478d29826e8a52354c4353dbbc156"} err="failed to get container status \"a1d371aeb3c2daa9b257ba28144c2564882478d29826e8a52354c4353dbbc156\": rpc error: code = NotFound desc = could not find container \"a1d371aeb3c2daa9b257ba28144c2564882478d29826e8a52354c4353dbbc156\": container with ID starting with a1d371aeb3c2daa9b257ba28144c2564882478d29826e8a52354c4353dbbc156 not found: ID does not exist" Mar 18 10:06:16 crc kubenswrapper[4778]: I0318 10:06:16.201794 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" path="/var/lib/kubelet/pods/3985ebd1-17ce-47b8-b029-b521e40d6bb2/volumes" Mar 18 10:06:18 crc kubenswrapper[4778]: I0318 10:06:18.187773 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:06:18 crc kubenswrapper[4778]: E0318 10:06:18.188249 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:06:31 crc kubenswrapper[4778]: I0318 10:06:31.188935 
4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:06:31 crc kubenswrapper[4778]: E0318 10:06:31.189711 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:06:34 crc kubenswrapper[4778]: I0318 10:06:34.157880 4778 scope.go:117] "RemoveContainer" containerID="52b2bf061001e2a8dfe4b355dbc94585c1d208b337020ae42eb7ee2f487a7b0c" Mar 18 10:06:44 crc kubenswrapper[4778]: I0318 10:06:44.193801 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:06:44 crc kubenswrapper[4778]: E0318 10:06:44.194712 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:06:57 crc kubenswrapper[4778]: I0318 10:06:57.187967 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:06:57 crc kubenswrapper[4778]: E0318 10:06:57.188688 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:07:08 crc kubenswrapper[4778]: I0318 10:07:08.187175 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:07:08 crc kubenswrapper[4778]: E0318 10:07:08.187812 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:07:21 crc kubenswrapper[4778]: I0318 10:07:21.187470 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:07:21 crc kubenswrapper[4778]: E0318 10:07:21.189008 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:07:35 crc kubenswrapper[4778]: I0318 10:07:35.187178 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:07:35 crc kubenswrapper[4778]: E0318 10:07:35.188221 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:07:46 crc kubenswrapper[4778]: I0318 10:07:46.186945 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:07:46 crc kubenswrapper[4778]: E0318 10:07:46.187799 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.154771 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563808-8zn6m"] Mar 18 10:08:00 crc kubenswrapper[4778]: E0318 10:08:00.159767 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" containerName="extract-utilities" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.160083 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" containerName="extract-utilities" Mar 18 10:08:00 crc kubenswrapper[4778]: E0318 10:08:00.160107 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" containerName="registry-server" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.160117 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" containerName="registry-server" Mar 18 10:08:00 crc kubenswrapper[4778]: E0318 10:08:00.160138 4778 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" containerName="extract-content" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.160148 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" containerName="extract-content" Mar 18 10:08:00 crc kubenswrapper[4778]: E0318 10:08:00.160164 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbb455c0-90cf-46a9-82c4-1c22d05e007d" containerName="oc" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.160171 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbb455c0-90cf-46a9-82c4-1c22d05e007d" containerName="oc" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.160413 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3985ebd1-17ce-47b8-b029-b521e40d6bb2" containerName="registry-server" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.160434 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbb455c0-90cf-46a9-82c4-1c22d05e007d" containerName="oc" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.161272 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563808-8zn6m" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.163167 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.163652 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.164222 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.176979 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563808-8zn6m"] Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.187750 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:08:00 crc kubenswrapper[4778]: E0318 10:08:00.187977 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.282772 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms4bw\" (UniqueName: \"kubernetes.io/projected/0a5c01e2-3264-47f4-8081-48235752ef32-kube-api-access-ms4bw\") pod \"auto-csr-approver-29563808-8zn6m\" (UID: \"0a5c01e2-3264-47f4-8081-48235752ef32\") " pod="openshift-infra/auto-csr-approver-29563808-8zn6m" Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.384054 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ms4bw\" (UniqueName: \"kubernetes.io/projected/0a5c01e2-3264-47f4-8081-48235752ef32-kube-api-access-ms4bw\") pod \"auto-csr-approver-29563808-8zn6m\" (UID: \"0a5c01e2-3264-47f4-8081-48235752ef32\") " pod="openshift-infra/auto-csr-approver-29563808-8zn6m"
Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.410986 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms4bw\" (UniqueName: \"kubernetes.io/projected/0a5c01e2-3264-47f4-8081-48235752ef32-kube-api-access-ms4bw\") pod \"auto-csr-approver-29563808-8zn6m\" (UID: \"0a5c01e2-3264-47f4-8081-48235752ef32\") " pod="openshift-infra/auto-csr-approver-29563808-8zn6m"
Mar 18 10:08:00 crc kubenswrapper[4778]: I0318 10:08:00.482804 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563808-8zn6m"
Mar 18 10:08:01 crc kubenswrapper[4778]: I0318 10:08:01.082515 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563808-8zn6m"]
Mar 18 10:08:01 crc kubenswrapper[4778]: I0318 10:08:01.202559 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563808-8zn6m" event={"ID":"0a5c01e2-3264-47f4-8081-48235752ef32","Type":"ContainerStarted","Data":"cd72fbd5ff8b1d95f025048cf58b89d1f994e5eafbc56a9f42db0e9ae2c10dd2"}
Mar 18 10:08:03 crc kubenswrapper[4778]: I0318 10:08:03.220509 4778 generic.go:334] "Generic (PLEG): container finished" podID="0a5c01e2-3264-47f4-8081-48235752ef32" containerID="763e05415a17628575fc7b5a79a4b0a9348cfd1deec11024a98e0e749d405535" exitCode=0
Mar 18 10:08:03 crc kubenswrapper[4778]: I0318 10:08:03.220598 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563808-8zn6m" event={"ID":"0a5c01e2-3264-47f4-8081-48235752ef32","Type":"ContainerDied","Data":"763e05415a17628575fc7b5a79a4b0a9348cfd1deec11024a98e0e749d405535"}
Mar 18 10:08:04 crc kubenswrapper[4778]: I0318 10:08:04.863649 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563808-8zn6m"
Mar 18 10:08:05 crc kubenswrapper[4778]: I0318 10:08:04.999808 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms4bw\" (UniqueName: \"kubernetes.io/projected/0a5c01e2-3264-47f4-8081-48235752ef32-kube-api-access-ms4bw\") pod \"0a5c01e2-3264-47f4-8081-48235752ef32\" (UID: \"0a5c01e2-3264-47f4-8081-48235752ef32\") "
Mar 18 10:08:05 crc kubenswrapper[4778]: I0318 10:08:05.019384 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a5c01e2-3264-47f4-8081-48235752ef32-kube-api-access-ms4bw" (OuterVolumeSpecName: "kube-api-access-ms4bw") pod "0a5c01e2-3264-47f4-8081-48235752ef32" (UID: "0a5c01e2-3264-47f4-8081-48235752ef32"). InnerVolumeSpecName "kube-api-access-ms4bw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:08:05 crc kubenswrapper[4778]: I0318 10:08:05.102307 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms4bw\" (UniqueName: \"kubernetes.io/projected/0a5c01e2-3264-47f4-8081-48235752ef32-kube-api-access-ms4bw\") on node \"crc\" DevicePath \"\""
Mar 18 10:08:05 crc kubenswrapper[4778]: I0318 10:08:05.240923 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563808-8zn6m" event={"ID":"0a5c01e2-3264-47f4-8081-48235752ef32","Type":"ContainerDied","Data":"cd72fbd5ff8b1d95f025048cf58b89d1f994e5eafbc56a9f42db0e9ae2c10dd2"}
Mar 18 10:08:05 crc kubenswrapper[4778]: I0318 10:08:05.240966 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd72fbd5ff8b1d95f025048cf58b89d1f994e5eafbc56a9f42db0e9ae2c10dd2"
Mar 18 10:08:05 crc kubenswrapper[4778]: I0318 10:08:05.241005 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563808-8zn6m"
Mar 18 10:08:05 crc kubenswrapper[4778]: I0318 10:08:05.939294 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563802-2blvs"]
Mar 18 10:08:05 crc kubenswrapper[4778]: I0318 10:08:05.950569 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563802-2blvs"]
Mar 18 10:08:06 crc kubenswrapper[4778]: I0318 10:08:06.198024 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02a9f934-8e78-4c0c-b0cc-59cd49030b5c" path="/var/lib/kubelet/pods/02a9f934-8e78-4c0c-b0cc-59cd49030b5c/volumes"
Mar 18 10:08:13 crc kubenswrapper[4778]: I0318 10:08:13.188331 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190"
Mar 18 10:08:13 crc kubenswrapper[4778]: E0318 10:08:13.189055 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:08:26 crc kubenswrapper[4778]: I0318 10:08:26.187370 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190"
Mar 18 10:08:26 crc kubenswrapper[4778]: E0318 10:08:26.188551 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:08:34 crc kubenswrapper[4778]: I0318 10:08:34.286453 4778 scope.go:117] "RemoveContainer" containerID="048eac64aca1190d343bcd6e5968c051bfb3de1baa3c94a83313a7e1b9b996de"
Mar 18 10:08:38 crc kubenswrapper[4778]: I0318 10:08:38.187574 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190"
Mar 18 10:08:38 crc kubenswrapper[4778]: E0318 10:08:38.189087 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:08:53 crc kubenswrapper[4778]: I0318 10:08:53.187854 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190"
Mar 18 10:08:53 crc kubenswrapper[4778]: E0318 10:08:53.188721 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:09:07 crc kubenswrapper[4778]: I0318 10:09:07.188030 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190"
Mar 18 10:09:07 crc kubenswrapper[4778]: E0318 10:09:07.188858 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:09:20 crc kubenswrapper[4778]: I0318 10:09:20.188900 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190"
Mar 18 10:09:20 crc kubenswrapper[4778]: E0318 10:09:20.190129 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:09:34 crc kubenswrapper[4778]: I0318 10:09:34.194653 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190"
Mar 18 10:09:34 crc kubenswrapper[4778]: I0318 10:09:34.727667 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"d5e0db80372568acf47512b0578afc67f28216d99c8be88c569e735224dbd2f3"}
Mar 18 10:10:00 crc kubenswrapper[4778]: I0318 10:10:00.162162 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563810-gbsth"]
Mar 18 10:10:00 crc kubenswrapper[4778]: E0318 10:10:00.163264 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a5c01e2-3264-47f4-8081-48235752ef32" containerName="oc"
Mar 18 10:10:00 crc kubenswrapper[4778]: I0318 10:10:00.163281 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a5c01e2-3264-47f4-8081-48235752ef32" containerName="oc"
Mar 18 10:10:00 crc kubenswrapper[4778]: I0318 10:10:00.163514 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a5c01e2-3264-47f4-8081-48235752ef32" containerName="oc"
Mar 18 10:10:00 crc kubenswrapper[4778]: I0318 10:10:00.164377 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563810-gbsth"
Mar 18 10:10:00 crc kubenswrapper[4778]: I0318 10:10:00.167539 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 10:10:00 crc kubenswrapper[4778]: I0318 10:10:00.169579 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 10:10:00 crc kubenswrapper[4778]: I0318 10:10:00.169942 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6"
Mar 18 10:10:00 crc kubenswrapper[4778]: I0318 10:10:00.178934 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563810-gbsth"]
Mar 18 10:10:00 crc kubenswrapper[4778]: I0318 10:10:00.267567 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjxh5\" (UniqueName: \"kubernetes.io/projected/2da82992-5b46-450f-9fe2-fb1aab2e40a5-kube-api-access-wjxh5\") pod \"auto-csr-approver-29563810-gbsth\" (UID: \"2da82992-5b46-450f-9fe2-fb1aab2e40a5\") " pod="openshift-infra/auto-csr-approver-29563810-gbsth"
Mar 18 10:10:00 crc kubenswrapper[4778]: I0318 10:10:00.369592 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjxh5\" (UniqueName: \"kubernetes.io/projected/2da82992-5b46-450f-9fe2-fb1aab2e40a5-kube-api-access-wjxh5\") pod \"auto-csr-approver-29563810-gbsth\" (UID: \"2da82992-5b46-450f-9fe2-fb1aab2e40a5\") " pod="openshift-infra/auto-csr-approver-29563810-gbsth"
Mar 18 10:10:00 crc kubenswrapper[4778]: I0318 10:10:00.389704 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjxh5\" (UniqueName: \"kubernetes.io/projected/2da82992-5b46-450f-9fe2-fb1aab2e40a5-kube-api-access-wjxh5\") pod \"auto-csr-approver-29563810-gbsth\" (UID: \"2da82992-5b46-450f-9fe2-fb1aab2e40a5\") " pod="openshift-infra/auto-csr-approver-29563810-gbsth"
Mar 18 10:10:00 crc kubenswrapper[4778]: I0318 10:10:00.485210 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563810-gbsth"
Mar 18 10:10:01 crc kubenswrapper[4778]: I0318 10:10:01.101428 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563810-gbsth"]
Mar 18 10:10:01 crc kubenswrapper[4778]: I0318 10:10:01.973394 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563810-gbsth" event={"ID":"2da82992-5b46-450f-9fe2-fb1aab2e40a5","Type":"ContainerStarted","Data":"1ac94c05b44a514cc09d0c7992208feacdd1efd309af86fc6adb31463d969cf4"}
Mar 18 10:10:02 crc kubenswrapper[4778]: I0318 10:10:02.982755 4778 generic.go:334] "Generic (PLEG): container finished" podID="2da82992-5b46-450f-9fe2-fb1aab2e40a5" containerID="92b3bdd08fed961c28977d075899ad8197ec334db09149cee7b1c8a99c5b48b6" exitCode=0
Mar 18 10:10:02 crc kubenswrapper[4778]: I0318 10:10:02.982805 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563810-gbsth" event={"ID":"2da82992-5b46-450f-9fe2-fb1aab2e40a5","Type":"ContainerDied","Data":"92b3bdd08fed961c28977d075899ad8197ec334db09149cee7b1c8a99c5b48b6"}
Mar 18 10:10:04 crc kubenswrapper[4778]: I0318 10:10:04.528251 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563810-gbsth"
Mar 18 10:10:04 crc kubenswrapper[4778]: I0318 10:10:04.660635 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjxh5\" (UniqueName: \"kubernetes.io/projected/2da82992-5b46-450f-9fe2-fb1aab2e40a5-kube-api-access-wjxh5\") pod \"2da82992-5b46-450f-9fe2-fb1aab2e40a5\" (UID: \"2da82992-5b46-450f-9fe2-fb1aab2e40a5\") "
Mar 18 10:10:04 crc kubenswrapper[4778]: I0318 10:10:04.671690 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2da82992-5b46-450f-9fe2-fb1aab2e40a5-kube-api-access-wjxh5" (OuterVolumeSpecName: "kube-api-access-wjxh5") pod "2da82992-5b46-450f-9fe2-fb1aab2e40a5" (UID: "2da82992-5b46-450f-9fe2-fb1aab2e40a5"). InnerVolumeSpecName "kube-api-access-wjxh5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:10:04 crc kubenswrapper[4778]: I0318 10:10:04.763387 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjxh5\" (UniqueName: \"kubernetes.io/projected/2da82992-5b46-450f-9fe2-fb1aab2e40a5-kube-api-access-wjxh5\") on node \"crc\" DevicePath \"\""
Mar 18 10:10:05 crc kubenswrapper[4778]: I0318 10:10:05.001689 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563810-gbsth" event={"ID":"2da82992-5b46-450f-9fe2-fb1aab2e40a5","Type":"ContainerDied","Data":"1ac94c05b44a514cc09d0c7992208feacdd1efd309af86fc6adb31463d969cf4"}
Mar 18 10:10:05 crc kubenswrapper[4778]: I0318 10:10:05.001746 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ac94c05b44a514cc09d0c7992208feacdd1efd309af86fc6adb31463d969cf4"
Mar 18 10:10:05 crc kubenswrapper[4778]: I0318 10:10:05.001812 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563810-gbsth"
Mar 18 10:10:05 crc kubenswrapper[4778]: I0318 10:10:05.609171 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563804-fx6ns"]
Mar 18 10:10:05 crc kubenswrapper[4778]: I0318 10:10:05.618928 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563804-fx6ns"]
Mar 18 10:10:06 crc kubenswrapper[4778]: I0318 10:10:06.199973 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5479124a-5b9d-403a-baf8-0e03ec15c707" path="/var/lib/kubelet/pods/5479124a-5b9d-403a-baf8-0e03ec15c707/volumes"
Mar 18 10:10:34 crc kubenswrapper[4778]: I0318 10:10:34.384868 4778 scope.go:117] "RemoveContainer" containerID="9989303583a93d583be4ddef72e12372e2db44e67cf888c7373ff47c4f5bfee9"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.670471 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9lp4z"]
Mar 18 10:11:37 crc kubenswrapper[4778]: E0318 10:11:37.671567 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da82992-5b46-450f-9fe2-fb1aab2e40a5" containerName="oc"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.671586 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da82992-5b46-450f-9fe2-fb1aab2e40a5" containerName="oc"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.671847 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2da82992-5b46-450f-9fe2-fb1aab2e40a5" containerName="oc"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.673569 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.683495 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9lp4z"]
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.851103 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hmb4\" (UniqueName: \"kubernetes.io/projected/e86477b8-6733-463c-9c1e-3fdc3af149c8-kube-api-access-2hmb4\") pod \"community-operators-9lp4z\" (UID: \"e86477b8-6733-463c-9c1e-3fdc3af149c8\") " pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.851245 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86477b8-6733-463c-9c1e-3fdc3af149c8-utilities\") pod \"community-operators-9lp4z\" (UID: \"e86477b8-6733-463c-9c1e-3fdc3af149c8\") " pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.851425 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86477b8-6733-463c-9c1e-3fdc3af149c8-catalog-content\") pod \"community-operators-9lp4z\" (UID: \"e86477b8-6733-463c-9c1e-3fdc3af149c8\") " pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.953296 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hmb4\" (UniqueName: \"kubernetes.io/projected/e86477b8-6733-463c-9c1e-3fdc3af149c8-kube-api-access-2hmb4\") pod \"community-operators-9lp4z\" (UID: \"e86477b8-6733-463c-9c1e-3fdc3af149c8\") " pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.953375 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86477b8-6733-463c-9c1e-3fdc3af149c8-utilities\") pod \"community-operators-9lp4z\" (UID: \"e86477b8-6733-463c-9c1e-3fdc3af149c8\") " pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.953450 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86477b8-6733-463c-9c1e-3fdc3af149c8-catalog-content\") pod \"community-operators-9lp4z\" (UID: \"e86477b8-6733-463c-9c1e-3fdc3af149c8\") " pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.953903 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86477b8-6733-463c-9c1e-3fdc3af149c8-catalog-content\") pod \"community-operators-9lp4z\" (UID: \"e86477b8-6733-463c-9c1e-3fdc3af149c8\") " pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.954028 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86477b8-6733-463c-9c1e-3fdc3af149c8-utilities\") pod \"community-operators-9lp4z\" (UID: \"e86477b8-6733-463c-9c1e-3fdc3af149c8\") " pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:37 crc kubenswrapper[4778]: I0318 10:11:37.973507 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hmb4\" (UniqueName: \"kubernetes.io/projected/e86477b8-6733-463c-9c1e-3fdc3af149c8-kube-api-access-2hmb4\") pod \"community-operators-9lp4z\" (UID: \"e86477b8-6733-463c-9c1e-3fdc3af149c8\") " pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:38 crc kubenswrapper[4778]: I0318 10:11:38.005742 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:38 crc kubenswrapper[4778]: I0318 10:11:38.535775 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9lp4z"]
Mar 18 10:11:38 crc kubenswrapper[4778]: I0318 10:11:38.816040 4778 generic.go:334] "Generic (PLEG): container finished" podID="e86477b8-6733-463c-9c1e-3fdc3af149c8" containerID="12c3314ab9e2be8b20c7400b37a6fc9f4c1807b6d14bff081a5532645e86455e" exitCode=0
Mar 18 10:11:38 crc kubenswrapper[4778]: I0318 10:11:38.816146 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9lp4z" event={"ID":"e86477b8-6733-463c-9c1e-3fdc3af149c8","Type":"ContainerDied","Data":"12c3314ab9e2be8b20c7400b37a6fc9f4c1807b6d14bff081a5532645e86455e"}
Mar 18 10:11:38 crc kubenswrapper[4778]: I0318 10:11:38.816418 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9lp4z" event={"ID":"e86477b8-6733-463c-9c1e-3fdc3af149c8","Type":"ContainerStarted","Data":"4b279340952032e1de97141653a8272e40b4b4504477e11e39bed863cada10e3"}
Mar 18 10:11:38 crc kubenswrapper[4778]: I0318 10:11:38.817737 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 10:11:39 crc kubenswrapper[4778]: I0318 10:11:39.825458 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9lp4z" event={"ID":"e86477b8-6733-463c-9c1e-3fdc3af149c8","Type":"ContainerStarted","Data":"cbeee733ae3134d7baede63761e66e19b68dafb5fd927a7f4b8372345ee80ea2"}
Mar 18 10:11:41 crc kubenswrapper[4778]: I0318 10:11:41.842707 4778 generic.go:334] "Generic (PLEG): container finished" podID="e86477b8-6733-463c-9c1e-3fdc3af149c8" containerID="cbeee733ae3134d7baede63761e66e19b68dafb5fd927a7f4b8372345ee80ea2" exitCode=0
Mar 18 10:11:41 crc kubenswrapper[4778]: I0318 10:11:41.842929 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9lp4z" event={"ID":"e86477b8-6733-463c-9c1e-3fdc3af149c8","Type":"ContainerDied","Data":"cbeee733ae3134d7baede63761e66e19b68dafb5fd927a7f4b8372345ee80ea2"}
Mar 18 10:11:42 crc kubenswrapper[4778]: I0318 10:11:42.856774 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9lp4z" event={"ID":"e86477b8-6733-463c-9c1e-3fdc3af149c8","Type":"ContainerStarted","Data":"d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810"}
Mar 18 10:11:42 crc kubenswrapper[4778]: I0318 10:11:42.888871 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9lp4z" podStartSLOduration=2.282660185 podStartE2EDuration="5.888852493s" podCreationTimestamp="2026-03-18 10:11:37 +0000 UTC" firstStartedPulling="2026-03-18 10:11:38.817505542 +0000 UTC m=+4165.392250382" lastFinishedPulling="2026-03-18 10:11:42.42369785 +0000 UTC m=+4168.998442690" observedRunningTime="2026-03-18 10:11:42.88690285 +0000 UTC m=+4169.461647690" watchObservedRunningTime="2026-03-18 10:11:42.888852493 +0000 UTC m=+4169.463597333"
Mar 18 10:11:48 crc kubenswrapper[4778]: I0318 10:11:48.006224 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:48 crc kubenswrapper[4778]: I0318 10:11:48.006581 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:48 crc kubenswrapper[4778]: I0318 10:11:48.085329 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:48 crc kubenswrapper[4778]: I0318 10:11:48.963405 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:50 crc kubenswrapper[4778]: I0318 10:11:50.418672 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9lp4z"]
Mar 18 10:11:50 crc kubenswrapper[4778]: I0318 10:11:50.933570 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9lp4z" podUID="e86477b8-6733-463c-9c1e-3fdc3af149c8" containerName="registry-server" containerID="cri-o://d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810" gracePeriod=2
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.729165 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.881620 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86477b8-6733-463c-9c1e-3fdc3af149c8-catalog-content\") pod \"e86477b8-6733-463c-9c1e-3fdc3af149c8\" (UID: \"e86477b8-6733-463c-9c1e-3fdc3af149c8\") "
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.881799 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86477b8-6733-463c-9c1e-3fdc3af149c8-utilities\") pod \"e86477b8-6733-463c-9c1e-3fdc3af149c8\" (UID: \"e86477b8-6733-463c-9c1e-3fdc3af149c8\") "
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.881938 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hmb4\" (UniqueName: \"kubernetes.io/projected/e86477b8-6733-463c-9c1e-3fdc3af149c8-kube-api-access-2hmb4\") pod \"e86477b8-6733-463c-9c1e-3fdc3af149c8\" (UID: \"e86477b8-6733-463c-9c1e-3fdc3af149c8\") "
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.882566 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e86477b8-6733-463c-9c1e-3fdc3af149c8-utilities" (OuterVolumeSpecName: "utilities") pod "e86477b8-6733-463c-9c1e-3fdc3af149c8" (UID: "e86477b8-6733-463c-9c1e-3fdc3af149c8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.889259 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e86477b8-6733-463c-9c1e-3fdc3af149c8-kube-api-access-2hmb4" (OuterVolumeSpecName: "kube-api-access-2hmb4") pod "e86477b8-6733-463c-9c1e-3fdc3af149c8" (UID: "e86477b8-6733-463c-9c1e-3fdc3af149c8"). InnerVolumeSpecName "kube-api-access-2hmb4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.936309 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e86477b8-6733-463c-9c1e-3fdc3af149c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e86477b8-6733-463c-9c1e-3fdc3af149c8" (UID: "e86477b8-6733-463c-9c1e-3fdc3af149c8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.945068 4778 generic.go:334] "Generic (PLEG): container finished" podID="e86477b8-6733-463c-9c1e-3fdc3af149c8" containerID="d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810" exitCode=0
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.945121 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9lp4z" event={"ID":"e86477b8-6733-463c-9c1e-3fdc3af149c8","Type":"ContainerDied","Data":"d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810"}
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.945148 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9lp4z" event={"ID":"e86477b8-6733-463c-9c1e-3fdc3af149c8","Type":"ContainerDied","Data":"4b279340952032e1de97141653a8272e40b4b4504477e11e39bed863cada10e3"}
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.945165 4778 scope.go:117] "RemoveContainer" containerID="d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810"
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.945470 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9lp4z"
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.966520 4778 scope.go:117] "RemoveContainer" containerID="cbeee733ae3134d7baede63761e66e19b68dafb5fd927a7f4b8372345ee80ea2"
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.983842 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9lp4z"]
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.985309 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86477b8-6733-463c-9c1e-3fdc3af149c8-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.985337 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86477b8-6733-463c-9c1e-3fdc3af149c8-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.985354 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hmb4\" (UniqueName: \"kubernetes.io/projected/e86477b8-6733-463c-9c1e-3fdc3af149c8-kube-api-access-2hmb4\") on node \"crc\" DevicePath \"\""
Mar 18 10:11:51 crc kubenswrapper[4778]: I0318 10:11:51.993890 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9lp4z"]
Mar 18 10:11:52 crc kubenswrapper[4778]: I0318 10:11:52.008906 4778 scope.go:117] "RemoveContainer" containerID="12c3314ab9e2be8b20c7400b37a6fc9f4c1807b6d14bff081a5532645e86455e"
Mar 18 10:11:52 crc kubenswrapper[4778]: I0318 10:11:52.059363 4778 scope.go:117] "RemoveContainer" containerID="d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810"
Mar 18 10:11:52 crc kubenswrapper[4778]: E0318 10:11:52.060334 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810\": container with ID starting with d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810 not found: ID does not exist" containerID="d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810"
Mar 18 10:11:52 crc kubenswrapper[4778]: I0318 10:11:52.060373 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810"} err="failed to get container status \"d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810\": rpc error: code = NotFound desc = could not find container \"d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810\": container with ID starting with d323b45d579ee74110374535069fdc0d9b69b68fe6b24abe8a23fa8732261810 not found: ID does not exist"
Mar 18 10:11:52 crc kubenswrapper[4778]: I0318 10:11:52.060402 4778 scope.go:117] "RemoveContainer" containerID="cbeee733ae3134d7baede63761e66e19b68dafb5fd927a7f4b8372345ee80ea2"
Mar 18 10:11:52 crc kubenswrapper[4778]: E0318 10:11:52.065434 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbeee733ae3134d7baede63761e66e19b68dafb5fd927a7f4b8372345ee80ea2\": container with ID starting with cbeee733ae3134d7baede63761e66e19b68dafb5fd927a7f4b8372345ee80ea2 not found: ID does not exist" containerID="cbeee733ae3134d7baede63761e66e19b68dafb5fd927a7f4b8372345ee80ea2"
Mar 18 10:11:52 crc kubenswrapper[4778]: I0318 10:11:52.065472 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbeee733ae3134d7baede63761e66e19b68dafb5fd927a7f4b8372345ee80ea2"} err="failed to get container status \"cbeee733ae3134d7baede63761e66e19b68dafb5fd927a7f4b8372345ee80ea2\": rpc error: code = NotFound desc = could not find container \"cbeee733ae3134d7baede63761e66e19b68dafb5fd927a7f4b8372345ee80ea2\": container with ID starting with cbeee733ae3134d7baede63761e66e19b68dafb5fd927a7f4b8372345ee80ea2 not found: ID does not exist"
Mar 18 10:11:52 crc kubenswrapper[4778]: I0318 10:11:52.065493 4778 scope.go:117] "RemoveContainer" containerID="12c3314ab9e2be8b20c7400b37a6fc9f4c1807b6d14bff081a5532645e86455e"
Mar 18 10:11:52 crc kubenswrapper[4778]: E0318 10:11:52.066294 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12c3314ab9e2be8b20c7400b37a6fc9f4c1807b6d14bff081a5532645e86455e\": container with ID starting with 12c3314ab9e2be8b20c7400b37a6fc9f4c1807b6d14bff081a5532645e86455e not found: ID does not exist" containerID="12c3314ab9e2be8b20c7400b37a6fc9f4c1807b6d14bff081a5532645e86455e"
Mar 18 10:11:52 crc kubenswrapper[4778]: I0318 10:11:52.066316 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12c3314ab9e2be8b20c7400b37a6fc9f4c1807b6d14bff081a5532645e86455e"} err="failed to get container status \"12c3314ab9e2be8b20c7400b37a6fc9f4c1807b6d14bff081a5532645e86455e\": rpc error: code = NotFound desc = could not find container \"12c3314ab9e2be8b20c7400b37a6fc9f4c1807b6d14bff081a5532645e86455e\": container with ID starting with 12c3314ab9e2be8b20c7400b37a6fc9f4c1807b6d14bff081a5532645e86455e not found: ID does not exist"
Mar 18 10:11:52 crc kubenswrapper[4778]: I0318 10:11:52.207088 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e86477b8-6733-463c-9c1e-3fdc3af149c8" path="/var/lib/kubelet/pods/e86477b8-6733-463c-9c1e-3fdc3af149c8/volumes"
Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.148181 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.148840 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.156663 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563812-2gvxv"]
Mar 18 10:12:00 crc kubenswrapper[4778]: E0318 10:12:00.157218 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86477b8-6733-463c-9c1e-3fdc3af149c8" containerName="extract-utilities"
Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.157240 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86477b8-6733-463c-9c1e-3fdc3af149c8" containerName="extract-utilities"
Mar 18 10:12:00 crc kubenswrapper[4778]: E0318 10:12:00.157274 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86477b8-6733-463c-9c1e-3fdc3af149c8" containerName="registry-server"
Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.157282 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86477b8-6733-463c-9c1e-3fdc3af149c8" containerName="registry-server"
Mar 18 10:12:00 crc kubenswrapper[4778]: E0318 10:12:00.157302 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86477b8-6733-463c-9c1e-3fdc3af149c8" containerName="extract-content"
Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.157308 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86477b8-6733-463c-9c1e-3fdc3af149c8" containerName="extract-content"
Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.157498 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e86477b8-6733-463c-9c1e-3fdc3af149c8" containerName="registry-server"
Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.158150 4778 util.go:30] "No
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563812-2gvxv" Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.162073 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.162192 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.162981 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.167336 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563812-2gvxv"] Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.263368 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmjcx\" (UniqueName: \"kubernetes.io/projected/88f0d7a0-c27a-48d5-90f6-e7d7de946731-kube-api-access-cmjcx\") pod \"auto-csr-approver-29563812-2gvxv\" (UID: \"88f0d7a0-c27a-48d5-90f6-e7d7de946731\") " pod="openshift-infra/auto-csr-approver-29563812-2gvxv" Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.365684 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmjcx\" (UniqueName: \"kubernetes.io/projected/88f0d7a0-c27a-48d5-90f6-e7d7de946731-kube-api-access-cmjcx\") pod \"auto-csr-approver-29563812-2gvxv\" (UID: \"88f0d7a0-c27a-48d5-90f6-e7d7de946731\") " pod="openshift-infra/auto-csr-approver-29563812-2gvxv" Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.383386 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmjcx\" (UniqueName: \"kubernetes.io/projected/88f0d7a0-c27a-48d5-90f6-e7d7de946731-kube-api-access-cmjcx\") pod \"auto-csr-approver-29563812-2gvxv\" (UID: 
\"88f0d7a0-c27a-48d5-90f6-e7d7de946731\") " pod="openshift-infra/auto-csr-approver-29563812-2gvxv" Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.477306 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563812-2gvxv" Mar 18 10:12:00 crc kubenswrapper[4778]: W0318 10:12:00.972498 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88f0d7a0_c27a_48d5_90f6_e7d7de946731.slice/crio-add7f32d41d027f207d24e12f67bedd31323fb435e1a1b972fdefbeaaf6a36ba WatchSource:0}: Error finding container add7f32d41d027f207d24e12f67bedd31323fb435e1a1b972fdefbeaaf6a36ba: Status 404 returned error can't find the container with id add7f32d41d027f207d24e12f67bedd31323fb435e1a1b972fdefbeaaf6a36ba Mar 18 10:12:00 crc kubenswrapper[4778]: I0318 10:12:00.973551 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563812-2gvxv"] Mar 18 10:12:01 crc kubenswrapper[4778]: I0318 10:12:01.021923 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563812-2gvxv" event={"ID":"88f0d7a0-c27a-48d5-90f6-e7d7de946731","Type":"ContainerStarted","Data":"add7f32d41d027f207d24e12f67bedd31323fb435e1a1b972fdefbeaaf6a36ba"} Mar 18 10:12:03 crc kubenswrapper[4778]: I0318 10:12:03.039683 4778 generic.go:334] "Generic (PLEG): container finished" podID="88f0d7a0-c27a-48d5-90f6-e7d7de946731" containerID="4d2240707a2956bb8da6399edaf60b6df2ea8a136aea3c9f29e332c118bf9bc2" exitCode=0 Mar 18 10:12:03 crc kubenswrapper[4778]: I0318 10:12:03.039724 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563812-2gvxv" event={"ID":"88f0d7a0-c27a-48d5-90f6-e7d7de946731","Type":"ContainerDied","Data":"4d2240707a2956bb8da6399edaf60b6df2ea8a136aea3c9f29e332c118bf9bc2"} Mar 18 10:12:04 crc kubenswrapper[4778]: I0318 10:12:04.557375 4778 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563812-2gvxv" Mar 18 10:12:04 crc kubenswrapper[4778]: I0318 10:12:04.669927 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmjcx\" (UniqueName: \"kubernetes.io/projected/88f0d7a0-c27a-48d5-90f6-e7d7de946731-kube-api-access-cmjcx\") pod \"88f0d7a0-c27a-48d5-90f6-e7d7de946731\" (UID: \"88f0d7a0-c27a-48d5-90f6-e7d7de946731\") " Mar 18 10:12:04 crc kubenswrapper[4778]: I0318 10:12:04.676224 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88f0d7a0-c27a-48d5-90f6-e7d7de946731-kube-api-access-cmjcx" (OuterVolumeSpecName: "kube-api-access-cmjcx") pod "88f0d7a0-c27a-48d5-90f6-e7d7de946731" (UID: "88f0d7a0-c27a-48d5-90f6-e7d7de946731"). InnerVolumeSpecName "kube-api-access-cmjcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:12:04 crc kubenswrapper[4778]: I0318 10:12:04.772784 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmjcx\" (UniqueName: \"kubernetes.io/projected/88f0d7a0-c27a-48d5-90f6-e7d7de946731-kube-api-access-cmjcx\") on node \"crc\" DevicePath \"\"" Mar 18 10:12:05 crc kubenswrapper[4778]: I0318 10:12:05.059826 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563812-2gvxv" event={"ID":"88f0d7a0-c27a-48d5-90f6-e7d7de946731","Type":"ContainerDied","Data":"add7f32d41d027f207d24e12f67bedd31323fb435e1a1b972fdefbeaaf6a36ba"} Mar 18 10:12:05 crc kubenswrapper[4778]: I0318 10:12:05.059865 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="add7f32d41d027f207d24e12f67bedd31323fb435e1a1b972fdefbeaaf6a36ba" Mar 18 10:12:05 crc kubenswrapper[4778]: I0318 10:12:05.059872 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563812-2gvxv" Mar 18 10:12:05 crc kubenswrapper[4778]: I0318 10:12:05.650807 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563806-n2m7x"] Mar 18 10:12:05 crc kubenswrapper[4778]: I0318 10:12:05.659713 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563806-n2m7x"] Mar 18 10:12:06 crc kubenswrapper[4778]: I0318 10:12:06.198955 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbb455c0-90cf-46a9-82c4-1c22d05e007d" path="/var/lib/kubelet/pods/dbb455c0-90cf-46a9-82c4-1c22d05e007d/volumes" Mar 18 10:12:30 crc kubenswrapper[4778]: I0318 10:12:30.147580 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:12:30 crc kubenswrapper[4778]: I0318 10:12:30.148090 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:12:34 crc kubenswrapper[4778]: I0318 10:12:34.476623 4778 scope.go:117] "RemoveContainer" containerID="6b16a1172e80d155110d49b222a6dc20e7b21d0a6a9927e8e5966e673f37ac10" Mar 18 10:12:40 crc kubenswrapper[4778]: I0318 10:12:40.920223 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gzjc8"] Mar 18 10:12:40 crc kubenswrapper[4778]: E0318 10:12:40.921064 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88f0d7a0-c27a-48d5-90f6-e7d7de946731" containerName="oc" Mar 18 10:12:40 crc 
kubenswrapper[4778]: I0318 10:12:40.921075 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="88f0d7a0-c27a-48d5-90f6-e7d7de946731" containerName="oc" Mar 18 10:12:40 crc kubenswrapper[4778]: I0318 10:12:40.921261 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="88f0d7a0-c27a-48d5-90f6-e7d7de946731" containerName="oc" Mar 18 10:12:40 crc kubenswrapper[4778]: I0318 10:12:40.922670 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:40 crc kubenswrapper[4778]: I0318 10:12:40.932810 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzjc8"] Mar 18 10:12:40 crc kubenswrapper[4778]: I0318 10:12:40.961892 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-utilities\") pod \"redhat-marketplace-gzjc8\" (UID: \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\") " pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:40 crc kubenswrapper[4778]: I0318 10:12:40.961943 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r427j\" (UniqueName: \"kubernetes.io/projected/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-kube-api-access-r427j\") pod \"redhat-marketplace-gzjc8\" (UID: \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\") " pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:40 crc kubenswrapper[4778]: I0318 10:12:40.961999 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-catalog-content\") pod \"redhat-marketplace-gzjc8\" (UID: \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\") " pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:41 crc kubenswrapper[4778]: 
I0318 10:12:41.064810 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r427j\" (UniqueName: \"kubernetes.io/projected/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-kube-api-access-r427j\") pod \"redhat-marketplace-gzjc8\" (UID: \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\") " pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:41 crc kubenswrapper[4778]: I0318 10:12:41.064861 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-utilities\") pod \"redhat-marketplace-gzjc8\" (UID: \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\") " pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:41 crc kubenswrapper[4778]: I0318 10:12:41.064920 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-catalog-content\") pod \"redhat-marketplace-gzjc8\" (UID: \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\") " pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:41 crc kubenswrapper[4778]: I0318 10:12:41.065593 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-catalog-content\") pod \"redhat-marketplace-gzjc8\" (UID: \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\") " pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:41 crc kubenswrapper[4778]: I0318 10:12:41.065680 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-utilities\") pod \"redhat-marketplace-gzjc8\" (UID: \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\") " pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:41 crc kubenswrapper[4778]: I0318 10:12:41.084504 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r427j\" (UniqueName: \"kubernetes.io/projected/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-kube-api-access-r427j\") pod \"redhat-marketplace-gzjc8\" (UID: \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\") " pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:41 crc kubenswrapper[4778]: I0318 10:12:41.268480 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:41 crc kubenswrapper[4778]: I0318 10:12:41.806816 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzjc8"] Mar 18 10:12:41 crc kubenswrapper[4778]: I0318 10:12:41.923092 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzjc8" event={"ID":"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd","Type":"ContainerStarted","Data":"2d6f09685cb2b5ee91a046b3d1408a6415873139bc173fe150fd41b97c3dca02"} Mar 18 10:12:42 crc kubenswrapper[4778]: I0318 10:12:42.938167 4778 generic.go:334] "Generic (PLEG): container finished" podID="5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" containerID="b910a84b81fa09ceba83dbc432aca6f0c253723b902df51521768459b726f0db" exitCode=0 Mar 18 10:12:42 crc kubenswrapper[4778]: I0318 10:12:42.938354 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzjc8" event={"ID":"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd","Type":"ContainerDied","Data":"b910a84b81fa09ceba83dbc432aca6f0c253723b902df51521768459b726f0db"} Mar 18 10:12:43 crc kubenswrapper[4778]: I0318 10:12:43.954355 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzjc8" event={"ID":"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd","Type":"ContainerStarted","Data":"0e8c2f81e6cd77b4cab8ee9dbd295b01c8c384d1ebc005fed058a57bc23618e2"} Mar 18 10:12:44 crc kubenswrapper[4778]: I0318 10:12:44.966428 4778 
generic.go:334] "Generic (PLEG): container finished" podID="5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" containerID="0e8c2f81e6cd77b4cab8ee9dbd295b01c8c384d1ebc005fed058a57bc23618e2" exitCode=0 Mar 18 10:12:44 crc kubenswrapper[4778]: I0318 10:12:44.966715 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzjc8" event={"ID":"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd","Type":"ContainerDied","Data":"0e8c2f81e6cd77b4cab8ee9dbd295b01c8c384d1ebc005fed058a57bc23618e2"} Mar 18 10:12:45 crc kubenswrapper[4778]: I0318 10:12:45.978341 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzjc8" event={"ID":"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd","Type":"ContainerStarted","Data":"72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331"} Mar 18 10:12:46 crc kubenswrapper[4778]: I0318 10:12:46.001704 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gzjc8" podStartSLOduration=3.568313438 podStartE2EDuration="6.001685143s" podCreationTimestamp="2026-03-18 10:12:40 +0000 UTC" firstStartedPulling="2026-03-18 10:12:42.940099911 +0000 UTC m=+4229.514844751" lastFinishedPulling="2026-03-18 10:12:45.373471616 +0000 UTC m=+4231.948216456" observedRunningTime="2026-03-18 10:12:45.998434855 +0000 UTC m=+4232.573179705" watchObservedRunningTime="2026-03-18 10:12:46.001685143 +0000 UTC m=+4232.576429973" Mar 18 10:12:51 crc kubenswrapper[4778]: I0318 10:12:51.269502 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:51 crc kubenswrapper[4778]: I0318 10:12:51.271525 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:51 crc kubenswrapper[4778]: I0318 10:12:51.316224 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:52 crc kubenswrapper[4778]: I0318 10:12:52.081061 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:52 crc kubenswrapper[4778]: I0318 10:12:52.129580 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzjc8"] Mar 18 10:12:54 crc kubenswrapper[4778]: I0318 10:12:54.047298 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gzjc8" podUID="5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" containerName="registry-server" containerID="cri-o://72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331" gracePeriod=2 Mar 18 10:12:54 crc kubenswrapper[4778]: I0318 10:12:54.705139 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:54 crc kubenswrapper[4778]: I0318 10:12:54.780540 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-utilities\") pod \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\" (UID: \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\") " Mar 18 10:12:54 crc kubenswrapper[4778]: I0318 10:12:54.780754 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r427j\" (UniqueName: \"kubernetes.io/projected/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-kube-api-access-r427j\") pod \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\" (UID: \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\") " Mar 18 10:12:54 crc kubenswrapper[4778]: I0318 10:12:54.780833 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-catalog-content\") pod 
\"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\" (UID: \"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd\") " Mar 18 10:12:54 crc kubenswrapper[4778]: I0318 10:12:54.783086 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-utilities" (OuterVolumeSpecName: "utilities") pod "5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" (UID: "5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:12:54 crc kubenswrapper[4778]: I0318 10:12:54.804260 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-kube-api-access-r427j" (OuterVolumeSpecName: "kube-api-access-r427j") pod "5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" (UID: "5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd"). InnerVolumeSpecName "kube-api-access-r427j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:12:54 crc kubenswrapper[4778]: I0318 10:12:54.826594 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" (UID: "5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:12:54 crc kubenswrapper[4778]: I0318 10:12:54.883343 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r427j\" (UniqueName: \"kubernetes.io/projected/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-kube-api-access-r427j\") on node \"crc\" DevicePath \"\"" Mar 18 10:12:54 crc kubenswrapper[4778]: I0318 10:12:54.883380 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:12:54 crc kubenswrapper[4778]: I0318 10:12:54.883388 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.061886 4778 generic.go:334] "Generic (PLEG): container finished" podID="5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" containerID="72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331" exitCode=0 Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.061932 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzjc8" event={"ID":"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd","Type":"ContainerDied","Data":"72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331"} Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.061961 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzjc8" event={"ID":"5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd","Type":"ContainerDied","Data":"2d6f09685cb2b5ee91a046b3d1408a6415873139bc173fe150fd41b97c3dca02"} Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.061975 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzjc8" Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.061981 4778 scope.go:117] "RemoveContainer" containerID="72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331" Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.078379 4778 scope.go:117] "RemoveContainer" containerID="0e8c2f81e6cd77b4cab8ee9dbd295b01c8c384d1ebc005fed058a57bc23618e2" Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.095355 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzjc8"] Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.105361 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzjc8"] Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.129915 4778 scope.go:117] "RemoveContainer" containerID="b910a84b81fa09ceba83dbc432aca6f0c253723b902df51521768459b726f0db" Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.158622 4778 scope.go:117] "RemoveContainer" containerID="72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331" Mar 18 10:12:55 crc kubenswrapper[4778]: E0318 10:12:55.159213 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331\": container with ID starting with 72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331 not found: ID does not exist" containerID="72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331" Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.159256 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331"} err="failed to get container status \"72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331\": rpc error: code = NotFound desc = could not find container 
\"72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331\": container with ID starting with 72256312e778fd5886cbd81f7a0c20d4e136f5daeba2cd46e42ebd223cbbf331 not found: ID does not exist" Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.159281 4778 scope.go:117] "RemoveContainer" containerID="0e8c2f81e6cd77b4cab8ee9dbd295b01c8c384d1ebc005fed058a57bc23618e2" Mar 18 10:12:55 crc kubenswrapper[4778]: E0318 10:12:55.159595 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e8c2f81e6cd77b4cab8ee9dbd295b01c8c384d1ebc005fed058a57bc23618e2\": container with ID starting with 0e8c2f81e6cd77b4cab8ee9dbd295b01c8c384d1ebc005fed058a57bc23618e2 not found: ID does not exist" containerID="0e8c2f81e6cd77b4cab8ee9dbd295b01c8c384d1ebc005fed058a57bc23618e2" Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.159620 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e8c2f81e6cd77b4cab8ee9dbd295b01c8c384d1ebc005fed058a57bc23618e2"} err="failed to get container status \"0e8c2f81e6cd77b4cab8ee9dbd295b01c8c384d1ebc005fed058a57bc23618e2\": rpc error: code = NotFound desc = could not find container \"0e8c2f81e6cd77b4cab8ee9dbd295b01c8c384d1ebc005fed058a57bc23618e2\": container with ID starting with 0e8c2f81e6cd77b4cab8ee9dbd295b01c8c384d1ebc005fed058a57bc23618e2 not found: ID does not exist" Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.159635 4778 scope.go:117] "RemoveContainer" containerID="b910a84b81fa09ceba83dbc432aca6f0c253723b902df51521768459b726f0db" Mar 18 10:12:55 crc kubenswrapper[4778]: E0318 10:12:55.159846 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b910a84b81fa09ceba83dbc432aca6f0c253723b902df51521768459b726f0db\": container with ID starting with b910a84b81fa09ceba83dbc432aca6f0c253723b902df51521768459b726f0db not found: ID does not exist" 
containerID="b910a84b81fa09ceba83dbc432aca6f0c253723b902df51521768459b726f0db" Mar 18 10:12:55 crc kubenswrapper[4778]: I0318 10:12:55.159878 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b910a84b81fa09ceba83dbc432aca6f0c253723b902df51521768459b726f0db"} err="failed to get container status \"b910a84b81fa09ceba83dbc432aca6f0c253723b902df51521768459b726f0db\": rpc error: code = NotFound desc = could not find container \"b910a84b81fa09ceba83dbc432aca6f0c253723b902df51521768459b726f0db\": container with ID starting with b910a84b81fa09ceba83dbc432aca6f0c253723b902df51521768459b726f0db not found: ID does not exist" Mar 18 10:12:56 crc kubenswrapper[4778]: I0318 10:12:56.203920 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" path="/var/lib/kubelet/pods/5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd/volumes" Mar 18 10:13:00 crc kubenswrapper[4778]: I0318 10:13:00.148182 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:13:00 crc kubenswrapper[4778]: I0318 10:13:00.148825 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:13:00 crc kubenswrapper[4778]: I0318 10:13:00.148878 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 10:13:00 crc kubenswrapper[4778]: I0318 10:13:00.149810 4778 kuberuntime_manager.go:1027] "Message for Container of 
pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d5e0db80372568acf47512b0578afc67f28216d99c8be88c569e735224dbd2f3"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 10:13:00 crc kubenswrapper[4778]: I0318 10:13:00.149889 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://d5e0db80372568acf47512b0578afc67f28216d99c8be88c569e735224dbd2f3" gracePeriod=600 Mar 18 10:13:01 crc kubenswrapper[4778]: I0318 10:13:01.124645 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="d5e0db80372568acf47512b0578afc67f28216d99c8be88c569e735224dbd2f3" exitCode=0 Mar 18 10:13:01 crc kubenswrapper[4778]: I0318 10:13:01.124683 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"d5e0db80372568acf47512b0578afc67f28216d99c8be88c569e735224dbd2f3"} Mar 18 10:13:01 crc kubenswrapper[4778]: I0318 10:13:01.125247 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"} Mar 18 10:13:01 crc kubenswrapper[4778]: I0318 10:13:01.125272 4778 scope.go:117] "RemoveContainer" containerID="87ace7d842691f42df1dda2447fd9c3904261b343143ae6df3e97cc4d860a190" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.140945 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563814-9vhtv"] Mar 18 10:14:00 
crc kubenswrapper[4778]: E0318 10:14:00.141922 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" containerName="extract-content" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.141939 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" containerName="extract-content" Mar 18 10:14:00 crc kubenswrapper[4778]: E0318 10:14:00.141968 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" containerName="registry-server" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.141976 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" containerName="registry-server" Mar 18 10:14:00 crc kubenswrapper[4778]: E0318 10:14:00.141991 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" containerName="extract-utilities" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.141998 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" containerName="extract-utilities" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.142246 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e991b08-0ce3-42f8-bba4-3c3ddc0ee4bd" containerName="registry-server" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.143041 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563814-9vhtv" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.145566 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.145587 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.145677 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.162776 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563814-9vhtv"] Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.278342 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb8jp\" (UniqueName: \"kubernetes.io/projected/96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9-kube-api-access-nb8jp\") pod \"auto-csr-approver-29563814-9vhtv\" (UID: \"96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9\") " pod="openshift-infra/auto-csr-approver-29563814-9vhtv" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.380537 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb8jp\" (UniqueName: \"kubernetes.io/projected/96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9-kube-api-access-nb8jp\") pod \"auto-csr-approver-29563814-9vhtv\" (UID: \"96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9\") " pod="openshift-infra/auto-csr-approver-29563814-9vhtv" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.403327 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb8jp\" (UniqueName: \"kubernetes.io/projected/96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9-kube-api-access-nb8jp\") pod \"auto-csr-approver-29563814-9vhtv\" (UID: \"96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9\") " 
pod="openshift-infra/auto-csr-approver-29563814-9vhtv" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.462802 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563814-9vhtv" Mar 18 10:14:00 crc kubenswrapper[4778]: I0318 10:14:00.931180 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563814-9vhtv"] Mar 18 10:14:01 crc kubenswrapper[4778]: I0318 10:14:01.804447 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563814-9vhtv" event={"ID":"96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9","Type":"ContainerStarted","Data":"7cf46fde82111a0308d19b744c884483b271af70285848283badab4bdb621026"} Mar 18 10:14:02 crc kubenswrapper[4778]: I0318 10:14:02.814055 4778 generic.go:334] "Generic (PLEG): container finished" podID="96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9" containerID="75cf773e9ab812c93a6cb361a559a8ad1ac8bf63ad2f27eb51c3d0a96daa619d" exitCode=0 Mar 18 10:14:02 crc kubenswrapper[4778]: I0318 10:14:02.814151 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563814-9vhtv" event={"ID":"96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9","Type":"ContainerDied","Data":"75cf773e9ab812c93a6cb361a559a8ad1ac8bf63ad2f27eb51c3d0a96daa619d"} Mar 18 10:14:04 crc kubenswrapper[4778]: I0318 10:14:04.296842 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563814-9vhtv" Mar 18 10:14:04 crc kubenswrapper[4778]: I0318 10:14:04.370477 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb8jp\" (UniqueName: \"kubernetes.io/projected/96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9-kube-api-access-nb8jp\") pod \"96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9\" (UID: \"96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9\") " Mar 18 10:14:04 crc kubenswrapper[4778]: I0318 10:14:04.376526 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9-kube-api-access-nb8jp" (OuterVolumeSpecName: "kube-api-access-nb8jp") pod "96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9" (UID: "96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9"). InnerVolumeSpecName "kube-api-access-nb8jp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:04 crc kubenswrapper[4778]: I0318 10:14:04.472377 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb8jp\" (UniqueName: \"kubernetes.io/projected/96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9-kube-api-access-nb8jp\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:04 crc kubenswrapper[4778]: I0318 10:14:04.834342 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563814-9vhtv" event={"ID":"96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9","Type":"ContainerDied","Data":"7cf46fde82111a0308d19b744c884483b271af70285848283badab4bdb621026"} Mar 18 10:14:04 crc kubenswrapper[4778]: I0318 10:14:04.834397 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cf46fde82111a0308d19b744c884483b271af70285848283badab4bdb621026" Mar 18 10:14:04 crc kubenswrapper[4778]: I0318 10:14:04.834518 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563814-9vhtv" Mar 18 10:14:05 crc kubenswrapper[4778]: I0318 10:14:05.384466 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563808-8zn6m"] Mar 18 10:14:05 crc kubenswrapper[4778]: I0318 10:14:05.394397 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563808-8zn6m"] Mar 18 10:14:06 crc kubenswrapper[4778]: I0318 10:14:06.216989 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a5c01e2-3264-47f4-8081-48235752ef32" path="/var/lib/kubelet/pods/0a5c01e2-3264-47f4-8081-48235752ef32/volumes" Mar 18 10:14:34 crc kubenswrapper[4778]: I0318 10:14:34.621679 4778 scope.go:117] "RemoveContainer" containerID="763e05415a17628575fc7b5a79a4b0a9348cfd1deec11024a98e0e749d405535" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.147503 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.149662 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.154998 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf"] Mar 18 10:15:00 crc kubenswrapper[4778]: E0318 10:15:00.155393 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9" containerName="oc" Mar 18 
10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.155411 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9" containerName="oc" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.155651 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9" containerName="oc" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.156342 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.160908 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.161471 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.169161 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf"] Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.281570 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wvrq\" (UniqueName: \"kubernetes.io/projected/1a12e64d-d433-4f42-8aa6-cd1de264b346-kube-api-access-8wvrq\") pod \"collect-profiles-29563815-xdqjf\" (UID: \"1a12e64d-d433-4f42-8aa6-cd1de264b346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.282043 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a12e64d-d433-4f42-8aa6-cd1de264b346-config-volume\") pod \"collect-profiles-29563815-xdqjf\" (UID: \"1a12e64d-d433-4f42-8aa6-cd1de264b346\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.282102 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a12e64d-d433-4f42-8aa6-cd1de264b346-secret-volume\") pod \"collect-profiles-29563815-xdqjf\" (UID: \"1a12e64d-d433-4f42-8aa6-cd1de264b346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.383941 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a12e64d-d433-4f42-8aa6-cd1de264b346-config-volume\") pod \"collect-profiles-29563815-xdqjf\" (UID: \"1a12e64d-d433-4f42-8aa6-cd1de264b346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.383999 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a12e64d-d433-4f42-8aa6-cd1de264b346-secret-volume\") pod \"collect-profiles-29563815-xdqjf\" (UID: \"1a12e64d-d433-4f42-8aa6-cd1de264b346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.384050 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wvrq\" (UniqueName: \"kubernetes.io/projected/1a12e64d-d433-4f42-8aa6-cd1de264b346-kube-api-access-8wvrq\") pod \"collect-profiles-29563815-xdqjf\" (UID: \"1a12e64d-d433-4f42-8aa6-cd1de264b346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.385401 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/1a12e64d-d433-4f42-8aa6-cd1de264b346-config-volume\") pod \"collect-profiles-29563815-xdqjf\" (UID: \"1a12e64d-d433-4f42-8aa6-cd1de264b346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.393069 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a12e64d-d433-4f42-8aa6-cd1de264b346-secret-volume\") pod \"collect-profiles-29563815-xdqjf\" (UID: \"1a12e64d-d433-4f42-8aa6-cd1de264b346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.401505 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wvrq\" (UniqueName: \"kubernetes.io/projected/1a12e64d-d433-4f42-8aa6-cd1de264b346-kube-api-access-8wvrq\") pod \"collect-profiles-29563815-xdqjf\" (UID: \"1a12e64d-d433-4f42-8aa6-cd1de264b346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:00 crc kubenswrapper[4778]: I0318 10:15:00.480466 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:01 crc kubenswrapper[4778]: I0318 10:15:01.008383 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf"] Mar 18 10:15:01 crc kubenswrapper[4778]: I0318 10:15:01.340766 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" event={"ID":"1a12e64d-d433-4f42-8aa6-cd1de264b346","Type":"ContainerStarted","Data":"4c308b5ed19066acb80d31a8263c3f25bb04c0935256cdfa497ae0b275b40ad3"} Mar 18 10:15:01 crc kubenswrapper[4778]: I0318 10:15:01.341173 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" event={"ID":"1a12e64d-d433-4f42-8aa6-cd1de264b346","Type":"ContainerStarted","Data":"228cb5dd40dce2c37284b46c1282d4bbad64cdc2aaf29b59be8be0c202159e59"} Mar 18 10:15:01 crc kubenswrapper[4778]: I0318 10:15:01.363368 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" podStartSLOduration=1.363345749 podStartE2EDuration="1.363345749s" podCreationTimestamp="2026-03-18 10:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:15:01.356953985 +0000 UTC m=+4367.931698835" watchObservedRunningTime="2026-03-18 10:15:01.363345749 +0000 UTC m=+4367.938090609" Mar 18 10:15:02 crc kubenswrapper[4778]: I0318 10:15:02.351281 4778 generic.go:334] "Generic (PLEG): container finished" podID="1a12e64d-d433-4f42-8aa6-cd1de264b346" containerID="4c308b5ed19066acb80d31a8263c3f25bb04c0935256cdfa497ae0b275b40ad3" exitCode=0 Mar 18 10:15:02 crc kubenswrapper[4778]: I0318 10:15:02.351379 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" event={"ID":"1a12e64d-d433-4f42-8aa6-cd1de264b346","Type":"ContainerDied","Data":"4c308b5ed19066acb80d31a8263c3f25bb04c0935256cdfa497ae0b275b40ad3"} Mar 18 10:15:03 crc kubenswrapper[4778]: I0318 10:15:03.957341 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.062640 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a12e64d-d433-4f42-8aa6-cd1de264b346-secret-volume\") pod \"1a12e64d-d433-4f42-8aa6-cd1de264b346\" (UID: \"1a12e64d-d433-4f42-8aa6-cd1de264b346\") " Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.062710 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a12e64d-d433-4f42-8aa6-cd1de264b346-config-volume\") pod \"1a12e64d-d433-4f42-8aa6-cd1de264b346\" (UID: \"1a12e64d-d433-4f42-8aa6-cd1de264b346\") " Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.062797 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wvrq\" (UniqueName: \"kubernetes.io/projected/1a12e64d-d433-4f42-8aa6-cd1de264b346-kube-api-access-8wvrq\") pod \"1a12e64d-d433-4f42-8aa6-cd1de264b346\" (UID: \"1a12e64d-d433-4f42-8aa6-cd1de264b346\") " Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.063765 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a12e64d-d433-4f42-8aa6-cd1de264b346-config-volume" (OuterVolumeSpecName: "config-volume") pod "1a12e64d-d433-4f42-8aa6-cd1de264b346" (UID: "1a12e64d-d433-4f42-8aa6-cd1de264b346"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.069078 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a12e64d-d433-4f42-8aa6-cd1de264b346-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1a12e64d-d433-4f42-8aa6-cd1de264b346" (UID: "1a12e64d-d433-4f42-8aa6-cd1de264b346"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.069278 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a12e64d-d433-4f42-8aa6-cd1de264b346-kube-api-access-8wvrq" (OuterVolumeSpecName: "kube-api-access-8wvrq") pod "1a12e64d-d433-4f42-8aa6-cd1de264b346" (UID: "1a12e64d-d433-4f42-8aa6-cd1de264b346"). InnerVolumeSpecName "kube-api-access-8wvrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.165303 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a12e64d-d433-4f42-8aa6-cd1de264b346-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.165601 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a12e64d-d433-4f42-8aa6-cd1de264b346-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.165612 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wvrq\" (UniqueName: \"kubernetes.io/projected/1a12e64d-d433-4f42-8aa6-cd1de264b346-kube-api-access-8wvrq\") on node \"crc\" DevicePath \"\"" Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.376801 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" 
event={"ID":"1a12e64d-d433-4f42-8aa6-cd1de264b346","Type":"ContainerDied","Data":"228cb5dd40dce2c37284b46c1282d4bbad64cdc2aaf29b59be8be0c202159e59"} Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.376850 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="228cb5dd40dce2c37284b46c1282d4bbad64cdc2aaf29b59be8be0c202159e59" Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.376877 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf" Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.450655 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq"] Mar 18 10:15:04 crc kubenswrapper[4778]: I0318 10:15:04.459398 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563770-zldnq"] Mar 18 10:15:06 crc kubenswrapper[4778]: I0318 10:15:06.198311 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6b2bddb-d94d-426e-bc18-8b864785e323" path="/var/lib/kubelet/pods/b6b2bddb-d94d-426e-bc18-8b864785e323/volumes" Mar 18 10:15:30 crc kubenswrapper[4778]: I0318 10:15:30.147592 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:15:30 crc kubenswrapper[4778]: I0318 10:15:30.148306 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:15:34 crc 
kubenswrapper[4778]: I0318 10:15:34.702551 4778 scope.go:117] "RemoveContainer" containerID="04359ca445cb3566112d245be577eaabe4ab24e27a18fca03074e13b6e3b403f" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.147085 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.148909 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.147640 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563816-ljgbz"] Mar 18 10:16:00 crc kubenswrapper[4778]: E0318 10:16:00.149547 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a12e64d-d433-4f42-8aa6-cd1de264b346" containerName="collect-profiles" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.149581 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a12e64d-d433-4f42-8aa6-cd1de264b346" containerName="collect-profiles" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.149818 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a12e64d-d433-4f42-8aa6-cd1de264b346" containerName="collect-profiles" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.150533 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.150690 4778 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563816-ljgbz" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.151475 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.151535 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce" gracePeriod=600 Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.152892 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.153022 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.154276 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.166935 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563816-ljgbz"] Mar 18 10:16:00 crc kubenswrapper[4778]: E0318 10:16:00.275797 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.318598 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nll7f\" (UniqueName: \"kubernetes.io/projected/62302aba-bf34-4318-9599-2752789a925f-kube-api-access-nll7f\") pod \"auto-csr-approver-29563816-ljgbz\" (UID: \"62302aba-bf34-4318-9599-2752789a925f\") " pod="openshift-infra/auto-csr-approver-29563816-ljgbz" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.420384 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nll7f\" (UniqueName: \"kubernetes.io/projected/62302aba-bf34-4318-9599-2752789a925f-kube-api-access-nll7f\") pod \"auto-csr-approver-29563816-ljgbz\" (UID: \"62302aba-bf34-4318-9599-2752789a925f\") " pod="openshift-infra/auto-csr-approver-29563816-ljgbz" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.443014 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nll7f\" (UniqueName: \"kubernetes.io/projected/62302aba-bf34-4318-9599-2752789a925f-kube-api-access-nll7f\") pod \"auto-csr-approver-29563816-ljgbz\" (UID: \"62302aba-bf34-4318-9599-2752789a925f\") " pod="openshift-infra/auto-csr-approver-29563816-ljgbz" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.473755 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563816-ljgbz" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.894045 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce" exitCode=0 Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.894447 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"} Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.894513 4778 scope.go:117] "RemoveContainer" containerID="d5e0db80372568acf47512b0578afc67f28216d99c8be88c569e735224dbd2f3" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.895424 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce" Mar 18 10:16:00 crc kubenswrapper[4778]: E0318 10:16:00.895866 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:16:00 crc kubenswrapper[4778]: I0318 10:16:00.960328 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563816-ljgbz"] Mar 18 10:16:01 crc kubenswrapper[4778]: I0318 10:16:01.906049 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563816-ljgbz" 
event={"ID":"62302aba-bf34-4318-9599-2752789a925f","Type":"ContainerStarted","Data":"da2f06e59903549c6e097532a76a034c46381787a357883f630ac75141535355"} Mar 18 10:16:02 crc kubenswrapper[4778]: I0318 10:16:02.916598 4778 generic.go:334] "Generic (PLEG): container finished" podID="62302aba-bf34-4318-9599-2752789a925f" containerID="86539332f3b2bee69c9852d9c08bf1b20f84cb5d7d5b3975360dc3cdaf5134cb" exitCode=0 Mar 18 10:16:02 crc kubenswrapper[4778]: I0318 10:16:02.916655 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563816-ljgbz" event={"ID":"62302aba-bf34-4318-9599-2752789a925f","Type":"ContainerDied","Data":"86539332f3b2bee69c9852d9c08bf1b20f84cb5d7d5b3975360dc3cdaf5134cb"} Mar 18 10:16:04 crc kubenswrapper[4778]: I0318 10:16:04.430439 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563816-ljgbz" Mar 18 10:16:04 crc kubenswrapper[4778]: I0318 10:16:04.507291 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nll7f\" (UniqueName: \"kubernetes.io/projected/62302aba-bf34-4318-9599-2752789a925f-kube-api-access-nll7f\") pod \"62302aba-bf34-4318-9599-2752789a925f\" (UID: \"62302aba-bf34-4318-9599-2752789a925f\") " Mar 18 10:16:04 crc kubenswrapper[4778]: I0318 10:16:04.519883 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62302aba-bf34-4318-9599-2752789a925f-kube-api-access-nll7f" (OuterVolumeSpecName: "kube-api-access-nll7f") pod "62302aba-bf34-4318-9599-2752789a925f" (UID: "62302aba-bf34-4318-9599-2752789a925f"). InnerVolumeSpecName "kube-api-access-nll7f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:16:04 crc kubenswrapper[4778]: I0318 10:16:04.610015 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nll7f\" (UniqueName: \"kubernetes.io/projected/62302aba-bf34-4318-9599-2752789a925f-kube-api-access-nll7f\") on node \"crc\" DevicePath \"\"" Mar 18 10:16:04 crc kubenswrapper[4778]: I0318 10:16:04.934639 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563816-ljgbz" event={"ID":"62302aba-bf34-4318-9599-2752789a925f","Type":"ContainerDied","Data":"da2f06e59903549c6e097532a76a034c46381787a357883f630ac75141535355"} Mar 18 10:16:04 crc kubenswrapper[4778]: I0318 10:16:04.934683 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563816-ljgbz" Mar 18 10:16:04 crc kubenswrapper[4778]: I0318 10:16:04.934686 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da2f06e59903549c6e097532a76a034c46381787a357883f630ac75141535355" Mar 18 10:16:05 crc kubenswrapper[4778]: I0318 10:16:05.513986 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563810-gbsth"] Mar 18 10:16:05 crc kubenswrapper[4778]: I0318 10:16:05.521721 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563810-gbsth"] Mar 18 10:16:06 crc kubenswrapper[4778]: I0318 10:16:06.199428 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2da82992-5b46-450f-9fe2-fb1aab2e40a5" path="/var/lib/kubelet/pods/2da82992-5b46-450f-9fe2-fb1aab2e40a5/volumes" Mar 18 10:16:15 crc kubenswrapper[4778]: I0318 10:16:15.187659 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce" Mar 18 10:16:15 crc kubenswrapper[4778]: E0318 10:16:15.188472 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:16:30 crc kubenswrapper[4778]: I0318 10:16:30.190291 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce" Mar 18 10:16:30 crc kubenswrapper[4778]: E0318 10:16:30.191537 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:16:34 crc kubenswrapper[4778]: I0318 10:16:34.818098 4778 scope.go:117] "RemoveContainer" containerID="92b3bdd08fed961c28977d075899ad8197ec334db09149cee7b1c8a99c5b48b6" Mar 18 10:16:41 crc kubenswrapper[4778]: I0318 10:16:41.187522 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce" Mar 18 10:16:41 crc kubenswrapper[4778]: E0318 10:16:41.189442 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:16:49 crc kubenswrapper[4778]: I0318 10:16:49.793059 4778 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-gb2nl"] Mar 18 10:16:49 crc kubenswrapper[4778]: E0318 10:16:49.794176 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62302aba-bf34-4318-9599-2752789a925f" containerName="oc" Mar 18 10:16:49 crc kubenswrapper[4778]: I0318 10:16:49.794194 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="62302aba-bf34-4318-9599-2752789a925f" containerName="oc" Mar 18 10:16:49 crc kubenswrapper[4778]: I0318 10:16:49.794464 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="62302aba-bf34-4318-9599-2752789a925f" containerName="oc" Mar 18 10:16:49 crc kubenswrapper[4778]: I0318 10:16:49.796155 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:16:49 crc kubenswrapper[4778]: I0318 10:16:49.804613 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gb2nl"] Mar 18 10:16:49 crc kubenswrapper[4778]: I0318 10:16:49.942679 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be5e794e-a8d6-4d21-9456-03d0a7a34846-utilities\") pod \"redhat-operators-gb2nl\" (UID: \"be5e794e-a8d6-4d21-9456-03d0a7a34846\") " pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:16:49 crc kubenswrapper[4778]: I0318 10:16:49.942850 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hxtw\" (UniqueName: \"kubernetes.io/projected/be5e794e-a8d6-4d21-9456-03d0a7a34846-kube-api-access-5hxtw\") pod \"redhat-operators-gb2nl\" (UID: \"be5e794e-a8d6-4d21-9456-03d0a7a34846\") " pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:16:49 crc kubenswrapper[4778]: I0318 10:16:49.943101 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/be5e794e-a8d6-4d21-9456-03d0a7a34846-catalog-content\") pod \"redhat-operators-gb2nl\" (UID: \"be5e794e-a8d6-4d21-9456-03d0a7a34846\") " pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:16:50 crc kubenswrapper[4778]: I0318 10:16:50.045559 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be5e794e-a8d6-4d21-9456-03d0a7a34846-catalog-content\") pod \"redhat-operators-gb2nl\" (UID: \"be5e794e-a8d6-4d21-9456-03d0a7a34846\") " pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:16:50 crc kubenswrapper[4778]: I0318 10:16:50.046012 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be5e794e-a8d6-4d21-9456-03d0a7a34846-utilities\") pod \"redhat-operators-gb2nl\" (UID: \"be5e794e-a8d6-4d21-9456-03d0a7a34846\") " pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:16:50 crc kubenswrapper[4778]: I0318 10:16:50.046114 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hxtw\" (UniqueName: \"kubernetes.io/projected/be5e794e-a8d6-4d21-9456-03d0a7a34846-kube-api-access-5hxtw\") pod \"redhat-operators-gb2nl\" (UID: \"be5e794e-a8d6-4d21-9456-03d0a7a34846\") " pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:16:50 crc kubenswrapper[4778]: I0318 10:16:50.046148 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be5e794e-a8d6-4d21-9456-03d0a7a34846-catalog-content\") pod \"redhat-operators-gb2nl\" (UID: \"be5e794e-a8d6-4d21-9456-03d0a7a34846\") " pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:16:50 crc kubenswrapper[4778]: I0318 10:16:50.046451 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/be5e794e-a8d6-4d21-9456-03d0a7a34846-utilities\") pod \"redhat-operators-gb2nl\" (UID: \"be5e794e-a8d6-4d21-9456-03d0a7a34846\") " pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:16:50 crc kubenswrapper[4778]: I0318 10:16:50.191152 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hxtw\" (UniqueName: \"kubernetes.io/projected/be5e794e-a8d6-4d21-9456-03d0a7a34846-kube-api-access-5hxtw\") pod \"redhat-operators-gb2nl\" (UID: \"be5e794e-a8d6-4d21-9456-03d0a7a34846\") " pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:16:50 crc kubenswrapper[4778]: I0318 10:16:50.417638 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:16:50 crc kubenswrapper[4778]: I0318 10:16:50.907056 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gb2nl"] Mar 18 10:16:51 crc kubenswrapper[4778]: I0318 10:16:51.370621 4778 generic.go:334] "Generic (PLEG): container finished" podID="be5e794e-a8d6-4d21-9456-03d0a7a34846" containerID="796cab91ff6fbb49e54726dd9952315382dd5cd8a3e3b4a9c4c1be8c6aeb4999" exitCode=0 Mar 18 10:16:51 crc kubenswrapper[4778]: I0318 10:16:51.370820 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gb2nl" event={"ID":"be5e794e-a8d6-4d21-9456-03d0a7a34846","Type":"ContainerDied","Data":"796cab91ff6fbb49e54726dd9952315382dd5cd8a3e3b4a9c4c1be8c6aeb4999"} Mar 18 10:16:51 crc kubenswrapper[4778]: I0318 10:16:51.370895 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gb2nl" event={"ID":"be5e794e-a8d6-4d21-9456-03d0a7a34846","Type":"ContainerStarted","Data":"129f20c68023ea235efa0219ab242d6804e3b67294ead1332540e57fdf205fd8"} Mar 18 10:16:51 crc kubenswrapper[4778]: I0318 10:16:51.372546 4778 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Mar 18 10:16:53 crc kubenswrapper[4778]: I0318 10:16:53.411516 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gb2nl" event={"ID":"be5e794e-a8d6-4d21-9456-03d0a7a34846","Type":"ContainerStarted","Data":"1ef45fdbc141ce7edb428dc296bb95b2a43f745bec76022c5de1609b6cb5f2e3"} Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.191138 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce" Mar 18 10:16:56 crc kubenswrapper[4778]: E0318 10:16:56.192658 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.589271 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8d89d"] Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.592872 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.616535 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8d89d"] Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.679448 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-utilities\") pod \"certified-operators-8d89d\" (UID: \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\") " pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.679593 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfwxg\" (UniqueName: \"kubernetes.io/projected/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-kube-api-access-sfwxg\") pod \"certified-operators-8d89d\" (UID: \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\") " pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.679673 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-catalog-content\") pod \"certified-operators-8d89d\" (UID: \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\") " pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.781830 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-utilities\") pod \"certified-operators-8d89d\" (UID: \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\") " pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.781961 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sfwxg\" (UniqueName: \"kubernetes.io/projected/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-kube-api-access-sfwxg\") pod \"certified-operators-8d89d\" (UID: \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\") " pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.782031 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-catalog-content\") pod \"certified-operators-8d89d\" (UID: \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\") " pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.782304 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-utilities\") pod \"certified-operators-8d89d\" (UID: \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\") " pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.782421 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-catalog-content\") pod \"certified-operators-8d89d\" (UID: \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\") " pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.802451 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfwxg\" (UniqueName: \"kubernetes.io/projected/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-kube-api-access-sfwxg\") pod \"certified-operators-8d89d\" (UID: \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\") " pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:16:56 crc kubenswrapper[4778]: I0318 10:16:56.940346 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:16:57 crc kubenswrapper[4778]: I0318 10:16:57.625089 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8d89d"] Mar 18 10:16:58 crc kubenswrapper[4778]: I0318 10:16:58.474589 4778 generic.go:334] "Generic (PLEG): container finished" podID="be5e794e-a8d6-4d21-9456-03d0a7a34846" containerID="1ef45fdbc141ce7edb428dc296bb95b2a43f745bec76022c5de1609b6cb5f2e3" exitCode=0 Mar 18 10:16:58 crc kubenswrapper[4778]: I0318 10:16:58.474650 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gb2nl" event={"ID":"be5e794e-a8d6-4d21-9456-03d0a7a34846","Type":"ContainerDied","Data":"1ef45fdbc141ce7edb428dc296bb95b2a43f745bec76022c5de1609b6cb5f2e3"} Mar 18 10:16:58 crc kubenswrapper[4778]: I0318 10:16:58.476365 4778 generic.go:334] "Generic (PLEG): container finished" podID="331c7aeb-0ba9-437d-b1aa-df9880d3f53d" containerID="16e14b18ce5d99d4bf62f995ec0986acb244f3c54b40176f880b970232b47c61" exitCode=0 Mar 18 10:16:58 crc kubenswrapper[4778]: I0318 10:16:58.476395 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d89d" event={"ID":"331c7aeb-0ba9-437d-b1aa-df9880d3f53d","Type":"ContainerDied","Data":"16e14b18ce5d99d4bf62f995ec0986acb244f3c54b40176f880b970232b47c61"} Mar 18 10:16:58 crc kubenswrapper[4778]: I0318 10:16:58.476419 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d89d" event={"ID":"331c7aeb-0ba9-437d-b1aa-df9880d3f53d","Type":"ContainerStarted","Data":"8676964db6e545f0e4da12fd8e93365fda854a811aa027e80da8717a823aad80"} Mar 18 10:17:00 crc kubenswrapper[4778]: I0318 10:17:00.494573 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gb2nl" 
event={"ID":"be5e794e-a8d6-4d21-9456-03d0a7a34846","Type":"ContainerStarted","Data":"429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75"} Mar 18 10:17:00 crc kubenswrapper[4778]: I0318 10:17:00.498554 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d89d" event={"ID":"331c7aeb-0ba9-437d-b1aa-df9880d3f53d","Type":"ContainerStarted","Data":"b174d9801b7aa8378e68651c33f14141cf83af4616ac0d44c4331c4c0f2958ac"} Mar 18 10:17:00 crc kubenswrapper[4778]: I0318 10:17:00.525514 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gb2nl" podStartSLOduration=3.994174304 podStartE2EDuration="11.525490246s" podCreationTimestamp="2026-03-18 10:16:49 +0000 UTC" firstStartedPulling="2026-03-18 10:16:51.37231964 +0000 UTC m=+4477.947064480" lastFinishedPulling="2026-03-18 10:16:58.903635582 +0000 UTC m=+4485.478380422" observedRunningTime="2026-03-18 10:17:00.513344497 +0000 UTC m=+4487.088089347" watchObservedRunningTime="2026-03-18 10:17:00.525490246 +0000 UTC m=+4487.100235096" Mar 18 10:17:01 crc kubenswrapper[4778]: I0318 10:17:01.509803 4778 generic.go:334] "Generic (PLEG): container finished" podID="331c7aeb-0ba9-437d-b1aa-df9880d3f53d" containerID="b174d9801b7aa8378e68651c33f14141cf83af4616ac0d44c4331c4c0f2958ac" exitCode=0 Mar 18 10:17:01 crc kubenswrapper[4778]: I0318 10:17:01.509865 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d89d" event={"ID":"331c7aeb-0ba9-437d-b1aa-df9880d3f53d","Type":"ContainerDied","Data":"b174d9801b7aa8378e68651c33f14141cf83af4616ac0d44c4331c4c0f2958ac"} Mar 18 10:17:02 crc kubenswrapper[4778]: I0318 10:17:02.520355 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d89d" 
event={"ID":"331c7aeb-0ba9-437d-b1aa-df9880d3f53d","Type":"ContainerStarted","Data":"1c5faed70007af88626ce40e82dc1953d592c773e281c805638237b1256535ca"} Mar 18 10:17:02 crc kubenswrapper[4778]: I0318 10:17:02.536846 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8d89d" podStartSLOduration=3.118537568 podStartE2EDuration="6.536825984s" podCreationTimestamp="2026-03-18 10:16:56 +0000 UTC" firstStartedPulling="2026-03-18 10:16:58.477424192 +0000 UTC m=+4485.052169042" lastFinishedPulling="2026-03-18 10:17:01.895712618 +0000 UTC m=+4488.470457458" observedRunningTime="2026-03-18 10:17:02.536503746 +0000 UTC m=+4489.111248606" watchObservedRunningTime="2026-03-18 10:17:02.536825984 +0000 UTC m=+4489.111570824" Mar 18 10:17:06 crc kubenswrapper[4778]: I0318 10:17:06.941136 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:17:06 crc kubenswrapper[4778]: I0318 10:17:06.941691 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:17:07 crc kubenswrapper[4778]: I0318 10:17:07.001254 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:17:07 crc kubenswrapper[4778]: I0318 10:17:07.622036 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:17:07 crc kubenswrapper[4778]: I0318 10:17:07.669086 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8d89d"] Mar 18 10:17:09 crc kubenswrapper[4778]: I0318 10:17:09.581500 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8d89d" podUID="331c7aeb-0ba9-437d-b1aa-df9880d3f53d" containerName="registry-server" 
containerID="cri-o://1c5faed70007af88626ce40e82dc1953d592c773e281c805638237b1256535ca" gracePeriod=2 Mar 18 10:17:10 crc kubenswrapper[4778]: I0318 10:17:10.423395 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:17:10 crc kubenswrapper[4778]: I0318 10:17:10.423775 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:17:10 crc kubenswrapper[4778]: I0318 10:17:10.600696 4778 generic.go:334] "Generic (PLEG): container finished" podID="331c7aeb-0ba9-437d-b1aa-df9880d3f53d" containerID="1c5faed70007af88626ce40e82dc1953d592c773e281c805638237b1256535ca" exitCode=0 Mar 18 10:17:10 crc kubenswrapper[4778]: I0318 10:17:10.600744 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d89d" event={"ID":"331c7aeb-0ba9-437d-b1aa-df9880d3f53d","Type":"ContainerDied","Data":"1c5faed70007af88626ce40e82dc1953d592c773e281c805638237b1256535ca"} Mar 18 10:17:10 crc kubenswrapper[4778]: I0318 10:17:10.852103 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:17:10 crc kubenswrapper[4778]: I0318 10:17:10.972181 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-utilities\") pod \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\" (UID: \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\") " Mar 18 10:17:10 crc kubenswrapper[4778]: I0318 10:17:10.972501 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-catalog-content\") pod \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\" (UID: \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\") " Mar 18 10:17:10 crc kubenswrapper[4778]: I0318 10:17:10.972608 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfwxg\" (UniqueName: \"kubernetes.io/projected/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-kube-api-access-sfwxg\") pod \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\" (UID: \"331c7aeb-0ba9-437d-b1aa-df9880d3f53d\") " Mar 18 10:17:10 crc kubenswrapper[4778]: I0318 10:17:10.972714 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-utilities" (OuterVolumeSpecName: "utilities") pod "331c7aeb-0ba9-437d-b1aa-df9880d3f53d" (UID: "331c7aeb-0ba9-437d-b1aa-df9880d3f53d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:17:10 crc kubenswrapper[4778]: I0318 10:17:10.973126 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:10 crc kubenswrapper[4778]: I0318 10:17:10.980545 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-kube-api-access-sfwxg" (OuterVolumeSpecName: "kube-api-access-sfwxg") pod "331c7aeb-0ba9-437d-b1aa-df9880d3f53d" (UID: "331c7aeb-0ba9-437d-b1aa-df9880d3f53d"). InnerVolumeSpecName "kube-api-access-sfwxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:17:11 crc kubenswrapper[4778]: I0318 10:17:11.051659 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "331c7aeb-0ba9-437d-b1aa-df9880d3f53d" (UID: "331c7aeb-0ba9-437d-b1aa-df9880d3f53d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:17:11 crc kubenswrapper[4778]: I0318 10:17:11.075383 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:11 crc kubenswrapper[4778]: I0318 10:17:11.075425 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfwxg\" (UniqueName: \"kubernetes.io/projected/331c7aeb-0ba9-437d-b1aa-df9880d3f53d-kube-api-access-sfwxg\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:11 crc kubenswrapper[4778]: I0318 10:17:11.187453 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce" Mar 18 10:17:11 crc kubenswrapper[4778]: E0318 10:17:11.187706 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:17:11 crc kubenswrapper[4778]: I0318 10:17:11.494445 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gb2nl" podUID="be5e794e-a8d6-4d21-9456-03d0a7a34846" containerName="registry-server" probeResult="failure" output=< Mar 18 10:17:11 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 10:17:11 crc kubenswrapper[4778]: > Mar 18 10:17:11 crc kubenswrapper[4778]: I0318 10:17:11.609492 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d89d" 
event={"ID":"331c7aeb-0ba9-437d-b1aa-df9880d3f53d","Type":"ContainerDied","Data":"8676964db6e545f0e4da12fd8e93365fda854a811aa027e80da8717a823aad80"} Mar 18 10:17:11 crc kubenswrapper[4778]: I0318 10:17:11.609560 4778 scope.go:117] "RemoveContainer" containerID="1c5faed70007af88626ce40e82dc1953d592c773e281c805638237b1256535ca" Mar 18 10:17:11 crc kubenswrapper[4778]: I0318 10:17:11.610587 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8d89d" Mar 18 10:17:11 crc kubenswrapper[4778]: I0318 10:17:11.641551 4778 scope.go:117] "RemoveContainer" containerID="b174d9801b7aa8378e68651c33f14141cf83af4616ac0d44c4331c4c0f2958ac" Mar 18 10:17:11 crc kubenswrapper[4778]: I0318 10:17:11.643488 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8d89d"] Mar 18 10:17:11 crc kubenswrapper[4778]: I0318 10:17:11.653973 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8d89d"] Mar 18 10:17:12 crc kubenswrapper[4778]: I0318 10:17:12.018566 4778 scope.go:117] "RemoveContainer" containerID="16e14b18ce5d99d4bf62f995ec0986acb244f3c54b40176f880b970232b47c61" Mar 18 10:17:12 crc kubenswrapper[4778]: I0318 10:17:12.201949 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="331c7aeb-0ba9-437d-b1aa-df9880d3f53d" path="/var/lib/kubelet/pods/331c7aeb-0ba9-437d-b1aa-df9880d3f53d/volumes" Mar 18 10:17:20 crc kubenswrapper[4778]: I0318 10:17:20.481294 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:17:20 crc kubenswrapper[4778]: I0318 10:17:20.541609 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:17:20 crc kubenswrapper[4778]: I0318 10:17:20.996100 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-gb2nl"] Mar 18 10:17:21 crc kubenswrapper[4778]: I0318 10:17:21.691821 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gb2nl" podUID="be5e794e-a8d6-4d21-9456-03d0a7a34846" containerName="registry-server" containerID="cri-o://429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75" gracePeriod=2 Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.322689 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gb2nl" Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.408316 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be5e794e-a8d6-4d21-9456-03d0a7a34846-catalog-content\") pod \"be5e794e-a8d6-4d21-9456-03d0a7a34846\" (UID: \"be5e794e-a8d6-4d21-9456-03d0a7a34846\") " Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.408553 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hxtw\" (UniqueName: \"kubernetes.io/projected/be5e794e-a8d6-4d21-9456-03d0a7a34846-kube-api-access-5hxtw\") pod \"be5e794e-a8d6-4d21-9456-03d0a7a34846\" (UID: \"be5e794e-a8d6-4d21-9456-03d0a7a34846\") " Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.408599 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be5e794e-a8d6-4d21-9456-03d0a7a34846-utilities\") pod \"be5e794e-a8d6-4d21-9456-03d0a7a34846\" (UID: \"be5e794e-a8d6-4d21-9456-03d0a7a34846\") " Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.409798 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be5e794e-a8d6-4d21-9456-03d0a7a34846-utilities" (OuterVolumeSpecName: "utilities") pod "be5e794e-a8d6-4d21-9456-03d0a7a34846" (UID: 
"be5e794e-a8d6-4d21-9456-03d0a7a34846"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.511105 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be5e794e-a8d6-4d21-9456-03d0a7a34846-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.539500 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be5e794e-a8d6-4d21-9456-03d0a7a34846-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be5e794e-a8d6-4d21-9456-03d0a7a34846" (UID: "be5e794e-a8d6-4d21-9456-03d0a7a34846"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.613295 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be5e794e-a8d6-4d21-9456-03d0a7a34846-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.701339 4778 generic.go:334] "Generic (PLEG): container finished" podID="be5e794e-a8d6-4d21-9456-03d0a7a34846" containerID="429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75" exitCode=0 Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.701572 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gb2nl" event={"ID":"be5e794e-a8d6-4d21-9456-03d0a7a34846","Type":"ContainerDied","Data":"429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75"} Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.702286 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gb2nl" event={"ID":"be5e794e-a8d6-4d21-9456-03d0a7a34846","Type":"ContainerDied","Data":"129f20c68023ea235efa0219ab242d6804e3b67294ead1332540e57fdf205fd8"} Mar 
18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.702365 4778 scope.go:117] "RemoveContainer" containerID="429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75"
Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.701684 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gb2nl"
Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.727317 4778 scope.go:117] "RemoveContainer" containerID="1ef45fdbc141ce7edb428dc296bb95b2a43f745bec76022c5de1609b6cb5f2e3"
Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.904054 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be5e794e-a8d6-4d21-9456-03d0a7a34846-kube-api-access-5hxtw" (OuterVolumeSpecName: "kube-api-access-5hxtw") pod "be5e794e-a8d6-4d21-9456-03d0a7a34846" (UID: "be5e794e-a8d6-4d21-9456-03d0a7a34846"). InnerVolumeSpecName "kube-api-access-5hxtw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.919213 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hxtw\" (UniqueName: \"kubernetes.io/projected/be5e794e-a8d6-4d21-9456-03d0a7a34846-kube-api-access-5hxtw\") on node \"crc\" DevicePath \"\""
Mar 18 10:17:22 crc kubenswrapper[4778]: I0318 10:17:22.925576 4778 scope.go:117] "RemoveContainer" containerID="796cab91ff6fbb49e54726dd9952315382dd5cd8a3e3b4a9c4c1be8c6aeb4999"
Mar 18 10:17:23 crc kubenswrapper[4778]: I0318 10:17:23.091507 4778 scope.go:117] "RemoveContainer" containerID="429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75"
Mar 18 10:17:23 crc kubenswrapper[4778]: E0318 10:17:23.092051 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75\": container with ID starting with 429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75 not found: ID does not exist" containerID="429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75"
Mar 18 10:17:23 crc kubenswrapper[4778]: I0318 10:17:23.092102 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75"} err="failed to get container status \"429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75\": rpc error: code = NotFound desc = could not find container \"429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75\": container with ID starting with 429548a56cdeb5b04aacb9628e8d19c2acca9c80af66623dacb36b04defc5f75 not found: ID does not exist"
Mar 18 10:17:23 crc kubenswrapper[4778]: I0318 10:17:23.092138 4778 scope.go:117] "RemoveContainer" containerID="1ef45fdbc141ce7edb428dc296bb95b2a43f745bec76022c5de1609b6cb5f2e3"
Mar 18 10:17:23 crc kubenswrapper[4778]: E0318 10:17:23.092760 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ef45fdbc141ce7edb428dc296bb95b2a43f745bec76022c5de1609b6cb5f2e3\": container with ID starting with 1ef45fdbc141ce7edb428dc296bb95b2a43f745bec76022c5de1609b6cb5f2e3 not found: ID does not exist" containerID="1ef45fdbc141ce7edb428dc296bb95b2a43f745bec76022c5de1609b6cb5f2e3"
Mar 18 10:17:23 crc kubenswrapper[4778]: I0318 10:17:23.092871 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ef45fdbc141ce7edb428dc296bb95b2a43f745bec76022c5de1609b6cb5f2e3"} err="failed to get container status \"1ef45fdbc141ce7edb428dc296bb95b2a43f745bec76022c5de1609b6cb5f2e3\": rpc error: code = NotFound desc = could not find container \"1ef45fdbc141ce7edb428dc296bb95b2a43f745bec76022c5de1609b6cb5f2e3\": container with ID starting with 1ef45fdbc141ce7edb428dc296bb95b2a43f745bec76022c5de1609b6cb5f2e3 not found: ID does not exist"
Mar 18 10:17:23 crc kubenswrapper[4778]: I0318 10:17:23.092963 4778 scope.go:117] "RemoveContainer" containerID="796cab91ff6fbb49e54726dd9952315382dd5cd8a3e3b4a9c4c1be8c6aeb4999"
Mar 18 10:17:23 crc kubenswrapper[4778]: E0318 10:17:23.093858 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"796cab91ff6fbb49e54726dd9952315382dd5cd8a3e3b4a9c4c1be8c6aeb4999\": container with ID starting with 796cab91ff6fbb49e54726dd9952315382dd5cd8a3e3b4a9c4c1be8c6aeb4999 not found: ID does not exist" containerID="796cab91ff6fbb49e54726dd9952315382dd5cd8a3e3b4a9c4c1be8c6aeb4999"
Mar 18 10:17:23 crc kubenswrapper[4778]: I0318 10:17:23.093894 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"796cab91ff6fbb49e54726dd9952315382dd5cd8a3e3b4a9c4c1be8c6aeb4999"} err="failed to get container status \"796cab91ff6fbb49e54726dd9952315382dd5cd8a3e3b4a9c4c1be8c6aeb4999\": rpc error: code = NotFound desc = could not find container \"796cab91ff6fbb49e54726dd9952315382dd5cd8a3e3b4a9c4c1be8c6aeb4999\": container with ID starting with 796cab91ff6fbb49e54726dd9952315382dd5cd8a3e3b4a9c4c1be8c6aeb4999 not found: ID does not exist"
Mar 18 10:17:23 crc kubenswrapper[4778]: I0318 10:17:23.244359 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gb2nl"]
Mar 18 10:17:23 crc kubenswrapper[4778]: I0318 10:17:23.260474 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gb2nl"]
Mar 18 10:17:24 crc kubenswrapper[4778]: I0318 10:17:24.201043 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be5e794e-a8d6-4d21-9456-03d0a7a34846" path="/var/lib/kubelet/pods/be5e794e-a8d6-4d21-9456-03d0a7a34846/volumes"
Mar 18 10:17:25 crc kubenswrapper[4778]: I0318 10:17:25.187549 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:17:25 crc kubenswrapper[4778]: E0318 10:17:25.188285 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:17:40 crc kubenswrapper[4778]: I0318 10:17:40.187111 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:17:40 crc kubenswrapper[4778]: E0318 10:17:40.187956 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:17:52 crc kubenswrapper[4778]: I0318 10:17:52.187157 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:17:52 crc kubenswrapper[4778]: E0318 10:17:52.188088 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.154773 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563818-c776t"]
Mar 18 10:18:00 crc kubenswrapper[4778]: E0318 10:18:00.155609 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="331c7aeb-0ba9-437d-b1aa-df9880d3f53d" containerName="registry-server"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.155622 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="331c7aeb-0ba9-437d-b1aa-df9880d3f53d" containerName="registry-server"
Mar 18 10:18:00 crc kubenswrapper[4778]: E0318 10:18:00.155635 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5e794e-a8d6-4d21-9456-03d0a7a34846" containerName="registry-server"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.155641 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5e794e-a8d6-4d21-9456-03d0a7a34846" containerName="registry-server"
Mar 18 10:18:00 crc kubenswrapper[4778]: E0318 10:18:00.155651 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5e794e-a8d6-4d21-9456-03d0a7a34846" containerName="extract-content"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.155656 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5e794e-a8d6-4d21-9456-03d0a7a34846" containerName="extract-content"
Mar 18 10:18:00 crc kubenswrapper[4778]: E0318 10:18:00.155671 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="331c7aeb-0ba9-437d-b1aa-df9880d3f53d" containerName="extract-content"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.155677 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="331c7aeb-0ba9-437d-b1aa-df9880d3f53d" containerName="extract-content"
Mar 18 10:18:00 crc kubenswrapper[4778]: E0318 10:18:00.155690 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="331c7aeb-0ba9-437d-b1aa-df9880d3f53d" containerName="extract-utilities"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.155697 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="331c7aeb-0ba9-437d-b1aa-df9880d3f53d" containerName="extract-utilities"
Mar 18 10:18:00 crc kubenswrapper[4778]: E0318 10:18:00.155716 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5e794e-a8d6-4d21-9456-03d0a7a34846" containerName="extract-utilities"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.155721 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5e794e-a8d6-4d21-9456-03d0a7a34846" containerName="extract-utilities"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.155889 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="be5e794e-a8d6-4d21-9456-03d0a7a34846" containerName="registry-server"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.155908 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="331c7aeb-0ba9-437d-b1aa-df9880d3f53d" containerName="registry-server"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.156551 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563818-c776t"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.159747 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.159823 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.159894 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.172080 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563818-c776t"]
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.196840 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d28w8\" (UniqueName: \"kubernetes.io/projected/9b16b969-ac86-4725-910d-797cd1faedc9-kube-api-access-d28w8\") pod \"auto-csr-approver-29563818-c776t\" (UID: \"9b16b969-ac86-4725-910d-797cd1faedc9\") " pod="openshift-infra/auto-csr-approver-29563818-c776t"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.300910 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d28w8\" (UniqueName: \"kubernetes.io/projected/9b16b969-ac86-4725-910d-797cd1faedc9-kube-api-access-d28w8\") pod \"auto-csr-approver-29563818-c776t\" (UID: \"9b16b969-ac86-4725-910d-797cd1faedc9\") " pod="openshift-infra/auto-csr-approver-29563818-c776t"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.320446 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d28w8\" (UniqueName: \"kubernetes.io/projected/9b16b969-ac86-4725-910d-797cd1faedc9-kube-api-access-d28w8\") pod \"auto-csr-approver-29563818-c776t\" (UID: \"9b16b969-ac86-4725-910d-797cd1faedc9\") " pod="openshift-infra/auto-csr-approver-29563818-c776t"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.475745 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563818-c776t"
Mar 18 10:18:00 crc kubenswrapper[4778]: I0318 10:18:00.973398 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563818-c776t"]
Mar 18 10:18:01 crc kubenswrapper[4778]: W0318 10:18:01.012306 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b16b969_ac86_4725_910d_797cd1faedc9.slice/crio-ad523834ef8921b31b24422ee6b957140ac8d58bab55ffa29e223c1c2b5d82f1 WatchSource:0}: Error finding container ad523834ef8921b31b24422ee6b957140ac8d58bab55ffa29e223c1c2b5d82f1: Status 404 returned error can't find the container with id ad523834ef8921b31b24422ee6b957140ac8d58bab55ffa29e223c1c2b5d82f1
Mar 18 10:18:01 crc kubenswrapper[4778]: I0318 10:18:01.076857 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563818-c776t" event={"ID":"9b16b969-ac86-4725-910d-797cd1faedc9","Type":"ContainerStarted","Data":"ad523834ef8921b31b24422ee6b957140ac8d58bab55ffa29e223c1c2b5d82f1"}
Mar 18 10:18:03 crc kubenswrapper[4778]: I0318 10:18:03.101873 4778 generic.go:334] "Generic (PLEG): container finished" podID="9b16b969-ac86-4725-910d-797cd1faedc9" containerID="ecf9caf224b383664e625024d70d285619a974017bb319f2a60a8627b5e0d68b" exitCode=0
Mar 18 10:18:03 crc kubenswrapper[4778]: I0318 10:18:03.101963 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563818-c776t" event={"ID":"9b16b969-ac86-4725-910d-797cd1faedc9","Type":"ContainerDied","Data":"ecf9caf224b383664e625024d70d285619a974017bb319f2a60a8627b5e0d68b"}
Mar 18 10:18:04 crc kubenswrapper[4778]: I0318 10:18:04.795054 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563818-c776t"
Mar 18 10:18:04 crc kubenswrapper[4778]: I0318 10:18:04.901752 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d28w8\" (UniqueName: \"kubernetes.io/projected/9b16b969-ac86-4725-910d-797cd1faedc9-kube-api-access-d28w8\") pod \"9b16b969-ac86-4725-910d-797cd1faedc9\" (UID: \"9b16b969-ac86-4725-910d-797cd1faedc9\") "
Mar 18 10:18:04 crc kubenswrapper[4778]: I0318 10:18:04.907603 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b16b969-ac86-4725-910d-797cd1faedc9-kube-api-access-d28w8" (OuterVolumeSpecName: "kube-api-access-d28w8") pod "9b16b969-ac86-4725-910d-797cd1faedc9" (UID: "9b16b969-ac86-4725-910d-797cd1faedc9"). InnerVolumeSpecName "kube-api-access-d28w8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:18:05 crc kubenswrapper[4778]: I0318 10:18:05.003930 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d28w8\" (UniqueName: \"kubernetes.io/projected/9b16b969-ac86-4725-910d-797cd1faedc9-kube-api-access-d28w8\") on node \"crc\" DevicePath \"\""
Mar 18 10:18:05 crc kubenswrapper[4778]: I0318 10:18:05.121137 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563818-c776t" event={"ID":"9b16b969-ac86-4725-910d-797cd1faedc9","Type":"ContainerDied","Data":"ad523834ef8921b31b24422ee6b957140ac8d58bab55ffa29e223c1c2b5d82f1"}
Mar 18 10:18:05 crc kubenswrapper[4778]: I0318 10:18:05.121617 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad523834ef8921b31b24422ee6b957140ac8d58bab55ffa29e223c1c2b5d82f1"
Mar 18 10:18:05 crc kubenswrapper[4778]: I0318 10:18:05.121393 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563818-c776t"
Mar 18 10:18:05 crc kubenswrapper[4778]: I0318 10:18:05.864529 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563812-2gvxv"]
Mar 18 10:18:05 crc kubenswrapper[4778]: I0318 10:18:05.871222 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563812-2gvxv"]
Mar 18 10:18:06 crc kubenswrapper[4778]: I0318 10:18:06.197975 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88f0d7a0-c27a-48d5-90f6-e7d7de946731" path="/var/lib/kubelet/pods/88f0d7a0-c27a-48d5-90f6-e7d7de946731/volumes"
Mar 18 10:18:07 crc kubenswrapper[4778]: I0318 10:18:07.188591 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:18:07 crc kubenswrapper[4778]: E0318 10:18:07.188967 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:18:20 crc kubenswrapper[4778]: I0318 10:18:20.187094 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:18:20 crc kubenswrapper[4778]: E0318 10:18:20.187998 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:18:31 crc kubenswrapper[4778]: I0318 10:18:31.186966 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:18:31 crc kubenswrapper[4778]: E0318 10:18:31.187817 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:18:34 crc kubenswrapper[4778]: I0318 10:18:34.939909 4778 scope.go:117] "RemoveContainer" containerID="4d2240707a2956bb8da6399edaf60b6df2ea8a136aea3c9f29e332c118bf9bc2"
Mar 18 10:18:45 crc kubenswrapper[4778]: I0318 10:18:45.186985 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:18:45 crc kubenswrapper[4778]: E0318 10:18:45.187754 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:19:00 crc kubenswrapper[4778]: I0318 10:19:00.188225 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:19:00 crc kubenswrapper[4778]: E0318 10:19:00.189345 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:19:12 crc kubenswrapper[4778]: I0318 10:19:12.187090 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:19:12 crc kubenswrapper[4778]: E0318 10:19:12.187948 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:19:23 crc kubenswrapper[4778]: I0318 10:19:23.188213 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:19:23 crc kubenswrapper[4778]: E0318 10:19:23.189013 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:19:36 crc kubenswrapper[4778]: I0318 10:19:36.187032 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:19:36 crc kubenswrapper[4778]: E0318 10:19:36.188020 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:19:48 crc kubenswrapper[4778]: I0318 10:19:48.187328 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:19:48 crc kubenswrapper[4778]: E0318 10:19:48.188046 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.144906 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563820-xjqs9"]
Mar 18 10:20:00 crc kubenswrapper[4778]: E0318 10:20:00.146370 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b16b969-ac86-4725-910d-797cd1faedc9" containerName="oc"
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.146392 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b16b969-ac86-4725-910d-797cd1faedc9" containerName="oc"
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.146610 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b16b969-ac86-4725-910d-797cd1faedc9" containerName="oc"
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.147479 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563820-xjqs9"
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.150298 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.150372 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6"
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.151889 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.157119 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563820-xjqs9"]
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.304115 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8km7c\" (UniqueName: \"kubernetes.io/projected/0b4b190c-f80e-4256-9025-f04279c3b3db-kube-api-access-8km7c\") pod \"auto-csr-approver-29563820-xjqs9\" (UID: \"0b4b190c-f80e-4256-9025-f04279c3b3db\") " pod="openshift-infra/auto-csr-approver-29563820-xjqs9"
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.406403 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8km7c\" (UniqueName: \"kubernetes.io/projected/0b4b190c-f80e-4256-9025-f04279c3b3db-kube-api-access-8km7c\") pod \"auto-csr-approver-29563820-xjqs9\" (UID: \"0b4b190c-f80e-4256-9025-f04279c3b3db\") " pod="openshift-infra/auto-csr-approver-29563820-xjqs9"
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.427902 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8km7c\" (UniqueName: \"kubernetes.io/projected/0b4b190c-f80e-4256-9025-f04279c3b3db-kube-api-access-8km7c\") pod \"auto-csr-approver-29563820-xjqs9\" (UID: \"0b4b190c-f80e-4256-9025-f04279c3b3db\") " pod="openshift-infra/auto-csr-approver-29563820-xjqs9"
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.466023 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563820-xjqs9"
Mar 18 10:20:00 crc kubenswrapper[4778]: I0318 10:20:00.926735 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563820-xjqs9"]
Mar 18 10:20:01 crc kubenswrapper[4778]: I0318 10:20:01.177322 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563820-xjqs9" event={"ID":"0b4b190c-f80e-4256-9025-f04279c3b3db","Type":"ContainerStarted","Data":"0ebb7dc41081892e638316fbe1956fb7acd1ee464e83067dd5b224734c60d953"}
Mar 18 10:20:02 crc kubenswrapper[4778]: I0318 10:20:02.185521 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563820-xjqs9" event={"ID":"0b4b190c-f80e-4256-9025-f04279c3b3db","Type":"ContainerStarted","Data":"f18fd314b43b902914cecf41007b7ceab63380321a7a2e1eeb69b1d4c6a07a98"}
Mar 18 10:20:02 crc kubenswrapper[4778]: I0318 10:20:02.220937 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563820-xjqs9" podStartSLOduration=1.331802742 podStartE2EDuration="2.220913156s" podCreationTimestamp="2026-03-18 10:20:00 +0000 UTC" firstStartedPulling="2026-03-18 10:20:00.934816646 +0000 UTC m=+4667.509561486" lastFinishedPulling="2026-03-18 10:20:01.82392706 +0000 UTC m=+4668.398671900" observedRunningTime="2026-03-18 10:20:02.20594245 +0000 UTC m=+4668.780687310" watchObservedRunningTime="2026-03-18 10:20:02.220913156 +0000 UTC m=+4668.795657996"
Mar 18 10:20:03 crc kubenswrapper[4778]: I0318 10:20:03.187116 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:20:03 crc kubenswrapper[4778]: E0318 10:20:03.187806 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:20:03 crc kubenswrapper[4778]: I0318 10:20:03.214187 4778 generic.go:334] "Generic (PLEG): container finished" podID="0b4b190c-f80e-4256-9025-f04279c3b3db" containerID="f18fd314b43b902914cecf41007b7ceab63380321a7a2e1eeb69b1d4c6a07a98" exitCode=0
Mar 18 10:20:03 crc kubenswrapper[4778]: I0318 10:20:03.214291 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563820-xjqs9" event={"ID":"0b4b190c-f80e-4256-9025-f04279c3b3db","Type":"ContainerDied","Data":"f18fd314b43b902914cecf41007b7ceab63380321a7a2e1eeb69b1d4c6a07a98"}
Mar 18 10:20:04 crc kubenswrapper[4778]: I0318 10:20:04.787441 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563820-xjqs9"
Mar 18 10:20:04 crc kubenswrapper[4778]: I0318 10:20:04.895990 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8km7c\" (UniqueName: \"kubernetes.io/projected/0b4b190c-f80e-4256-9025-f04279c3b3db-kube-api-access-8km7c\") pod \"0b4b190c-f80e-4256-9025-f04279c3b3db\" (UID: \"0b4b190c-f80e-4256-9025-f04279c3b3db\") "
Mar 18 10:20:04 crc kubenswrapper[4778]: I0318 10:20:04.909615 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b4b190c-f80e-4256-9025-f04279c3b3db-kube-api-access-8km7c" (OuterVolumeSpecName: "kube-api-access-8km7c") pod "0b4b190c-f80e-4256-9025-f04279c3b3db" (UID: "0b4b190c-f80e-4256-9025-f04279c3b3db"). InnerVolumeSpecName "kube-api-access-8km7c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:20:04 crc kubenswrapper[4778]: I0318 10:20:04.998664 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8km7c\" (UniqueName: \"kubernetes.io/projected/0b4b190c-f80e-4256-9025-f04279c3b3db-kube-api-access-8km7c\") on node \"crc\" DevicePath \"\""
Mar 18 10:20:05 crc kubenswrapper[4778]: I0318 10:20:05.231783 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563820-xjqs9" event={"ID":"0b4b190c-f80e-4256-9025-f04279c3b3db","Type":"ContainerDied","Data":"0ebb7dc41081892e638316fbe1956fb7acd1ee464e83067dd5b224734c60d953"}
Mar 18 10:20:05 crc kubenswrapper[4778]: I0318 10:20:05.232086 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ebb7dc41081892e638316fbe1956fb7acd1ee464e83067dd5b224734c60d953"
Mar 18 10:20:05 crc kubenswrapper[4778]: I0318 10:20:05.232153 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563820-xjqs9"
Mar 18 10:20:05 crc kubenswrapper[4778]: I0318 10:20:05.283125 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563814-9vhtv"]
Mar 18 10:20:05 crc kubenswrapper[4778]: I0318 10:20:05.292300 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563814-9vhtv"]
Mar 18 10:20:06 crc kubenswrapper[4778]: I0318 10:20:06.196809 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9" path="/var/lib/kubelet/pods/96806f8b-87e9-4d93-b8a2-6f0ccd0ed0b9/volumes"
Mar 18 10:20:17 crc kubenswrapper[4778]: I0318 10:20:17.187381 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:20:17 crc kubenswrapper[4778]: E0318 10:20:17.190417 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:20:32 crc kubenswrapper[4778]: I0318 10:20:32.187829 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:20:32 crc kubenswrapper[4778]: E0318 10:20:32.188489 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:20:35 crc kubenswrapper[4778]: I0318 10:20:35.024650 4778 scope.go:117] "RemoveContainer" containerID="75cf773e9ab812c93a6cb361a559a8ad1ac8bf63ad2f27eb51c3d0a96daa619d"
Mar 18 10:20:43 crc kubenswrapper[4778]: I0318 10:20:43.187161 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:20:43 crc kubenswrapper[4778]: E0318 10:20:43.187945 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:20:54 crc kubenswrapper[4778]: I0318 10:20:54.194752 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:20:54 crc kubenswrapper[4778]: E0318 10:20:54.196153 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:21:07 crc kubenswrapper[4778]: I0318 10:21:07.187752 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce"
Mar 18 10:21:07 crc kubenswrapper[4778]: I0318 10:21:07.785548 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"4b8f66334209f29f6c78b08a10e261ea310c6e7d034012d49ee4adbb016d4215"}
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.076937 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-98lf9"]
Mar 18 10:21:51 crc kubenswrapper[4778]: E0318 10:21:51.078008 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b4b190c-f80e-4256-9025-f04279c3b3db" containerName="oc"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.078027 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b4b190c-f80e-4256-9025-f04279c3b3db" containerName="oc"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.078308 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b4b190c-f80e-4256-9025-f04279c3b3db" containerName="oc"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.079986 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-98lf9"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.110458 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-98lf9"]
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.148448 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259d9d60-84b8-48a1-844f-734126616467-catalog-content\") pod \"community-operators-98lf9\" (UID: \"259d9d60-84b8-48a1-844f-734126616467\") " pod="openshift-marketplace/community-operators-98lf9"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.148509 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/259d9d60-84b8-48a1-844f-734126616467-utilities\") pod \"community-operators-98lf9\" (UID: \"259d9d60-84b8-48a1-844f-734126616467\") " pod="openshift-marketplace/community-operators-98lf9"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.148532 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrxdj\" (UniqueName: \"kubernetes.io/projected/259d9d60-84b8-48a1-844f-734126616467-kube-api-access-zrxdj\") pod \"community-operators-98lf9\" (UID: \"259d9d60-84b8-48a1-844f-734126616467\") " pod="openshift-marketplace/community-operators-98lf9"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.250624 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259d9d60-84b8-48a1-844f-734126616467-catalog-content\") pod \"community-operators-98lf9\" (UID: \"259d9d60-84b8-48a1-844f-734126616467\") " pod="openshift-marketplace/community-operators-98lf9"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.250664 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/259d9d60-84b8-48a1-844f-734126616467-utilities\") pod \"community-operators-98lf9\" (UID: \"259d9d60-84b8-48a1-844f-734126616467\") " pod="openshift-marketplace/community-operators-98lf9"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.250683 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrxdj\" (UniqueName: \"kubernetes.io/projected/259d9d60-84b8-48a1-844f-734126616467-kube-api-access-zrxdj\") pod \"community-operators-98lf9\" (UID: \"259d9d60-84b8-48a1-844f-734126616467\") " pod="openshift-marketplace/community-operators-98lf9"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.251267 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259d9d60-84b8-48a1-844f-734126616467-catalog-content\") pod \"community-operators-98lf9\" (UID: \"259d9d60-84b8-48a1-844f-734126616467\") " pod="openshift-marketplace/community-operators-98lf9"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.251417 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/259d9d60-84b8-48a1-844f-734126616467-utilities\") pod \"community-operators-98lf9\" (UID: \"259d9d60-84b8-48a1-844f-734126616467\") " pod="openshift-marketplace/community-operators-98lf9"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.275107 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrxdj\" (UniqueName: \"kubernetes.io/projected/259d9d60-84b8-48a1-844f-734126616467-kube-api-access-zrxdj\") pod \"community-operators-98lf9\" (UID: \"259d9d60-84b8-48a1-844f-734126616467\") " pod="openshift-marketplace/community-operators-98lf9"
Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.422941 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-98lf9" Mar 18 10:21:51 crc kubenswrapper[4778]: I0318 10:21:51.939875 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-98lf9"] Mar 18 10:21:52 crc kubenswrapper[4778]: I0318 10:21:52.172729 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98lf9" event={"ID":"259d9d60-84b8-48a1-844f-734126616467","Type":"ContainerStarted","Data":"8cf7543ce953166510227437e171b6069e72c36fe482f6339a6ef42d3b4c6e87"} Mar 18 10:21:52 crc kubenswrapper[4778]: I0318 10:21:52.172793 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98lf9" event={"ID":"259d9d60-84b8-48a1-844f-734126616467","Type":"ContainerStarted","Data":"0f2d7eade11a0d10668484020d0ce0c1eaf35772fb28ce8aca1004ed1ac02bb6"} Mar 18 10:21:53 crc kubenswrapper[4778]: I0318 10:21:53.181723 4778 generic.go:334] "Generic (PLEG): container finished" podID="259d9d60-84b8-48a1-844f-734126616467" containerID="8cf7543ce953166510227437e171b6069e72c36fe482f6339a6ef42d3b4c6e87" exitCode=0 Mar 18 10:21:53 crc kubenswrapper[4778]: I0318 10:21:53.181817 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98lf9" event={"ID":"259d9d60-84b8-48a1-844f-734126616467","Type":"ContainerDied","Data":"8cf7543ce953166510227437e171b6069e72c36fe482f6339a6ef42d3b4c6e87"} Mar 18 10:21:53 crc kubenswrapper[4778]: I0318 10:21:53.184954 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 10:21:55 crc kubenswrapper[4778]: I0318 10:21:55.203223 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98lf9" event={"ID":"259d9d60-84b8-48a1-844f-734126616467","Type":"ContainerStarted","Data":"d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436"} Mar 18 10:21:56 crc 
kubenswrapper[4778]: E0318 10:21:56.656594 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod259d9d60_84b8_48a1_844f_734126616467.slice/crio-d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod259d9d60_84b8_48a1_844f_734126616467.slice/crio-conmon-d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436.scope\": RecentStats: unable to find data in memory cache]" Mar 18 10:21:57 crc kubenswrapper[4778]: I0318 10:21:57.221795 4778 generic.go:334] "Generic (PLEG): container finished" podID="259d9d60-84b8-48a1-844f-734126616467" containerID="d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436" exitCode=0 Mar 18 10:21:57 crc kubenswrapper[4778]: I0318 10:21:57.222065 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98lf9" event={"ID":"259d9d60-84b8-48a1-844f-734126616467","Type":"ContainerDied","Data":"d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436"} Mar 18 10:21:58 crc kubenswrapper[4778]: I0318 10:21:58.232592 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98lf9" event={"ID":"259d9d60-84b8-48a1-844f-734126616467","Type":"ContainerStarted","Data":"0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b"} Mar 18 10:22:00 crc kubenswrapper[4778]: I0318 10:22:00.149289 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-98lf9" podStartSLOduration=4.723572277 podStartE2EDuration="9.149271344s" podCreationTimestamp="2026-03-18 10:21:51 +0000 UTC" firstStartedPulling="2026-03-18 10:21:53.184741534 +0000 UTC m=+4779.759486374" lastFinishedPulling="2026-03-18 10:21:57.610440591 +0000 UTC 
m=+4784.185185441" observedRunningTime="2026-03-18 10:21:58.24943365 +0000 UTC m=+4784.824178510" watchObservedRunningTime="2026-03-18 10:22:00.149271344 +0000 UTC m=+4786.724016184" Mar 18 10:22:00 crc kubenswrapper[4778]: I0318 10:22:00.156226 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563822-hg65s"] Mar 18 10:22:00 crc kubenswrapper[4778]: I0318 10:22:00.157832 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563822-hg65s" Mar 18 10:22:00 crc kubenswrapper[4778]: I0318 10:22:00.161692 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:22:00 crc kubenswrapper[4778]: I0318 10:22:00.162096 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:22:00 crc kubenswrapper[4778]: I0318 10:22:00.162276 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:22:00 crc kubenswrapper[4778]: I0318 10:22:00.165149 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563822-hg65s"] Mar 18 10:22:00 crc kubenswrapper[4778]: I0318 10:22:00.234742 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfspn\" (UniqueName: \"kubernetes.io/projected/b71bd8a1-ed53-4e72-8316-7bf3774ee1d8-kube-api-access-wfspn\") pod \"auto-csr-approver-29563822-hg65s\" (UID: \"b71bd8a1-ed53-4e72-8316-7bf3774ee1d8\") " pod="openshift-infra/auto-csr-approver-29563822-hg65s" Mar 18 10:22:00 crc kubenswrapper[4778]: I0318 10:22:00.336014 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfspn\" (UniqueName: \"kubernetes.io/projected/b71bd8a1-ed53-4e72-8316-7bf3774ee1d8-kube-api-access-wfspn\") pod \"auto-csr-approver-29563822-hg65s\" 
(UID: \"b71bd8a1-ed53-4e72-8316-7bf3774ee1d8\") " pod="openshift-infra/auto-csr-approver-29563822-hg65s" Mar 18 10:22:00 crc kubenswrapper[4778]: I0318 10:22:00.355869 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfspn\" (UniqueName: \"kubernetes.io/projected/b71bd8a1-ed53-4e72-8316-7bf3774ee1d8-kube-api-access-wfspn\") pod \"auto-csr-approver-29563822-hg65s\" (UID: \"b71bd8a1-ed53-4e72-8316-7bf3774ee1d8\") " pod="openshift-infra/auto-csr-approver-29563822-hg65s" Mar 18 10:22:00 crc kubenswrapper[4778]: I0318 10:22:00.481831 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563822-hg65s" Mar 18 10:22:00 crc kubenswrapper[4778]: I0318 10:22:00.931928 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563822-hg65s"] Mar 18 10:22:01 crc kubenswrapper[4778]: I0318 10:22:01.260020 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563822-hg65s" event={"ID":"b71bd8a1-ed53-4e72-8316-7bf3774ee1d8","Type":"ContainerStarted","Data":"322fe25189f2d4f140f8c79b8a68cba0920aa3555453f9a2af1fd68c796840fa"} Mar 18 10:22:01 crc kubenswrapper[4778]: I0318 10:22:01.424386 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-98lf9" Mar 18 10:22:01 crc kubenswrapper[4778]: I0318 10:22:01.424463 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-98lf9" Mar 18 10:22:01 crc kubenswrapper[4778]: I0318 10:22:01.474657 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-98lf9" Mar 18 10:22:02 crc kubenswrapper[4778]: I0318 10:22:02.323651 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-98lf9" Mar 18 10:22:02 crc 
kubenswrapper[4778]: I0318 10:22:02.384309 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-98lf9"] Mar 18 10:22:03 crc kubenswrapper[4778]: I0318 10:22:03.281920 4778 generic.go:334] "Generic (PLEG): container finished" podID="b71bd8a1-ed53-4e72-8316-7bf3774ee1d8" containerID="23fd2e1aeae3db5eb1358b08c80f32e6a7b3195affe83ad0ba4169976ea65d12" exitCode=0 Mar 18 10:22:03 crc kubenswrapper[4778]: I0318 10:22:03.282000 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563822-hg65s" event={"ID":"b71bd8a1-ed53-4e72-8316-7bf3774ee1d8","Type":"ContainerDied","Data":"23fd2e1aeae3db5eb1358b08c80f32e6a7b3195affe83ad0ba4169976ea65d12"} Mar 18 10:22:04 crc kubenswrapper[4778]: I0318 10:22:04.308330 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-98lf9" podUID="259d9d60-84b8-48a1-844f-734126616467" containerName="registry-server" containerID="cri-o://0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b" gracePeriod=2 Mar 18 10:22:04 crc kubenswrapper[4778]: I0318 10:22:04.819715 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563822-hg65s" Mar 18 10:22:04 crc kubenswrapper[4778]: I0318 10:22:04.934342 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfspn\" (UniqueName: \"kubernetes.io/projected/b71bd8a1-ed53-4e72-8316-7bf3774ee1d8-kube-api-access-wfspn\") pod \"b71bd8a1-ed53-4e72-8316-7bf3774ee1d8\" (UID: \"b71bd8a1-ed53-4e72-8316-7bf3774ee1d8\") " Mar 18 10:22:04 crc kubenswrapper[4778]: I0318 10:22:04.988937 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b71bd8a1-ed53-4e72-8316-7bf3774ee1d8-kube-api-access-wfspn" (OuterVolumeSpecName: "kube-api-access-wfspn") pod "b71bd8a1-ed53-4e72-8316-7bf3774ee1d8" (UID: "b71bd8a1-ed53-4e72-8316-7bf3774ee1d8"). InnerVolumeSpecName "kube-api-access-wfspn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.036636 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfspn\" (UniqueName: \"kubernetes.io/projected/b71bd8a1-ed53-4e72-8316-7bf3774ee1d8-kube-api-access-wfspn\") on node \"crc\" DevicePath \"\"" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.104013 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-98lf9" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.241046 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrxdj\" (UniqueName: \"kubernetes.io/projected/259d9d60-84b8-48a1-844f-734126616467-kube-api-access-zrxdj\") pod \"259d9d60-84b8-48a1-844f-734126616467\" (UID: \"259d9d60-84b8-48a1-844f-734126616467\") " Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.241409 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/259d9d60-84b8-48a1-844f-734126616467-utilities\") pod \"259d9d60-84b8-48a1-844f-734126616467\" (UID: \"259d9d60-84b8-48a1-844f-734126616467\") " Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.241531 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259d9d60-84b8-48a1-844f-734126616467-catalog-content\") pod \"259d9d60-84b8-48a1-844f-734126616467\" (UID: \"259d9d60-84b8-48a1-844f-734126616467\") " Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.242163 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/259d9d60-84b8-48a1-844f-734126616467-utilities" (OuterVolumeSpecName: "utilities") pod "259d9d60-84b8-48a1-844f-734126616467" (UID: "259d9d60-84b8-48a1-844f-734126616467"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.245649 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/259d9d60-84b8-48a1-844f-734126616467-kube-api-access-zrxdj" (OuterVolumeSpecName: "kube-api-access-zrxdj") pod "259d9d60-84b8-48a1-844f-734126616467" (UID: "259d9d60-84b8-48a1-844f-734126616467"). InnerVolumeSpecName "kube-api-access-zrxdj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.287281 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/259d9d60-84b8-48a1-844f-734126616467-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "259d9d60-84b8-48a1-844f-734126616467" (UID: "259d9d60-84b8-48a1-844f-734126616467"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.318902 4778 generic.go:334] "Generic (PLEG): container finished" podID="259d9d60-84b8-48a1-844f-734126616467" containerID="0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b" exitCode=0 Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.318972 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98lf9" event={"ID":"259d9d60-84b8-48a1-844f-734126616467","Type":"ContainerDied","Data":"0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b"} Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.319007 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98lf9" event={"ID":"259d9d60-84b8-48a1-844f-734126616467","Type":"ContainerDied","Data":"0f2d7eade11a0d10668484020d0ce0c1eaf35772fb28ce8aca1004ed1ac02bb6"} Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.319032 4778 scope.go:117] "RemoveContainer" containerID="0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.319179 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-98lf9" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.327730 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563822-hg65s" event={"ID":"b71bd8a1-ed53-4e72-8316-7bf3774ee1d8","Type":"ContainerDied","Data":"322fe25189f2d4f140f8c79b8a68cba0920aa3555453f9a2af1fd68c796840fa"} Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.327864 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="322fe25189f2d4f140f8c79b8a68cba0920aa3555453f9a2af1fd68c796840fa" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.327932 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563822-hg65s" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.344746 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrxdj\" (UniqueName: \"kubernetes.io/projected/259d9d60-84b8-48a1-844f-734126616467-kube-api-access-zrxdj\") on node \"crc\" DevicePath \"\"" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.344776 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/259d9d60-84b8-48a1-844f-734126616467-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.344785 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259d9d60-84b8-48a1-844f-734126616467-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.356224 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-98lf9"] Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.357968 4778 scope.go:117] "RemoveContainer" containerID="d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436" Mar 18 10:22:05 
crc kubenswrapper[4778]: I0318 10:22:05.368107 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-98lf9"] Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.381753 4778 scope.go:117] "RemoveContainer" containerID="8cf7543ce953166510227437e171b6069e72c36fe482f6339a6ef42d3b4c6e87" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.399908 4778 scope.go:117] "RemoveContainer" containerID="0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b" Mar 18 10:22:05 crc kubenswrapper[4778]: E0318 10:22:05.400358 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b\": container with ID starting with 0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b not found: ID does not exist" containerID="0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.400406 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b"} err="failed to get container status \"0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b\": rpc error: code = NotFound desc = could not find container \"0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b\": container with ID starting with 0482ce3c9c6eea5bc28c8bee4084b349125ae2f7324c0b66d3f610b25797257b not found: ID does not exist" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.400428 4778 scope.go:117] "RemoveContainer" containerID="d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436" Mar 18 10:22:05 crc kubenswrapper[4778]: E0318 10:22:05.400790 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436\": container with ID starting with d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436 not found: ID does not exist" containerID="d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.400845 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436"} err="failed to get container status \"d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436\": rpc error: code = NotFound desc = could not find container \"d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436\": container with ID starting with d0b8e0ab8e502febd864173329ba03f028865f9fb7442f6129f2ccc6b6334436 not found: ID does not exist" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.400860 4778 scope.go:117] "RemoveContainer" containerID="8cf7543ce953166510227437e171b6069e72c36fe482f6339a6ef42d3b4c6e87" Mar 18 10:22:05 crc kubenswrapper[4778]: E0318 10:22:05.401098 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cf7543ce953166510227437e171b6069e72c36fe482f6339a6ef42d3b4c6e87\": container with ID starting with 8cf7543ce953166510227437e171b6069e72c36fe482f6339a6ef42d3b4c6e87 not found: ID does not exist" containerID="8cf7543ce953166510227437e171b6069e72c36fe482f6339a6ef42d3b4c6e87" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.401145 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cf7543ce953166510227437e171b6069e72c36fe482f6339a6ef42d3b4c6e87"} err="failed to get container status \"8cf7543ce953166510227437e171b6069e72c36fe482f6339a6ef42d3b4c6e87\": rpc error: code = NotFound desc = could not find container \"8cf7543ce953166510227437e171b6069e72c36fe482f6339a6ef42d3b4c6e87\": container with ID 
starting with 8cf7543ce953166510227437e171b6069e72c36fe482f6339a6ef42d3b4c6e87 not found: ID does not exist" Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.897738 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563816-ljgbz"] Mar 18 10:22:05 crc kubenswrapper[4778]: I0318 10:22:05.911775 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563816-ljgbz"] Mar 18 10:22:06 crc kubenswrapper[4778]: I0318 10:22:06.198957 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="259d9d60-84b8-48a1-844f-734126616467" path="/var/lib/kubelet/pods/259d9d60-84b8-48a1-844f-734126616467/volumes" Mar 18 10:22:06 crc kubenswrapper[4778]: I0318 10:22:06.199937 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62302aba-bf34-4318-9599-2752789a925f" path="/var/lib/kubelet/pods/62302aba-bf34-4318-9599-2752789a925f/volumes" Mar 18 10:22:35 crc kubenswrapper[4778]: I0318 10:22:35.115758 4778 scope.go:117] "RemoveContainer" containerID="86539332f3b2bee69c9852d9c08bf1b20f84cb5d7d5b3975360dc3cdaf5134cb" Mar 18 10:23:30 crc kubenswrapper[4778]: I0318 10:23:30.147340 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:23:30 crc kubenswrapper[4778]: I0318 10:23:30.147953 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.142756 4778 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29563824-cgc65"] Mar 18 10:24:00 crc kubenswrapper[4778]: E0318 10:24:00.143963 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="259d9d60-84b8-48a1-844f-734126616467" containerName="extract-content" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.143986 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="259d9d60-84b8-48a1-844f-734126616467" containerName="extract-content" Mar 18 10:24:00 crc kubenswrapper[4778]: E0318 10:24:00.143999 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="259d9d60-84b8-48a1-844f-734126616467" containerName="registry-server" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.144007 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="259d9d60-84b8-48a1-844f-734126616467" containerName="registry-server" Mar 18 10:24:00 crc kubenswrapper[4778]: E0318 10:24:00.144035 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b71bd8a1-ed53-4e72-8316-7bf3774ee1d8" containerName="oc" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.144044 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b71bd8a1-ed53-4e72-8316-7bf3774ee1d8" containerName="oc" Mar 18 10:24:00 crc kubenswrapper[4778]: E0318 10:24:00.144075 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="259d9d60-84b8-48a1-844f-734126616467" containerName="extract-utilities" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.144084 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="259d9d60-84b8-48a1-844f-734126616467" containerName="extract-utilities" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.144381 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="259d9d60-84b8-48a1-844f-734126616467" containerName="registry-server" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.144411 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b71bd8a1-ed53-4e72-8316-7bf3774ee1d8" 
containerName="oc" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.145529 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563824-cgc65" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.147575 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.147635 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.148005 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.148821 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.149155 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.156044 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563824-cgc65"] Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.296835 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74rgk\" (UniqueName: \"kubernetes.io/projected/a10aa4ea-573d-4956-953f-4bdef827448d-kube-api-access-74rgk\") pod \"auto-csr-approver-29563824-cgc65\" 
(UID: \"a10aa4ea-573d-4956-953f-4bdef827448d\") " pod="openshift-infra/auto-csr-approver-29563824-cgc65" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.399137 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74rgk\" (UniqueName: \"kubernetes.io/projected/a10aa4ea-573d-4956-953f-4bdef827448d-kube-api-access-74rgk\") pod \"auto-csr-approver-29563824-cgc65\" (UID: \"a10aa4ea-573d-4956-953f-4bdef827448d\") " pod="openshift-infra/auto-csr-approver-29563824-cgc65" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.432673 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74rgk\" (UniqueName: \"kubernetes.io/projected/a10aa4ea-573d-4956-953f-4bdef827448d-kube-api-access-74rgk\") pod \"auto-csr-approver-29563824-cgc65\" (UID: \"a10aa4ea-573d-4956-953f-4bdef827448d\") " pod="openshift-infra/auto-csr-approver-29563824-cgc65" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.465463 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563824-cgc65" Mar 18 10:24:00 crc kubenswrapper[4778]: I0318 10:24:00.976365 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563824-cgc65"] Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.036183 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5h69g"] Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.038665 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.048481 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5h69g"] Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.214319 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ac74550-1228-4f02-a1fc-9816bc63eb22-catalog-content\") pod \"redhat-marketplace-5h69g\" (UID: \"2ac74550-1228-4f02-a1fc-9816bc63eb22\") " pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.214698 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ac74550-1228-4f02-a1fc-9816bc63eb22-utilities\") pod \"redhat-marketplace-5h69g\" (UID: \"2ac74550-1228-4f02-a1fc-9816bc63eb22\") " pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.214755 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74tfk\" (UniqueName: \"kubernetes.io/projected/2ac74550-1228-4f02-a1fc-9816bc63eb22-kube-api-access-74tfk\") pod \"redhat-marketplace-5h69g\" (UID: \"2ac74550-1228-4f02-a1fc-9816bc63eb22\") " pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.316337 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ac74550-1228-4f02-a1fc-9816bc63eb22-utilities\") pod \"redhat-marketplace-5h69g\" (UID: \"2ac74550-1228-4f02-a1fc-9816bc63eb22\") " pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.316381 4778 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-74tfk\" (UniqueName: \"kubernetes.io/projected/2ac74550-1228-4f02-a1fc-9816bc63eb22-kube-api-access-74tfk\") pod \"redhat-marketplace-5h69g\" (UID: \"2ac74550-1228-4f02-a1fc-9816bc63eb22\") " pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.316497 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ac74550-1228-4f02-a1fc-9816bc63eb22-catalog-content\") pod \"redhat-marketplace-5h69g\" (UID: \"2ac74550-1228-4f02-a1fc-9816bc63eb22\") " pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.317322 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ac74550-1228-4f02-a1fc-9816bc63eb22-utilities\") pod \"redhat-marketplace-5h69g\" (UID: \"2ac74550-1228-4f02-a1fc-9816bc63eb22\") " pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.317370 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ac74550-1228-4f02-a1fc-9816bc63eb22-catalog-content\") pod \"redhat-marketplace-5h69g\" (UID: \"2ac74550-1228-4f02-a1fc-9816bc63eb22\") " pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.337676 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74tfk\" (UniqueName: \"kubernetes.io/projected/2ac74550-1228-4f02-a1fc-9816bc63eb22-kube-api-access-74tfk\") pod \"redhat-marketplace-5h69g\" (UID: \"2ac74550-1228-4f02-a1fc-9816bc63eb22\") " pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.375907 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.496620 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563824-cgc65" event={"ID":"a10aa4ea-573d-4956-953f-4bdef827448d","Type":"ContainerStarted","Data":"9adcc651841abf3955ab90d6ae894f7a37101852ea93099acf016628a334ccef"} Mar 18 10:24:01 crc kubenswrapper[4778]: I0318 10:24:01.836259 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5h69g"] Mar 18 10:24:02 crc kubenswrapper[4778]: I0318 10:24:02.507363 4778 generic.go:334] "Generic (PLEG): container finished" podID="2ac74550-1228-4f02-a1fc-9816bc63eb22" containerID="cb460d1371ac1b069162ea34dd704e60b5fef5172b358109e8c8f9fe12746304" exitCode=0 Mar 18 10:24:02 crc kubenswrapper[4778]: I0318 10:24:02.507460 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5h69g" event={"ID":"2ac74550-1228-4f02-a1fc-9816bc63eb22","Type":"ContainerDied","Data":"cb460d1371ac1b069162ea34dd704e60b5fef5172b358109e8c8f9fe12746304"} Mar 18 10:24:02 crc kubenswrapper[4778]: I0318 10:24:02.507740 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5h69g" event={"ID":"2ac74550-1228-4f02-a1fc-9816bc63eb22","Type":"ContainerStarted","Data":"9ffd33c3c22cf4a977b61b12575074605267705c1e086f0ed6e287067b8b808d"} Mar 18 10:24:03 crc kubenswrapper[4778]: I0318 10:24:03.518587 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563824-cgc65" event={"ID":"a10aa4ea-573d-4956-953f-4bdef827448d","Type":"ContainerStarted","Data":"5830290d694881dc2acd7c8637c4816f24221da7b703db7749adb1ec9ce95a1b"} Mar 18 10:24:03 crc kubenswrapper[4778]: I0318 10:24:03.541044 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563824-cgc65" 
podStartSLOduration=2.366704243 podStartE2EDuration="3.541023841s" podCreationTimestamp="2026-03-18 10:24:00 +0000 UTC" firstStartedPulling="2026-03-18 10:24:01.302359297 +0000 UTC m=+4907.877104147" lastFinishedPulling="2026-03-18 10:24:02.476678895 +0000 UTC m=+4909.051423745" observedRunningTime="2026-03-18 10:24:03.538765539 +0000 UTC m=+4910.113510399" watchObservedRunningTime="2026-03-18 10:24:03.541023841 +0000 UTC m=+4910.115768681" Mar 18 10:24:04 crc kubenswrapper[4778]: I0318 10:24:04.528624 4778 generic.go:334] "Generic (PLEG): container finished" podID="a10aa4ea-573d-4956-953f-4bdef827448d" containerID="5830290d694881dc2acd7c8637c4816f24221da7b703db7749adb1ec9ce95a1b" exitCode=0 Mar 18 10:24:04 crc kubenswrapper[4778]: I0318 10:24:04.528853 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563824-cgc65" event={"ID":"a10aa4ea-573d-4956-953f-4bdef827448d","Type":"ContainerDied","Data":"5830290d694881dc2acd7c8637c4816f24221da7b703db7749adb1ec9ce95a1b"} Mar 18 10:24:04 crc kubenswrapper[4778]: I0318 10:24:04.530763 4778 generic.go:334] "Generic (PLEG): container finished" podID="2ac74550-1228-4f02-a1fc-9816bc63eb22" containerID="58694ba950df27c29e8161bd04acc5eb0fb1dc1dfcfe83e74448b8c286fff53a" exitCode=0 Mar 18 10:24:04 crc kubenswrapper[4778]: I0318 10:24:04.530798 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5h69g" event={"ID":"2ac74550-1228-4f02-a1fc-9816bc63eb22","Type":"ContainerDied","Data":"58694ba950df27c29e8161bd04acc5eb0fb1dc1dfcfe83e74448b8c286fff53a"} Mar 18 10:24:05 crc kubenswrapper[4778]: I0318 10:24:05.540844 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5h69g" event={"ID":"2ac74550-1228-4f02-a1fc-9816bc63eb22","Type":"ContainerStarted","Data":"98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383"} Mar 18 10:24:05 crc kubenswrapper[4778]: I0318 10:24:05.570804 4778 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5h69g" podStartSLOduration=2.021902121 podStartE2EDuration="4.570783798s" podCreationTimestamp="2026-03-18 10:24:01 +0000 UTC" firstStartedPulling="2026-03-18 10:24:02.509041082 +0000 UTC m=+4909.083785932" lastFinishedPulling="2026-03-18 10:24:05.057922769 +0000 UTC m=+4911.632667609" observedRunningTime="2026-03-18 10:24:05.563019448 +0000 UTC m=+4912.137764288" watchObservedRunningTime="2026-03-18 10:24:05.570783798 +0000 UTC m=+4912.145528638" Mar 18 10:24:06 crc kubenswrapper[4778]: I0318 10:24:06.040236 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563824-cgc65" Mar 18 10:24:06 crc kubenswrapper[4778]: I0318 10:24:06.223877 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74rgk\" (UniqueName: \"kubernetes.io/projected/a10aa4ea-573d-4956-953f-4bdef827448d-kube-api-access-74rgk\") pod \"a10aa4ea-573d-4956-953f-4bdef827448d\" (UID: \"a10aa4ea-573d-4956-953f-4bdef827448d\") " Mar 18 10:24:06 crc kubenswrapper[4778]: I0318 10:24:06.228819 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a10aa4ea-573d-4956-953f-4bdef827448d-kube-api-access-74rgk" (OuterVolumeSpecName: "kube-api-access-74rgk") pod "a10aa4ea-573d-4956-953f-4bdef827448d" (UID: "a10aa4ea-573d-4956-953f-4bdef827448d"). InnerVolumeSpecName "kube-api-access-74rgk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:24:06 crc kubenswrapper[4778]: I0318 10:24:06.326906 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74rgk\" (UniqueName: \"kubernetes.io/projected/a10aa4ea-573d-4956-953f-4bdef827448d-kube-api-access-74rgk\") on node \"crc\" DevicePath \"\"" Mar 18 10:24:06 crc kubenswrapper[4778]: I0318 10:24:06.550537 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563824-cgc65" event={"ID":"a10aa4ea-573d-4956-953f-4bdef827448d","Type":"ContainerDied","Data":"9adcc651841abf3955ab90d6ae894f7a37101852ea93099acf016628a334ccef"} Mar 18 10:24:06 crc kubenswrapper[4778]: I0318 10:24:06.550599 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9adcc651841abf3955ab90d6ae894f7a37101852ea93099acf016628a334ccef" Mar 18 10:24:06 crc kubenswrapper[4778]: I0318 10:24:06.550556 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563824-cgc65" Mar 18 10:24:06 crc kubenswrapper[4778]: I0318 10:24:06.609059 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563818-c776t"] Mar 18 10:24:06 crc kubenswrapper[4778]: I0318 10:24:06.616989 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563818-c776t"] Mar 18 10:24:08 crc kubenswrapper[4778]: I0318 10:24:08.201814 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b16b969-ac86-4725-910d-797cd1faedc9" path="/var/lib/kubelet/pods/9b16b969-ac86-4725-910d-797cd1faedc9/volumes" Mar 18 10:24:11 crc kubenswrapper[4778]: I0318 10:24:11.376816 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:11 crc kubenswrapper[4778]: I0318 10:24:11.377070 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:11 crc kubenswrapper[4778]: I0318 10:24:11.425679 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:11 crc kubenswrapper[4778]: I0318 10:24:11.663553 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:12 crc kubenswrapper[4778]: I0318 10:24:12.808015 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5h69g"] Mar 18 10:24:13 crc kubenswrapper[4778]: I0318 10:24:13.631285 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5h69g" podUID="2ac74550-1228-4f02-a1fc-9816bc63eb22" containerName="registry-server" containerID="cri-o://98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383" gracePeriod=2 Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.605926 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.643524 4778 generic.go:334] "Generic (PLEG): container finished" podID="2ac74550-1228-4f02-a1fc-9816bc63eb22" containerID="98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383" exitCode=0 Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.643562 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5h69g" event={"ID":"2ac74550-1228-4f02-a1fc-9816bc63eb22","Type":"ContainerDied","Data":"98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383"} Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.643589 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5h69g" event={"ID":"2ac74550-1228-4f02-a1fc-9816bc63eb22","Type":"ContainerDied","Data":"9ffd33c3c22cf4a977b61b12575074605267705c1e086f0ed6e287067b8b808d"} Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.643605 4778 scope.go:117] "RemoveContainer" containerID="98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.643718 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5h69g" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.673983 4778 scope.go:117] "RemoveContainer" containerID="58694ba950df27c29e8161bd04acc5eb0fb1dc1dfcfe83e74448b8c286fff53a" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.692019 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74tfk\" (UniqueName: \"kubernetes.io/projected/2ac74550-1228-4f02-a1fc-9816bc63eb22-kube-api-access-74tfk\") pod \"2ac74550-1228-4f02-a1fc-9816bc63eb22\" (UID: \"2ac74550-1228-4f02-a1fc-9816bc63eb22\") " Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.692317 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ac74550-1228-4f02-a1fc-9816bc63eb22-catalog-content\") pod \"2ac74550-1228-4f02-a1fc-9816bc63eb22\" (UID: \"2ac74550-1228-4f02-a1fc-9816bc63eb22\") " Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.692359 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ac74550-1228-4f02-a1fc-9816bc63eb22-utilities\") pod \"2ac74550-1228-4f02-a1fc-9816bc63eb22\" (UID: \"2ac74550-1228-4f02-a1fc-9816bc63eb22\") " Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.693718 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ac74550-1228-4f02-a1fc-9816bc63eb22-utilities" (OuterVolumeSpecName: "utilities") pod "2ac74550-1228-4f02-a1fc-9816bc63eb22" (UID: "2ac74550-1228-4f02-a1fc-9816bc63eb22"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.701600 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ac74550-1228-4f02-a1fc-9816bc63eb22-kube-api-access-74tfk" (OuterVolumeSpecName: "kube-api-access-74tfk") pod "2ac74550-1228-4f02-a1fc-9816bc63eb22" (UID: "2ac74550-1228-4f02-a1fc-9816bc63eb22"). InnerVolumeSpecName "kube-api-access-74tfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.715234 4778 scope.go:117] "RemoveContainer" containerID="cb460d1371ac1b069162ea34dd704e60b5fef5172b358109e8c8f9fe12746304" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.790782 4778 scope.go:117] "RemoveContainer" containerID="98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383" Mar 18 10:24:14 crc kubenswrapper[4778]: E0318 10:24:14.791678 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383\": container with ID starting with 98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383 not found: ID does not exist" containerID="98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.791711 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383"} err="failed to get container status \"98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383\": rpc error: code = NotFound desc = could not find container \"98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383\": container with ID starting with 98d3da1e8d6326ee2e919ffef2ee3172b31c4dfb5f2d801dea5a8498fe629383 not found: ID does not exist" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.791738 
4778 scope.go:117] "RemoveContainer" containerID="58694ba950df27c29e8161bd04acc5eb0fb1dc1dfcfe83e74448b8c286fff53a" Mar 18 10:24:14 crc kubenswrapper[4778]: E0318 10:24:14.792162 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58694ba950df27c29e8161bd04acc5eb0fb1dc1dfcfe83e74448b8c286fff53a\": container with ID starting with 58694ba950df27c29e8161bd04acc5eb0fb1dc1dfcfe83e74448b8c286fff53a not found: ID does not exist" containerID="58694ba950df27c29e8161bd04acc5eb0fb1dc1dfcfe83e74448b8c286fff53a" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.792215 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58694ba950df27c29e8161bd04acc5eb0fb1dc1dfcfe83e74448b8c286fff53a"} err="failed to get container status \"58694ba950df27c29e8161bd04acc5eb0fb1dc1dfcfe83e74448b8c286fff53a\": rpc error: code = NotFound desc = could not find container \"58694ba950df27c29e8161bd04acc5eb0fb1dc1dfcfe83e74448b8c286fff53a\": container with ID starting with 58694ba950df27c29e8161bd04acc5eb0fb1dc1dfcfe83e74448b8c286fff53a not found: ID does not exist" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.792241 4778 scope.go:117] "RemoveContainer" containerID="cb460d1371ac1b069162ea34dd704e60b5fef5172b358109e8c8f9fe12746304" Mar 18 10:24:14 crc kubenswrapper[4778]: E0318 10:24:14.792494 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb460d1371ac1b069162ea34dd704e60b5fef5172b358109e8c8f9fe12746304\": container with ID starting with cb460d1371ac1b069162ea34dd704e60b5fef5172b358109e8c8f9fe12746304 not found: ID does not exist" containerID="cb460d1371ac1b069162ea34dd704e60b5fef5172b358109e8c8f9fe12746304" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.792546 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cb460d1371ac1b069162ea34dd704e60b5fef5172b358109e8c8f9fe12746304"} err="failed to get container status \"cb460d1371ac1b069162ea34dd704e60b5fef5172b358109e8c8f9fe12746304\": rpc error: code = NotFound desc = could not find container \"cb460d1371ac1b069162ea34dd704e60b5fef5172b358109e8c8f9fe12746304\": container with ID starting with cb460d1371ac1b069162ea34dd704e60b5fef5172b358109e8c8f9fe12746304 not found: ID does not exist" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.795547 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ac74550-1228-4f02-a1fc-9816bc63eb22-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.795595 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74tfk\" (UniqueName: \"kubernetes.io/projected/2ac74550-1228-4f02-a1fc-9816bc63eb22-kube-api-access-74tfk\") on node \"crc\" DevicePath \"\"" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.971950 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ac74550-1228-4f02-a1fc-9816bc63eb22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ac74550-1228-4f02-a1fc-9816bc63eb22" (UID: "2ac74550-1228-4f02-a1fc-9816bc63eb22"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:24:14 crc kubenswrapper[4778]: I0318 10:24:14.999190 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ac74550-1228-4f02-a1fc-9816bc63eb22-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:24:15 crc kubenswrapper[4778]: I0318 10:24:15.295391 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5h69g"] Mar 18 10:24:15 crc kubenswrapper[4778]: I0318 10:24:15.309243 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5h69g"] Mar 18 10:24:16 crc kubenswrapper[4778]: I0318 10:24:16.219027 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ac74550-1228-4f02-a1fc-9816bc63eb22" path="/var/lib/kubelet/pods/2ac74550-1228-4f02-a1fc-9816bc63eb22/volumes" Mar 18 10:24:30 crc kubenswrapper[4778]: I0318 10:24:30.147118 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:24:30 crc kubenswrapper[4778]: I0318 10:24:30.147608 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:24:30 crc kubenswrapper[4778]: I0318 10:24:30.147649 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 10:24:30 crc kubenswrapper[4778]: I0318 10:24:30.148402 4778 kuberuntime_manager.go:1027] "Message for Container of 
pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4b8f66334209f29f6c78b08a10e261ea310c6e7d034012d49ee4adbb016d4215"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 10:24:30 crc kubenswrapper[4778]: I0318 10:24:30.148462 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://4b8f66334209f29f6c78b08a10e261ea310c6e7d034012d49ee4adbb016d4215" gracePeriod=600 Mar 18 10:24:30 crc kubenswrapper[4778]: I0318 10:24:30.791386 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="4b8f66334209f29f6c78b08a10e261ea310c6e7d034012d49ee4adbb016d4215" exitCode=0 Mar 18 10:24:30 crc kubenswrapper[4778]: I0318 10:24:30.791459 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"4b8f66334209f29f6c78b08a10e261ea310c6e7d034012d49ee4adbb016d4215"} Mar 18 10:24:30 crc kubenswrapper[4778]: I0318 10:24:30.791851 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36"} Mar 18 10:24:30 crc kubenswrapper[4778]: I0318 10:24:30.791877 4778 scope.go:117] "RemoveContainer" containerID="f9b48b7a44c062075536516513ee2ea91b811eafa68a0795c6ee2580b25334ce" Mar 18 10:24:35 crc kubenswrapper[4778]: I0318 10:24:35.249186 4778 scope.go:117] "RemoveContainer" containerID="ecf9caf224b383664e625024d70d285619a974017bb319f2a60a8627b5e0d68b" Mar 18 
10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.151185 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563826-w6vsk"] Mar 18 10:26:00 crc kubenswrapper[4778]: E0318 10:26:00.152063 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ac74550-1228-4f02-a1fc-9816bc63eb22" containerName="extract-utilities" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.152075 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ac74550-1228-4f02-a1fc-9816bc63eb22" containerName="extract-utilities" Mar 18 10:26:00 crc kubenswrapper[4778]: E0318 10:26:00.152094 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a10aa4ea-573d-4956-953f-4bdef827448d" containerName="oc" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.152099 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a10aa4ea-573d-4956-953f-4bdef827448d" containerName="oc" Mar 18 10:26:00 crc kubenswrapper[4778]: E0318 10:26:00.152120 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ac74550-1228-4f02-a1fc-9816bc63eb22" containerName="registry-server" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.152127 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ac74550-1228-4f02-a1fc-9816bc63eb22" containerName="registry-server" Mar 18 10:26:00 crc kubenswrapper[4778]: E0318 10:26:00.152142 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ac74550-1228-4f02-a1fc-9816bc63eb22" containerName="extract-content" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.152151 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ac74550-1228-4f02-a1fc-9816bc63eb22" containerName="extract-content" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.152349 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="a10aa4ea-573d-4956-953f-4bdef827448d" containerName="oc" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.152373 4778 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2ac74550-1228-4f02-a1fc-9816bc63eb22" containerName="registry-server" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.153014 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563826-w6vsk" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.156608 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.158595 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.159511 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.160668 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563826-w6vsk"] Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.292386 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbk24\" (UniqueName: \"kubernetes.io/projected/16b96d84-1d96-4b9b-b266-522602e5000d-kube-api-access-vbk24\") pod \"auto-csr-approver-29563826-w6vsk\" (UID: \"16b96d84-1d96-4b9b-b266-522602e5000d\") " pod="openshift-infra/auto-csr-approver-29563826-w6vsk" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.394675 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbk24\" (UniqueName: \"kubernetes.io/projected/16b96d84-1d96-4b9b-b266-522602e5000d-kube-api-access-vbk24\") pod \"auto-csr-approver-29563826-w6vsk\" (UID: \"16b96d84-1d96-4b9b-b266-522602e5000d\") " pod="openshift-infra/auto-csr-approver-29563826-w6vsk" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.415270 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vbk24\" (UniqueName: \"kubernetes.io/projected/16b96d84-1d96-4b9b-b266-522602e5000d-kube-api-access-vbk24\") pod \"auto-csr-approver-29563826-w6vsk\" (UID: \"16b96d84-1d96-4b9b-b266-522602e5000d\") " pod="openshift-infra/auto-csr-approver-29563826-w6vsk" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.474017 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563826-w6vsk" Mar 18 10:26:00 crc kubenswrapper[4778]: I0318 10:26:00.942240 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563826-w6vsk"] Mar 18 10:26:01 crc kubenswrapper[4778]: I0318 10:26:01.668574 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563826-w6vsk" event={"ID":"16b96d84-1d96-4b9b-b266-522602e5000d","Type":"ContainerStarted","Data":"3f35e7604c6ec1277d4a92b161011db1bc29d7c04c76f7cef434cc263c42ef3c"} Mar 18 10:26:03 crc kubenswrapper[4778]: I0318 10:26:03.689706 4778 generic.go:334] "Generic (PLEG): container finished" podID="16b96d84-1d96-4b9b-b266-522602e5000d" containerID="53ff744bef01ec78e09ff6d04137bcd8e4bacfa9ad131e28bbc23695a24879a8" exitCode=0 Mar 18 10:26:03 crc kubenswrapper[4778]: I0318 10:26:03.689773 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563826-w6vsk" event={"ID":"16b96d84-1d96-4b9b-b266-522602e5000d","Type":"ContainerDied","Data":"53ff744bef01ec78e09ff6d04137bcd8e4bacfa9ad131e28bbc23695a24879a8"} Mar 18 10:26:05 crc kubenswrapper[4778]: I0318 10:26:05.249793 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563826-w6vsk" Mar 18 10:26:05 crc kubenswrapper[4778]: I0318 10:26:05.388479 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbk24\" (UniqueName: \"kubernetes.io/projected/16b96d84-1d96-4b9b-b266-522602e5000d-kube-api-access-vbk24\") pod \"16b96d84-1d96-4b9b-b266-522602e5000d\" (UID: \"16b96d84-1d96-4b9b-b266-522602e5000d\") " Mar 18 10:26:05 crc kubenswrapper[4778]: I0318 10:26:05.401251 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16b96d84-1d96-4b9b-b266-522602e5000d-kube-api-access-vbk24" (OuterVolumeSpecName: "kube-api-access-vbk24") pod "16b96d84-1d96-4b9b-b266-522602e5000d" (UID: "16b96d84-1d96-4b9b-b266-522602e5000d"). InnerVolumeSpecName "kube-api-access-vbk24". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:26:05 crc kubenswrapper[4778]: I0318 10:26:05.490640 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbk24\" (UniqueName: \"kubernetes.io/projected/16b96d84-1d96-4b9b-b266-522602e5000d-kube-api-access-vbk24\") on node \"crc\" DevicePath \"\"" Mar 18 10:26:05 crc kubenswrapper[4778]: I0318 10:26:05.713540 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563826-w6vsk" event={"ID":"16b96d84-1d96-4b9b-b266-522602e5000d","Type":"ContainerDied","Data":"3f35e7604c6ec1277d4a92b161011db1bc29d7c04c76f7cef434cc263c42ef3c"} Mar 18 10:26:05 crc kubenswrapper[4778]: I0318 10:26:05.713827 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f35e7604c6ec1277d4a92b161011db1bc29d7c04c76f7cef434cc263c42ef3c" Mar 18 10:26:05 crc kubenswrapper[4778]: I0318 10:26:05.713593 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563826-w6vsk" Mar 18 10:26:06 crc kubenswrapper[4778]: I0318 10:26:06.340412 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563820-xjqs9"] Mar 18 10:26:06 crc kubenswrapper[4778]: I0318 10:26:06.348298 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563820-xjqs9"] Mar 18 10:26:08 crc kubenswrapper[4778]: I0318 10:26:08.205749 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b4b190c-f80e-4256-9025-f04279c3b3db" path="/var/lib/kubelet/pods/0b4b190c-f80e-4256-9025-f04279c3b3db/volumes" Mar 18 10:26:30 crc kubenswrapper[4778]: I0318 10:26:30.147872 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:26:30 crc kubenswrapper[4778]: I0318 10:26:30.148409 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:26:35 crc kubenswrapper[4778]: I0318 10:26:35.359172 4778 scope.go:117] "RemoveContainer" containerID="f18fd314b43b902914cecf41007b7ceab63380321a7a2e1eeb69b1d4c6a07a98" Mar 18 10:27:00 crc kubenswrapper[4778]: I0318 10:27:00.148113 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:27:00 crc kubenswrapper[4778]: 
I0318 10:27:00.148818 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:27:30 crc kubenswrapper[4778]: I0318 10:27:30.147608 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:27:30 crc kubenswrapper[4778]: I0318 10:27:30.148308 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:27:30 crc kubenswrapper[4778]: I0318 10:27:30.148372 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 10:27:30 crc kubenswrapper[4778]: I0318 10:27:30.149532 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 10:27:30 crc kubenswrapper[4778]: I0318 10:27:30.149646 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" 
containerName="machine-config-daemon" containerID="cri-o://aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" gracePeriod=600 Mar 18 10:27:30 crc kubenswrapper[4778]: E0318 10:27:30.285333 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:27:30 crc kubenswrapper[4778]: I0318 10:27:30.481147 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" exitCode=0 Mar 18 10:27:30 crc kubenswrapper[4778]: I0318 10:27:30.481704 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36"} Mar 18 10:27:30 crc kubenswrapper[4778]: I0318 10:27:30.481761 4778 scope.go:117] "RemoveContainer" containerID="4b8f66334209f29f6c78b08a10e261ea310c6e7d034012d49ee4adbb016d4215" Mar 18 10:27:30 crc kubenswrapper[4778]: I0318 10:27:30.482732 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:27:30 crc kubenswrapper[4778]: E0318 10:27:30.483067 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:27:41 crc kubenswrapper[4778]: I0318 10:27:41.188519 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:27:41 crc kubenswrapper[4778]: E0318 10:27:41.189954 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.556641 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w2ksc"] Mar 18 10:27:44 crc kubenswrapper[4778]: E0318 10:27:44.557454 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16b96d84-1d96-4b9b-b266-522602e5000d" containerName="oc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.557466 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b96d84-1d96-4b9b-b266-522602e5000d" containerName="oc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.557671 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="16b96d84-1d96-4b9b-b266-522602e5000d" containerName="oc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.558998 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.565914 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b36149c6-492e-4fdd-8955-9b0ca1ab902c-catalog-content\") pod \"redhat-operators-w2ksc\" (UID: \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\") " pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.566035 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7s2s\" (UniqueName: \"kubernetes.io/projected/b36149c6-492e-4fdd-8955-9b0ca1ab902c-kube-api-access-d7s2s\") pod \"redhat-operators-w2ksc\" (UID: \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\") " pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.566141 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b36149c6-492e-4fdd-8955-9b0ca1ab902c-utilities\") pod \"redhat-operators-w2ksc\" (UID: \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\") " pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.570648 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w2ksc"] Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.667235 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b36149c6-492e-4fdd-8955-9b0ca1ab902c-catalog-content\") pod \"redhat-operators-w2ksc\" (UID: \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\") " pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.667588 4778 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-d7s2s\" (UniqueName: \"kubernetes.io/projected/b36149c6-492e-4fdd-8955-9b0ca1ab902c-kube-api-access-d7s2s\") pod \"redhat-operators-w2ksc\" (UID: \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\") " pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.667651 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b36149c6-492e-4fdd-8955-9b0ca1ab902c-utilities\") pod \"redhat-operators-w2ksc\" (UID: \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\") " pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.667937 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b36149c6-492e-4fdd-8955-9b0ca1ab902c-catalog-content\") pod \"redhat-operators-w2ksc\" (UID: \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\") " pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.667988 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b36149c6-492e-4fdd-8955-9b0ca1ab902c-utilities\") pod \"redhat-operators-w2ksc\" (UID: \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\") " pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.691186 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7s2s\" (UniqueName: \"kubernetes.io/projected/b36149c6-492e-4fdd-8955-9b0ca1ab902c-kube-api-access-d7s2s\") pod \"redhat-operators-w2ksc\" (UID: \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\") " pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:44 crc kubenswrapper[4778]: I0318 10:27:44.908431 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:45 crc kubenswrapper[4778]: I0318 10:27:45.424608 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w2ksc"] Mar 18 10:27:45 crc kubenswrapper[4778]: I0318 10:27:45.640048 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2ksc" event={"ID":"b36149c6-492e-4fdd-8955-9b0ca1ab902c","Type":"ContainerStarted","Data":"28a1912ae4357de244cb7f9f01d8d2ea6f7fb0931ecc2ea21316f9938d55ba8c"} Mar 18 10:27:46 crc kubenswrapper[4778]: I0318 10:27:46.650301 4778 generic.go:334] "Generic (PLEG): container finished" podID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerID="ae54136036920cde71cd9305097ce5dc4f4fd9e440b88b1a7300bb61b2ab7d97" exitCode=0 Mar 18 10:27:46 crc kubenswrapper[4778]: I0318 10:27:46.650392 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2ksc" event={"ID":"b36149c6-492e-4fdd-8955-9b0ca1ab902c","Type":"ContainerDied","Data":"ae54136036920cde71cd9305097ce5dc4f4fd9e440b88b1a7300bb61b2ab7d97"} Mar 18 10:27:46 crc kubenswrapper[4778]: I0318 10:27:46.656273 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 10:27:48 crc kubenswrapper[4778]: I0318 10:27:48.694640 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2ksc" event={"ID":"b36149c6-492e-4fdd-8955-9b0ca1ab902c","Type":"ContainerStarted","Data":"295028ff8335c093b455d1f3cfd9d2773ecd4262c90afd10aec701fee219a3bc"} Mar 18 10:27:52 crc kubenswrapper[4778]: I0318 10:27:52.736049 4778 generic.go:334] "Generic (PLEG): container finished" podID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerID="295028ff8335c093b455d1f3cfd9d2773ecd4262c90afd10aec701fee219a3bc" exitCode=0 Mar 18 10:27:52 crc kubenswrapper[4778]: I0318 10:27:52.736111 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-w2ksc" event={"ID":"b36149c6-492e-4fdd-8955-9b0ca1ab902c","Type":"ContainerDied","Data":"295028ff8335c093b455d1f3cfd9d2773ecd4262c90afd10aec701fee219a3bc"} Mar 18 10:27:53 crc kubenswrapper[4778]: I0318 10:27:53.188315 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:27:53 crc kubenswrapper[4778]: E0318 10:27:53.189562 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:27:53 crc kubenswrapper[4778]: I0318 10:27:53.746024 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2ksc" event={"ID":"b36149c6-492e-4fdd-8955-9b0ca1ab902c","Type":"ContainerStarted","Data":"f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24"} Mar 18 10:27:53 crc kubenswrapper[4778]: I0318 10:27:53.778480 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w2ksc" podStartSLOduration=3.312462954 podStartE2EDuration="9.778462744s" podCreationTimestamp="2026-03-18 10:27:44 +0000 UTC" firstStartedPulling="2026-03-18 10:27:46.655659571 +0000 UTC m=+5133.230404451" lastFinishedPulling="2026-03-18 10:27:53.121659401 +0000 UTC m=+5139.696404241" observedRunningTime="2026-03-18 10:27:53.771154475 +0000 UTC m=+5140.345899305" watchObservedRunningTime="2026-03-18 10:27:53.778462744 +0000 UTC m=+5140.353207574" Mar 18 10:27:54 crc kubenswrapper[4778]: I0318 10:27:54.908957 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:54 crc kubenswrapper[4778]: I0318 10:27:54.909314 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:27:55 crc kubenswrapper[4778]: I0318 10:27:55.963288 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w2ksc" podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerName="registry-server" probeResult="failure" output=< Mar 18 10:27:55 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 10:27:55 crc kubenswrapper[4778]: > Mar 18 10:28:00 crc kubenswrapper[4778]: I0318 10:28:00.141318 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563828-bg8zw"] Mar 18 10:28:00 crc kubenswrapper[4778]: I0318 10:28:00.143179 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563828-bg8zw" Mar 18 10:28:00 crc kubenswrapper[4778]: I0318 10:28:00.145153 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:28:00 crc kubenswrapper[4778]: I0318 10:28:00.148058 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:28:00 crc kubenswrapper[4778]: I0318 10:28:00.148123 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:28:00 crc kubenswrapper[4778]: I0318 10:28:00.152859 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563828-bg8zw"] Mar 18 10:28:00 crc kubenswrapper[4778]: I0318 10:28:00.194429 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsmgf\" (UniqueName: 
\"kubernetes.io/projected/6b507e48-f0a9-4938-ad91-298a6f90aad1-kube-api-access-rsmgf\") pod \"auto-csr-approver-29563828-bg8zw\" (UID: \"6b507e48-f0a9-4938-ad91-298a6f90aad1\") " pod="openshift-infra/auto-csr-approver-29563828-bg8zw" Mar 18 10:28:00 crc kubenswrapper[4778]: I0318 10:28:00.297593 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsmgf\" (UniqueName: \"kubernetes.io/projected/6b507e48-f0a9-4938-ad91-298a6f90aad1-kube-api-access-rsmgf\") pod \"auto-csr-approver-29563828-bg8zw\" (UID: \"6b507e48-f0a9-4938-ad91-298a6f90aad1\") " pod="openshift-infra/auto-csr-approver-29563828-bg8zw" Mar 18 10:28:00 crc kubenswrapper[4778]: I0318 10:28:00.320620 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsmgf\" (UniqueName: \"kubernetes.io/projected/6b507e48-f0a9-4938-ad91-298a6f90aad1-kube-api-access-rsmgf\") pod \"auto-csr-approver-29563828-bg8zw\" (UID: \"6b507e48-f0a9-4938-ad91-298a6f90aad1\") " pod="openshift-infra/auto-csr-approver-29563828-bg8zw" Mar 18 10:28:00 crc kubenswrapper[4778]: I0318 10:28:00.493749 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563828-bg8zw" Mar 18 10:28:01 crc kubenswrapper[4778]: I0318 10:28:01.023854 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563828-bg8zw"] Mar 18 10:28:01 crc kubenswrapper[4778]: I0318 10:28:01.808270 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563828-bg8zw" event={"ID":"6b507e48-f0a9-4938-ad91-298a6f90aad1","Type":"ContainerStarted","Data":"d393a87c854b4e694fede6b64fff4ff35037a5b637ac41f2b2a51acb65651b79"} Mar 18 10:28:02 crc kubenswrapper[4778]: I0318 10:28:02.820969 4778 generic.go:334] "Generic (PLEG): container finished" podID="6b507e48-f0a9-4938-ad91-298a6f90aad1" containerID="37abf2b00f0327218ef392055e69e5b8b65d6c2f27975137e66e86722fc34dea" exitCode=0 Mar 18 10:28:02 crc kubenswrapper[4778]: I0318 10:28:02.821179 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563828-bg8zw" event={"ID":"6b507e48-f0a9-4938-ad91-298a6f90aad1","Type":"ContainerDied","Data":"37abf2b00f0327218ef392055e69e5b8b65d6c2f27975137e66e86722fc34dea"} Mar 18 10:28:04 crc kubenswrapper[4778]: I0318 10:28:04.360328 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563828-bg8zw" Mar 18 10:28:04 crc kubenswrapper[4778]: I0318 10:28:04.368100 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsmgf\" (UniqueName: \"kubernetes.io/projected/6b507e48-f0a9-4938-ad91-298a6f90aad1-kube-api-access-rsmgf\") pod \"6b507e48-f0a9-4938-ad91-298a6f90aad1\" (UID: \"6b507e48-f0a9-4938-ad91-298a6f90aad1\") " Mar 18 10:28:04 crc kubenswrapper[4778]: I0318 10:28:04.374364 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b507e48-f0a9-4938-ad91-298a6f90aad1-kube-api-access-rsmgf" (OuterVolumeSpecName: "kube-api-access-rsmgf") pod "6b507e48-f0a9-4938-ad91-298a6f90aad1" (UID: "6b507e48-f0a9-4938-ad91-298a6f90aad1"). InnerVolumeSpecName "kube-api-access-rsmgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:28:04 crc kubenswrapper[4778]: I0318 10:28:04.470064 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsmgf\" (UniqueName: \"kubernetes.io/projected/6b507e48-f0a9-4938-ad91-298a6f90aad1-kube-api-access-rsmgf\") on node \"crc\" DevicePath \"\"" Mar 18 10:28:04 crc kubenswrapper[4778]: I0318 10:28:04.842510 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563828-bg8zw" event={"ID":"6b507e48-f0a9-4938-ad91-298a6f90aad1","Type":"ContainerDied","Data":"d393a87c854b4e694fede6b64fff4ff35037a5b637ac41f2b2a51acb65651b79"} Mar 18 10:28:04 crc kubenswrapper[4778]: I0318 10:28:04.843666 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d393a87c854b4e694fede6b64fff4ff35037a5b637ac41f2b2a51acb65651b79" Mar 18 10:28:04 crc kubenswrapper[4778]: I0318 10:28:04.842579 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563828-bg8zw" Mar 18 10:28:05 crc kubenswrapper[4778]: I0318 10:28:05.441371 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563822-hg65s"] Mar 18 10:28:05 crc kubenswrapper[4778]: I0318 10:28:05.456160 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563822-hg65s"] Mar 18 10:28:05 crc kubenswrapper[4778]: I0318 10:28:05.970921 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w2ksc" podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerName="registry-server" probeResult="failure" output=< Mar 18 10:28:05 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 10:28:05 crc kubenswrapper[4778]: > Mar 18 10:28:06 crc kubenswrapper[4778]: I0318 10:28:06.187008 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:28:06 crc kubenswrapper[4778]: E0318 10:28:06.187348 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:28:06 crc kubenswrapper[4778]: I0318 10:28:06.214359 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b71bd8a1-ed53-4e72-8316-7bf3774ee1d8" path="/var/lib/kubelet/pods/b71bd8a1-ed53-4e72-8316-7bf3774ee1d8/volumes" Mar 18 10:28:15 crc kubenswrapper[4778]: I0318 10:28:15.978968 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w2ksc" podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" 
containerName="registry-server" probeResult="failure" output=< Mar 18 10:28:15 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 10:28:15 crc kubenswrapper[4778]: > Mar 18 10:28:18 crc kubenswrapper[4778]: I0318 10:28:18.187874 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:28:18 crc kubenswrapper[4778]: E0318 10:28:18.188377 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:28:25 crc kubenswrapper[4778]: I0318 10:28:25.966071 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w2ksc" podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerName="registry-server" probeResult="failure" output=< Mar 18 10:28:25 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 10:28:25 crc kubenswrapper[4778]: > Mar 18 10:28:30 crc kubenswrapper[4778]: I0318 10:28:30.186911 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:28:30 crc kubenswrapper[4778]: E0318 10:28:30.187626 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:28:34 crc 
kubenswrapper[4778]: I0318 10:28:34.963378 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:28:35 crc kubenswrapper[4778]: I0318 10:28:35.025614 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:28:35 crc kubenswrapper[4778]: I0318 10:28:35.199593 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w2ksc"] Mar 18 10:28:35 crc kubenswrapper[4778]: I0318 10:28:35.453512 4778 scope.go:117] "RemoveContainer" containerID="23fd2e1aeae3db5eb1358b08c80f32e6a7b3195affe83ad0ba4169976ea65d12" Mar 18 10:28:36 crc kubenswrapper[4778]: I0318 10:28:36.094562 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w2ksc" podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerName="registry-server" containerID="cri-o://f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24" gracePeriod=2 Mar 18 10:28:36 crc kubenswrapper[4778]: I0318 10:28:36.761267 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:28:36 crc kubenswrapper[4778]: I0318 10:28:36.922667 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b36149c6-492e-4fdd-8955-9b0ca1ab902c-utilities\") pod \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\" (UID: \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\") " Mar 18 10:28:36 crc kubenswrapper[4778]: I0318 10:28:36.922722 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b36149c6-492e-4fdd-8955-9b0ca1ab902c-catalog-content\") pod \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\" (UID: \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\") " Mar 18 10:28:36 crc kubenswrapper[4778]: I0318 10:28:36.922755 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7s2s\" (UniqueName: \"kubernetes.io/projected/b36149c6-492e-4fdd-8955-9b0ca1ab902c-kube-api-access-d7s2s\") pod \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\" (UID: \"b36149c6-492e-4fdd-8955-9b0ca1ab902c\") " Mar 18 10:28:36 crc kubenswrapper[4778]: I0318 10:28:36.924963 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b36149c6-492e-4fdd-8955-9b0ca1ab902c-utilities" (OuterVolumeSpecName: "utilities") pod "b36149c6-492e-4fdd-8955-9b0ca1ab902c" (UID: "b36149c6-492e-4fdd-8955-9b0ca1ab902c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:28:36 crc kubenswrapper[4778]: I0318 10:28:36.936752 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b36149c6-492e-4fdd-8955-9b0ca1ab902c-kube-api-access-d7s2s" (OuterVolumeSpecName: "kube-api-access-d7s2s") pod "b36149c6-492e-4fdd-8955-9b0ca1ab902c" (UID: "b36149c6-492e-4fdd-8955-9b0ca1ab902c"). InnerVolumeSpecName "kube-api-access-d7s2s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.025118 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b36149c6-492e-4fdd-8955-9b0ca1ab902c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.025159 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7s2s\" (UniqueName: \"kubernetes.io/projected/b36149c6-492e-4fdd-8955-9b0ca1ab902c-kube-api-access-d7s2s\") on node \"crc\" DevicePath \"\"" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.058651 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b36149c6-492e-4fdd-8955-9b0ca1ab902c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b36149c6-492e-4fdd-8955-9b0ca1ab902c" (UID: "b36149c6-492e-4fdd-8955-9b0ca1ab902c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.105099 4778 generic.go:334] "Generic (PLEG): container finished" podID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerID="f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24" exitCode=0 Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.105160 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w2ksc" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.105175 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2ksc" event={"ID":"b36149c6-492e-4fdd-8955-9b0ca1ab902c","Type":"ContainerDied","Data":"f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24"} Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.106374 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w2ksc" event={"ID":"b36149c6-492e-4fdd-8955-9b0ca1ab902c","Type":"ContainerDied","Data":"28a1912ae4357de244cb7f9f01d8d2ea6f7fb0931ecc2ea21316f9938d55ba8c"} Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.106409 4778 scope.go:117] "RemoveContainer" containerID="f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.127149 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b36149c6-492e-4fdd-8955-9b0ca1ab902c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.136855 4778 scope.go:117] "RemoveContainer" containerID="295028ff8335c093b455d1f3cfd9d2773ecd4262c90afd10aec701fee219a3bc" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.140912 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w2ksc"] Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.151268 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w2ksc"] Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.164618 4778 scope.go:117] "RemoveContainer" containerID="ae54136036920cde71cd9305097ce5dc4f4fd9e440b88b1a7300bb61b2ab7d97" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.212261 4778 scope.go:117] "RemoveContainer" 
containerID="f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24" Mar 18 10:28:37 crc kubenswrapper[4778]: E0318 10:28:37.212761 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24\": container with ID starting with f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24 not found: ID does not exist" containerID="f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.212809 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24"} err="failed to get container status \"f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24\": rpc error: code = NotFound desc = could not find container \"f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24\": container with ID starting with f7db5d6d47af63edf0333ac0fce98640a00b1fe3568b75b384f15a2addffaf24 not found: ID does not exist" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.212832 4778 scope.go:117] "RemoveContainer" containerID="295028ff8335c093b455d1f3cfd9d2773ecd4262c90afd10aec701fee219a3bc" Mar 18 10:28:37 crc kubenswrapper[4778]: E0318 10:28:37.213255 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"295028ff8335c093b455d1f3cfd9d2773ecd4262c90afd10aec701fee219a3bc\": container with ID starting with 295028ff8335c093b455d1f3cfd9d2773ecd4262c90afd10aec701fee219a3bc not found: ID does not exist" containerID="295028ff8335c093b455d1f3cfd9d2773ecd4262c90afd10aec701fee219a3bc" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.213385 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"295028ff8335c093b455d1f3cfd9d2773ecd4262c90afd10aec701fee219a3bc"} err="failed to get container status \"295028ff8335c093b455d1f3cfd9d2773ecd4262c90afd10aec701fee219a3bc\": rpc error: code = NotFound desc = could not find container \"295028ff8335c093b455d1f3cfd9d2773ecd4262c90afd10aec701fee219a3bc\": container with ID starting with 295028ff8335c093b455d1f3cfd9d2773ecd4262c90afd10aec701fee219a3bc not found: ID does not exist" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.213473 4778 scope.go:117] "RemoveContainer" containerID="ae54136036920cde71cd9305097ce5dc4f4fd9e440b88b1a7300bb61b2ab7d97" Mar 18 10:28:37 crc kubenswrapper[4778]: E0318 10:28:37.213820 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae54136036920cde71cd9305097ce5dc4f4fd9e440b88b1a7300bb61b2ab7d97\": container with ID starting with ae54136036920cde71cd9305097ce5dc4f4fd9e440b88b1a7300bb61b2ab7d97 not found: ID does not exist" containerID="ae54136036920cde71cd9305097ce5dc4f4fd9e440b88b1a7300bb61b2ab7d97" Mar 18 10:28:37 crc kubenswrapper[4778]: I0318 10:28:37.213846 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae54136036920cde71cd9305097ce5dc4f4fd9e440b88b1a7300bb61b2ab7d97"} err="failed to get container status \"ae54136036920cde71cd9305097ce5dc4f4fd9e440b88b1a7300bb61b2ab7d97\": rpc error: code = NotFound desc = could not find container \"ae54136036920cde71cd9305097ce5dc4f4fd9e440b88b1a7300bb61b2ab7d97\": container with ID starting with ae54136036920cde71cd9305097ce5dc4f4fd9e440b88b1a7300bb61b2ab7d97 not found: ID does not exist" Mar 18 10:28:38 crc kubenswrapper[4778]: I0318 10:28:38.195722 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" path="/var/lib/kubelet/pods/b36149c6-492e-4fdd-8955-9b0ca1ab902c/volumes" Mar 18 10:28:41 crc kubenswrapper[4778]: I0318 
10:28:41.188780 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:28:41 crc kubenswrapper[4778]: E0318 10:28:41.190935 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:28:55 crc kubenswrapper[4778]: I0318 10:28:55.187620 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:28:55 crc kubenswrapper[4778]: E0318 10:28:55.188929 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:29:11 crc kubenswrapper[4778]: I0318 10:29:10.187233 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:29:11 crc kubenswrapper[4778]: E0318 10:29:10.188275 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:29:25 crc 
kubenswrapper[4778]: I0318 10:29:25.186951 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:29:25 crc kubenswrapper[4778]: E0318 10:29:25.188997 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:29:36 crc kubenswrapper[4778]: I0318 10:29:36.190992 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:29:36 crc kubenswrapper[4778]: E0318 10:29:36.191664 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:29:47 crc kubenswrapper[4778]: I0318 10:29:47.188133 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:29:47 crc kubenswrapper[4778]: E0318 10:29:47.189071 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 
18 10:29:58 crc kubenswrapper[4778]: I0318 10:29:58.187806 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:29:58 crc kubenswrapper[4778]: E0318 10:29:58.188707 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.153831 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563830-cmbbv"] Mar 18 10:30:00 crc kubenswrapper[4778]: E0318 10:30:00.154897 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerName="extract-content" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.154914 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerName="extract-content" Mar 18 10:30:00 crc kubenswrapper[4778]: E0318 10:30:00.154925 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerName="extract-utilities" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.154931 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerName="extract-utilities" Mar 18 10:30:00 crc kubenswrapper[4778]: E0318 10:30:00.154946 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerName="registry-server" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.154951 4778 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerName="registry-server" Mar 18 10:30:00 crc kubenswrapper[4778]: E0318 10:30:00.154975 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b507e48-f0a9-4938-ad91-298a6f90aad1" containerName="oc" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.154981 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b507e48-f0a9-4938-ad91-298a6f90aad1" containerName="oc" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.155162 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b36149c6-492e-4fdd-8955-9b0ca1ab902c" containerName="registry-server" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.155185 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b507e48-f0a9-4938-ad91-298a6f90aad1" containerName="oc" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.155861 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563830-cmbbv" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.160838 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.161088 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.161235 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.166435 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m"] Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.167883 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.170391 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.171720 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.178056 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563830-cmbbv"] Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.211526 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m"] Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.234898 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/688101ed-133b-42c6-87f0-fb2ce2afa33f-secret-volume\") pod \"collect-profiles-29563830-9w24m\" (UID: \"688101ed-133b-42c6-87f0-fb2ce2afa33f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.235069 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zdww\" (UniqueName: \"kubernetes.io/projected/688101ed-133b-42c6-87f0-fb2ce2afa33f-kube-api-access-9zdww\") pod \"collect-profiles-29563830-9w24m\" (UID: \"688101ed-133b-42c6-87f0-fb2ce2afa33f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.235238 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9j7b\" (UniqueName: 
\"kubernetes.io/projected/020a579d-1395-4039-8c3a-7454709e9af6-kube-api-access-g9j7b\") pod \"auto-csr-approver-29563830-cmbbv\" (UID: \"020a579d-1395-4039-8c3a-7454709e9af6\") " pod="openshift-infra/auto-csr-approver-29563830-cmbbv" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.235463 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/688101ed-133b-42c6-87f0-fb2ce2afa33f-config-volume\") pod \"collect-profiles-29563830-9w24m\" (UID: \"688101ed-133b-42c6-87f0-fb2ce2afa33f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.337619 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/688101ed-133b-42c6-87f0-fb2ce2afa33f-secret-volume\") pod \"collect-profiles-29563830-9w24m\" (UID: \"688101ed-133b-42c6-87f0-fb2ce2afa33f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.337695 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zdww\" (UniqueName: \"kubernetes.io/projected/688101ed-133b-42c6-87f0-fb2ce2afa33f-kube-api-access-9zdww\") pod \"collect-profiles-29563830-9w24m\" (UID: \"688101ed-133b-42c6-87f0-fb2ce2afa33f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.337762 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9j7b\" (UniqueName: \"kubernetes.io/projected/020a579d-1395-4039-8c3a-7454709e9af6-kube-api-access-g9j7b\") pod \"auto-csr-approver-29563830-cmbbv\" (UID: \"020a579d-1395-4039-8c3a-7454709e9af6\") " pod="openshift-infra/auto-csr-approver-29563830-cmbbv" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 
10:30:00.337835 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/688101ed-133b-42c6-87f0-fb2ce2afa33f-config-volume\") pod \"collect-profiles-29563830-9w24m\" (UID: \"688101ed-133b-42c6-87f0-fb2ce2afa33f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.338706 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/688101ed-133b-42c6-87f0-fb2ce2afa33f-config-volume\") pod \"collect-profiles-29563830-9w24m\" (UID: \"688101ed-133b-42c6-87f0-fb2ce2afa33f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.351372 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/688101ed-133b-42c6-87f0-fb2ce2afa33f-secret-volume\") pod \"collect-profiles-29563830-9w24m\" (UID: \"688101ed-133b-42c6-87f0-fb2ce2afa33f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.353382 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zdww\" (UniqueName: \"kubernetes.io/projected/688101ed-133b-42c6-87f0-fb2ce2afa33f-kube-api-access-9zdww\") pod \"collect-profiles-29563830-9w24m\" (UID: \"688101ed-133b-42c6-87f0-fb2ce2afa33f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.354033 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9j7b\" (UniqueName: \"kubernetes.io/projected/020a579d-1395-4039-8c3a-7454709e9af6-kube-api-access-g9j7b\") pod \"auto-csr-approver-29563830-cmbbv\" (UID: \"020a579d-1395-4039-8c3a-7454709e9af6\") " 
pod="openshift-infra/auto-csr-approver-29563830-cmbbv" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.481333 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563830-cmbbv" Mar 18 10:30:00 crc kubenswrapper[4778]: I0318 10:30:00.490869 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:01 crc kubenswrapper[4778]: I0318 10:30:01.077869 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m"] Mar 18 10:30:01 crc kubenswrapper[4778]: I0318 10:30:01.243402 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563830-cmbbv"] Mar 18 10:30:01 crc kubenswrapper[4778]: W0318 10:30:01.245844 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod020a579d_1395_4039_8c3a_7454709e9af6.slice/crio-4a7633c8556695ef997523713bddc0f5f3c863a3f390cbe59df35a4295532c74 WatchSource:0}: Error finding container 4a7633c8556695ef997523713bddc0f5f3c863a3f390cbe59df35a4295532c74: Status 404 returned error can't find the container with id 4a7633c8556695ef997523713bddc0f5f3c863a3f390cbe59df35a4295532c74 Mar 18 10:30:01 crc kubenswrapper[4778]: I0318 10:30:01.843182 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563830-cmbbv" event={"ID":"020a579d-1395-4039-8c3a-7454709e9af6","Type":"ContainerStarted","Data":"4a7633c8556695ef997523713bddc0f5f3c863a3f390cbe59df35a4295532c74"} Mar 18 10:30:01 crc kubenswrapper[4778]: I0318 10:30:01.845008 4778 generic.go:334] "Generic (PLEG): container finished" podID="688101ed-133b-42c6-87f0-fb2ce2afa33f" containerID="91439ddaf1c7b64a7912887de697803bd3f4ff4a97a1ee187c7b7ad2914b7556" exitCode=0 Mar 18 10:30:01 crc kubenswrapper[4778]: I0318 
10:30:01.845058 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" event={"ID":"688101ed-133b-42c6-87f0-fb2ce2afa33f","Type":"ContainerDied","Data":"91439ddaf1c7b64a7912887de697803bd3f4ff4a97a1ee187c7b7ad2914b7556"} Mar 18 10:30:01 crc kubenswrapper[4778]: I0318 10:30:01.845101 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" event={"ID":"688101ed-133b-42c6-87f0-fb2ce2afa33f","Type":"ContainerStarted","Data":"4c26e59735d7c3f2f271c0eb690c58016a6af6de5508e76c1d4af772c4103d48"} Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.416909 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.516420 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/688101ed-133b-42c6-87f0-fb2ce2afa33f-config-volume\") pod \"688101ed-133b-42c6-87f0-fb2ce2afa33f\" (UID: \"688101ed-133b-42c6-87f0-fb2ce2afa33f\") " Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.516801 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/688101ed-133b-42c6-87f0-fb2ce2afa33f-secret-volume\") pod \"688101ed-133b-42c6-87f0-fb2ce2afa33f\" (UID: \"688101ed-133b-42c6-87f0-fb2ce2afa33f\") " Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.516988 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zdww\" (UniqueName: \"kubernetes.io/projected/688101ed-133b-42c6-87f0-fb2ce2afa33f-kube-api-access-9zdww\") pod \"688101ed-133b-42c6-87f0-fb2ce2afa33f\" (UID: \"688101ed-133b-42c6-87f0-fb2ce2afa33f\") " Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.519599 4778 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/688101ed-133b-42c6-87f0-fb2ce2afa33f-config-volume" (OuterVolumeSpecName: "config-volume") pod "688101ed-133b-42c6-87f0-fb2ce2afa33f" (UID: "688101ed-133b-42c6-87f0-fb2ce2afa33f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.523311 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688101ed-133b-42c6-87f0-fb2ce2afa33f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "688101ed-133b-42c6-87f0-fb2ce2afa33f" (UID: "688101ed-133b-42c6-87f0-fb2ce2afa33f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.523600 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/688101ed-133b-42c6-87f0-fb2ce2afa33f-kube-api-access-9zdww" (OuterVolumeSpecName: "kube-api-access-9zdww") pod "688101ed-133b-42c6-87f0-fb2ce2afa33f" (UID: "688101ed-133b-42c6-87f0-fb2ce2afa33f"). InnerVolumeSpecName "kube-api-access-9zdww". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.619530 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zdww\" (UniqueName: \"kubernetes.io/projected/688101ed-133b-42c6-87f0-fb2ce2afa33f-kube-api-access-9zdww\") on node \"crc\" DevicePath \"\"" Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.619568 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/688101ed-133b-42c6-87f0-fb2ce2afa33f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.619576 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/688101ed-133b-42c6-87f0-fb2ce2afa33f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.863006 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.862993 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m" event={"ID":"688101ed-133b-42c6-87f0-fb2ce2afa33f","Type":"ContainerDied","Data":"4c26e59735d7c3f2f271c0eb690c58016a6af6de5508e76c1d4af772c4103d48"} Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.863138 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c26e59735d7c3f2f271c0eb690c58016a6af6de5508e76c1d4af772c4103d48" Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.864546 4778 generic.go:334] "Generic (PLEG): container finished" podID="020a579d-1395-4039-8c3a-7454709e9af6" containerID="e298907ce2b631ab1e3060efd7429ad70a3f2d93551c33b2a088ad16a12f01ae" exitCode=0 Mar 18 10:30:03 crc kubenswrapper[4778]: I0318 10:30:03.864670 4778 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563830-cmbbv" event={"ID":"020a579d-1395-4039-8c3a-7454709e9af6","Type":"ContainerDied","Data":"e298907ce2b631ab1e3060efd7429ad70a3f2d93551c33b2a088ad16a12f01ae"} Mar 18 10:30:04 crc kubenswrapper[4778]: I0318 10:30:04.500508 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq"] Mar 18 10:30:04 crc kubenswrapper[4778]: I0318 10:30:04.508100 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563785-ptknq"] Mar 18 10:30:05 crc kubenswrapper[4778]: I0318 10:30:05.432363 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563830-cmbbv" Mar 18 10:30:05 crc kubenswrapper[4778]: I0318 10:30:05.557082 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9j7b\" (UniqueName: \"kubernetes.io/projected/020a579d-1395-4039-8c3a-7454709e9af6-kube-api-access-g9j7b\") pod \"020a579d-1395-4039-8c3a-7454709e9af6\" (UID: \"020a579d-1395-4039-8c3a-7454709e9af6\") " Mar 18 10:30:05 crc kubenswrapper[4778]: I0318 10:30:05.567983 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/020a579d-1395-4039-8c3a-7454709e9af6-kube-api-access-g9j7b" (OuterVolumeSpecName: "kube-api-access-g9j7b") pod "020a579d-1395-4039-8c3a-7454709e9af6" (UID: "020a579d-1395-4039-8c3a-7454709e9af6"). InnerVolumeSpecName "kube-api-access-g9j7b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:30:05 crc kubenswrapper[4778]: I0318 10:30:05.660455 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9j7b\" (UniqueName: \"kubernetes.io/projected/020a579d-1395-4039-8c3a-7454709e9af6-kube-api-access-g9j7b\") on node \"crc\" DevicePath \"\"" Mar 18 10:30:05 crc kubenswrapper[4778]: I0318 10:30:05.882891 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563830-cmbbv" event={"ID":"020a579d-1395-4039-8c3a-7454709e9af6","Type":"ContainerDied","Data":"4a7633c8556695ef997523713bddc0f5f3c863a3f390cbe59df35a4295532c74"} Mar 18 10:30:05 crc kubenswrapper[4778]: I0318 10:30:05.882931 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a7633c8556695ef997523713bddc0f5f3c863a3f390cbe59df35a4295532c74" Mar 18 10:30:05 crc kubenswrapper[4778]: I0318 10:30:05.882964 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563830-cmbbv" Mar 18 10:30:06 crc kubenswrapper[4778]: I0318 10:30:06.198554 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="956ed194-df94-4b74-919f-9cdcfbdcf5a7" path="/var/lib/kubelet/pods/956ed194-df94-4b74-919f-9cdcfbdcf5a7/volumes" Mar 18 10:30:06 crc kubenswrapper[4778]: I0318 10:30:06.491150 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563824-cgc65"] Mar 18 10:30:06 crc kubenswrapper[4778]: I0318 10:30:06.498668 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563824-cgc65"] Mar 18 10:30:08 crc kubenswrapper[4778]: I0318 10:30:08.196685 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a10aa4ea-573d-4956-953f-4bdef827448d" path="/var/lib/kubelet/pods/a10aa4ea-573d-4956-953f-4bdef827448d/volumes" Mar 18 10:30:11 crc kubenswrapper[4778]: I0318 10:30:11.187018 4778 
scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:30:11 crc kubenswrapper[4778]: E0318 10:30:11.187861 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:30:26 crc kubenswrapper[4778]: I0318 10:30:26.187302 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:30:26 crc kubenswrapper[4778]: E0318 10:30:26.188186 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:30:35 crc kubenswrapper[4778]: I0318 10:30:35.568863 4778 scope.go:117] "RemoveContainer" containerID="5830290d694881dc2acd7c8637c4816f24221da7b703db7749adb1ec9ce95a1b" Mar 18 10:30:35 crc kubenswrapper[4778]: I0318 10:30:35.631863 4778 scope.go:117] "RemoveContainer" containerID="b6a6fd51a98d9937da03ae4682cc5b4ae715e8495f9ae8fc3459feb811d9d2fc" Mar 18 10:30:37 crc kubenswrapper[4778]: I0318 10:30:37.189123 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:30:37 crc kubenswrapper[4778]: E0318 10:30:37.189943 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:30:49 crc kubenswrapper[4778]: I0318 10:30:49.187248 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:30:49 crc kubenswrapper[4778]: E0318 10:30:49.187899 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:31:03 crc kubenswrapper[4778]: I0318 10:31:03.188559 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:31:03 crc kubenswrapper[4778]: E0318 10:31:03.189466 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:31:16 crc kubenswrapper[4778]: I0318 10:31:16.187781 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:31:16 crc kubenswrapper[4778]: E0318 10:31:16.188564 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:31:31 crc kubenswrapper[4778]: I0318 10:31:31.187610 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:31:31 crc kubenswrapper[4778]: E0318 10:31:31.188413 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:31:42 crc kubenswrapper[4778]: I0318 10:31:42.187401 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:31:42 crc kubenswrapper[4778]: E0318 10:31:42.189336 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:31:53 crc kubenswrapper[4778]: I0318 10:31:53.188092 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:31:53 crc kubenswrapper[4778]: E0318 10:31:53.189274 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.155496 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563832-8vv9t"] Mar 18 10:32:00 crc kubenswrapper[4778]: E0318 10:32:00.156759 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020a579d-1395-4039-8c3a-7454709e9af6" containerName="oc" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.156783 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="020a579d-1395-4039-8c3a-7454709e9af6" containerName="oc" Mar 18 10:32:00 crc kubenswrapper[4778]: E0318 10:32:00.156847 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="688101ed-133b-42c6-87f0-fb2ce2afa33f" containerName="collect-profiles" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.157052 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="688101ed-133b-42c6-87f0-fb2ce2afa33f" containerName="collect-profiles" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.157376 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="020a579d-1395-4039-8c3a-7454709e9af6" containerName="oc" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.157421 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="688101ed-133b-42c6-87f0-fb2ce2afa33f" containerName="collect-profiles" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.158627 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563832-8vv9t" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.163417 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.163980 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.165019 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.168797 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563832-8vv9t"] Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.300533 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q92v9\" (UniqueName: \"kubernetes.io/projected/19f52f16-1c49-4aa8-9e7b-10a9bf55e487-kube-api-access-q92v9\") pod \"auto-csr-approver-29563832-8vv9t\" (UID: \"19f52f16-1c49-4aa8-9e7b-10a9bf55e487\") " pod="openshift-infra/auto-csr-approver-29563832-8vv9t" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.403381 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q92v9\" (UniqueName: \"kubernetes.io/projected/19f52f16-1c49-4aa8-9e7b-10a9bf55e487-kube-api-access-q92v9\") pod \"auto-csr-approver-29563832-8vv9t\" (UID: \"19f52f16-1c49-4aa8-9e7b-10a9bf55e487\") " pod="openshift-infra/auto-csr-approver-29563832-8vv9t" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.439152 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q92v9\" (UniqueName: \"kubernetes.io/projected/19f52f16-1c49-4aa8-9e7b-10a9bf55e487-kube-api-access-q92v9\") pod \"auto-csr-approver-29563832-8vv9t\" (UID: \"19f52f16-1c49-4aa8-9e7b-10a9bf55e487\") " 
pod="openshift-infra/auto-csr-approver-29563832-8vv9t" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.488047 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563832-8vv9t" Mar 18 10:32:00 crc kubenswrapper[4778]: I0318 10:32:00.998915 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563832-8vv9t"] Mar 18 10:32:02 crc kubenswrapper[4778]: I0318 10:32:02.013776 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563832-8vv9t" event={"ID":"19f52f16-1c49-4aa8-9e7b-10a9bf55e487","Type":"ContainerStarted","Data":"1b386030482fe0667a16e9ddee22539810fc8e913e15ec7861949d47b76c20f2"} Mar 18 10:32:04 crc kubenswrapper[4778]: I0318 10:32:04.058426 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563832-8vv9t" event={"ID":"19f52f16-1c49-4aa8-9e7b-10a9bf55e487","Type":"ContainerDied","Data":"b3e45b3c111cb68776e6b0e92c1ba10a6ec66666310504f0fe918ad9a95b9a9e"} Mar 18 10:32:04 crc kubenswrapper[4778]: I0318 10:32:04.058272 4778 generic.go:334] "Generic (PLEG): container finished" podID="19f52f16-1c49-4aa8-9e7b-10a9bf55e487" containerID="b3e45b3c111cb68776e6b0e92c1ba10a6ec66666310504f0fe918ad9a95b9a9e" exitCode=0 Mar 18 10:32:06 crc kubenswrapper[4778]: I0318 10:32:06.073689 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563832-8vv9t" Mar 18 10:32:06 crc kubenswrapper[4778]: I0318 10:32:06.082936 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563832-8vv9t" event={"ID":"19f52f16-1c49-4aa8-9e7b-10a9bf55e487","Type":"ContainerDied","Data":"1b386030482fe0667a16e9ddee22539810fc8e913e15ec7861949d47b76c20f2"} Mar 18 10:32:06 crc kubenswrapper[4778]: I0318 10:32:06.082984 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b386030482fe0667a16e9ddee22539810fc8e913e15ec7861949d47b76c20f2" Mar 18 10:32:06 crc kubenswrapper[4778]: I0318 10:32:06.083016 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563832-8vv9t" Mar 18 10:32:06 crc kubenswrapper[4778]: I0318 10:32:06.187463 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:32:06 crc kubenswrapper[4778]: E0318 10:32:06.187800 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:32:06 crc kubenswrapper[4778]: I0318 10:32:06.245933 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q92v9\" (UniqueName: \"kubernetes.io/projected/19f52f16-1c49-4aa8-9e7b-10a9bf55e487-kube-api-access-q92v9\") pod \"19f52f16-1c49-4aa8-9e7b-10a9bf55e487\" (UID: \"19f52f16-1c49-4aa8-9e7b-10a9bf55e487\") " Mar 18 10:32:06 crc kubenswrapper[4778]: I0318 10:32:06.251965 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/19f52f16-1c49-4aa8-9e7b-10a9bf55e487-kube-api-access-q92v9" (OuterVolumeSpecName: "kube-api-access-q92v9") pod "19f52f16-1c49-4aa8-9e7b-10a9bf55e487" (UID: "19f52f16-1c49-4aa8-9e7b-10a9bf55e487"). InnerVolumeSpecName "kube-api-access-q92v9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:32:06 crc kubenswrapper[4778]: I0318 10:32:06.348192 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q92v9\" (UniqueName: \"kubernetes.io/projected/19f52f16-1c49-4aa8-9e7b-10a9bf55e487-kube-api-access-q92v9\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.098757 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2xqrs"] Mar 18 10:32:07 crc kubenswrapper[4778]: E0318 10:32:07.099277 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19f52f16-1c49-4aa8-9e7b-10a9bf55e487" containerName="oc" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.099295 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f52f16-1c49-4aa8-9e7b-10a9bf55e487" containerName="oc" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.099582 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="19f52f16-1c49-4aa8-9e7b-10a9bf55e487" containerName="oc" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.101333 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.109880 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2xqrs"] Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.167043 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85bd1932-7797-4e49-9b0c-a67b5176e03b-utilities\") pod \"community-operators-2xqrs\" (UID: \"85bd1932-7797-4e49-9b0c-a67b5176e03b\") " pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.167412 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lshv6\" (UniqueName: \"kubernetes.io/projected/85bd1932-7797-4e49-9b0c-a67b5176e03b-kube-api-access-lshv6\") pod \"community-operators-2xqrs\" (UID: \"85bd1932-7797-4e49-9b0c-a67b5176e03b\") " pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.167665 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85bd1932-7797-4e49-9b0c-a67b5176e03b-catalog-content\") pod \"community-operators-2xqrs\" (UID: \"85bd1932-7797-4e49-9b0c-a67b5176e03b\") " pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.174237 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563826-w6vsk"] Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.182572 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563826-w6vsk"] Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.269296 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85bd1932-7797-4e49-9b0c-a67b5176e03b-catalog-content\") pod \"community-operators-2xqrs\" (UID: \"85bd1932-7797-4e49-9b0c-a67b5176e03b\") " pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.269361 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85bd1932-7797-4e49-9b0c-a67b5176e03b-utilities\") pod \"community-operators-2xqrs\" (UID: \"85bd1932-7797-4e49-9b0c-a67b5176e03b\") " pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.269382 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lshv6\" (UniqueName: \"kubernetes.io/projected/85bd1932-7797-4e49-9b0c-a67b5176e03b-kube-api-access-lshv6\") pod \"community-operators-2xqrs\" (UID: \"85bd1932-7797-4e49-9b0c-a67b5176e03b\") " pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.270474 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85bd1932-7797-4e49-9b0c-a67b5176e03b-utilities\") pod \"community-operators-2xqrs\" (UID: \"85bd1932-7797-4e49-9b0c-a67b5176e03b\") " pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.270497 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85bd1932-7797-4e49-9b0c-a67b5176e03b-catalog-content\") pod \"community-operators-2xqrs\" (UID: \"85bd1932-7797-4e49-9b0c-a67b5176e03b\") " pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.287895 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lshv6\" (UniqueName: 
\"kubernetes.io/projected/85bd1932-7797-4e49-9b0c-a67b5176e03b-kube-api-access-lshv6\") pod \"community-operators-2xqrs\" (UID: \"85bd1932-7797-4e49-9b0c-a67b5176e03b\") " pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.439875 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:07 crc kubenswrapper[4778]: I0318 10:32:07.995635 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2xqrs"] Mar 18 10:32:07 crc kubenswrapper[4778]: W0318 10:32:07.998566 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85bd1932_7797_4e49_9b0c_a67b5176e03b.slice/crio-b3b749541cd7e8e5266bf86b325dca76bccc806590445c6a2bf5c909ceffb2a2 WatchSource:0}: Error finding container b3b749541cd7e8e5266bf86b325dca76bccc806590445c6a2bf5c909ceffb2a2: Status 404 returned error can't find the container with id b3b749541cd7e8e5266bf86b325dca76bccc806590445c6a2bf5c909ceffb2a2 Mar 18 10:32:08 crc kubenswrapper[4778]: I0318 10:32:08.110126 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xqrs" event={"ID":"85bd1932-7797-4e49-9b0c-a67b5176e03b","Type":"ContainerStarted","Data":"b3b749541cd7e8e5266bf86b325dca76bccc806590445c6a2bf5c909ceffb2a2"} Mar 18 10:32:08 crc kubenswrapper[4778]: I0318 10:32:08.204774 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16b96d84-1d96-4b9b-b266-522602e5000d" path="/var/lib/kubelet/pods/16b96d84-1d96-4b9b-b266-522602e5000d/volumes" Mar 18 10:32:09 crc kubenswrapper[4778]: I0318 10:32:09.119788 4778 generic.go:334] "Generic (PLEG): container finished" podID="85bd1932-7797-4e49-9b0c-a67b5176e03b" containerID="d4ab5a2a9be3faa54895e3ea4c33073a1984cc4328299d0127067c77c01301e8" exitCode=0 Mar 18 10:32:09 crc 
kubenswrapper[4778]: I0318 10:32:09.119850 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xqrs" event={"ID":"85bd1932-7797-4e49-9b0c-a67b5176e03b","Type":"ContainerDied","Data":"d4ab5a2a9be3faa54895e3ea4c33073a1984cc4328299d0127067c77c01301e8"} Mar 18 10:32:10 crc kubenswrapper[4778]: I0318 10:32:10.130911 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xqrs" event={"ID":"85bd1932-7797-4e49-9b0c-a67b5176e03b","Type":"ContainerStarted","Data":"7efa6734e40475f70dcca064e44946d2776507dea2029d7bc3055a574ac8a45a"} Mar 18 10:32:12 crc kubenswrapper[4778]: I0318 10:32:12.173504 4778 generic.go:334] "Generic (PLEG): container finished" podID="85bd1932-7797-4e49-9b0c-a67b5176e03b" containerID="7efa6734e40475f70dcca064e44946d2776507dea2029d7bc3055a574ac8a45a" exitCode=0 Mar 18 10:32:12 crc kubenswrapper[4778]: I0318 10:32:12.173797 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xqrs" event={"ID":"85bd1932-7797-4e49-9b0c-a67b5176e03b","Type":"ContainerDied","Data":"7efa6734e40475f70dcca064e44946d2776507dea2029d7bc3055a574ac8a45a"} Mar 18 10:32:13 crc kubenswrapper[4778]: I0318 10:32:13.186004 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xqrs" event={"ID":"85bd1932-7797-4e49-9b0c-a67b5176e03b","Type":"ContainerStarted","Data":"0bc612bd31fa753f8d2f562fbc9812f902bbf84c5c3fafa32c598ca1eddc2ac0"} Mar 18 10:32:13 crc kubenswrapper[4778]: I0318 10:32:13.214934 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2xqrs" podStartSLOduration=2.645592533 podStartE2EDuration="6.214913633s" podCreationTimestamp="2026-03-18 10:32:07 +0000 UTC" firstStartedPulling="2026-03-18 10:32:09.122009163 +0000 UTC m=+5395.696754003" lastFinishedPulling="2026-03-18 10:32:12.691330263 +0000 UTC m=+5399.266075103" 
observedRunningTime="2026-03-18 10:32:13.202581628 +0000 UTC m=+5399.777326508" watchObservedRunningTime="2026-03-18 10:32:13.214913633 +0000 UTC m=+5399.789658483" Mar 18 10:32:17 crc kubenswrapper[4778]: I0318 10:32:17.440388 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:17 crc kubenswrapper[4778]: I0318 10:32:17.440999 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:18 crc kubenswrapper[4778]: I0318 10:32:18.513301 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-2xqrs" podUID="85bd1932-7797-4e49-9b0c-a67b5176e03b" containerName="registry-server" probeResult="failure" output=< Mar 18 10:32:18 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 10:32:18 crc kubenswrapper[4778]: > Mar 18 10:32:21 crc kubenswrapper[4778]: I0318 10:32:21.186842 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:32:21 crc kubenswrapper[4778]: E0318 10:32:21.187657 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:32:27 crc kubenswrapper[4778]: I0318 10:32:27.489015 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:27 crc kubenswrapper[4778]: I0318 10:32:27.541946 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:27 crc kubenswrapper[4778]: I0318 10:32:27.741508 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2xqrs"] Mar 18 10:32:29 crc kubenswrapper[4778]: I0318 10:32:29.331552 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2xqrs" podUID="85bd1932-7797-4e49-9b0c-a67b5176e03b" containerName="registry-server" containerID="cri-o://0bc612bd31fa753f8d2f562fbc9812f902bbf84c5c3fafa32c598ca1eddc2ac0" gracePeriod=2 Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.345004 4778 generic.go:334] "Generic (PLEG): container finished" podID="85bd1932-7797-4e49-9b0c-a67b5176e03b" containerID="0bc612bd31fa753f8d2f562fbc9812f902bbf84c5c3fafa32c598ca1eddc2ac0" exitCode=0 Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.345104 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xqrs" event={"ID":"85bd1932-7797-4e49-9b0c-a67b5176e03b","Type":"ContainerDied","Data":"0bc612bd31fa753f8d2f562fbc9812f902bbf84c5c3fafa32c598ca1eddc2ac0"} Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.346275 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xqrs" event={"ID":"85bd1932-7797-4e49-9b0c-a67b5176e03b","Type":"ContainerDied","Data":"b3b749541cd7e8e5266bf86b325dca76bccc806590445c6a2bf5c909ceffb2a2"} Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.346414 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3b749541cd7e8e5266bf86b325dca76bccc806590445c6a2bf5c909ceffb2a2" Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.427347 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.449435 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lshv6\" (UniqueName: \"kubernetes.io/projected/85bd1932-7797-4e49-9b0c-a67b5176e03b-kube-api-access-lshv6\") pod \"85bd1932-7797-4e49-9b0c-a67b5176e03b\" (UID: \"85bd1932-7797-4e49-9b0c-a67b5176e03b\") " Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.449799 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85bd1932-7797-4e49-9b0c-a67b5176e03b-utilities\") pod \"85bd1932-7797-4e49-9b0c-a67b5176e03b\" (UID: \"85bd1932-7797-4e49-9b0c-a67b5176e03b\") " Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.449947 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85bd1932-7797-4e49-9b0c-a67b5176e03b-catalog-content\") pod \"85bd1932-7797-4e49-9b0c-a67b5176e03b\" (UID: \"85bd1932-7797-4e49-9b0c-a67b5176e03b\") " Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.450799 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85bd1932-7797-4e49-9b0c-a67b5176e03b-utilities" (OuterVolumeSpecName: "utilities") pod "85bd1932-7797-4e49-9b0c-a67b5176e03b" (UID: "85bd1932-7797-4e49-9b0c-a67b5176e03b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.451055 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85bd1932-7797-4e49-9b0c-a67b5176e03b-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.455437 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85bd1932-7797-4e49-9b0c-a67b5176e03b-kube-api-access-lshv6" (OuterVolumeSpecName: "kube-api-access-lshv6") pod "85bd1932-7797-4e49-9b0c-a67b5176e03b" (UID: "85bd1932-7797-4e49-9b0c-a67b5176e03b"). InnerVolumeSpecName "kube-api-access-lshv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.512514 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85bd1932-7797-4e49-9b0c-a67b5176e03b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85bd1932-7797-4e49-9b0c-a67b5176e03b" (UID: "85bd1932-7797-4e49-9b0c-a67b5176e03b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.553031 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lshv6\" (UniqueName: \"kubernetes.io/projected/85bd1932-7797-4e49-9b0c-a67b5176e03b-kube-api-access-lshv6\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:30 crc kubenswrapper[4778]: I0318 10:32:30.553078 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85bd1932-7797-4e49-9b0c-a67b5176e03b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:31 crc kubenswrapper[4778]: I0318 10:32:31.352877 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2xqrs" Mar 18 10:32:31 crc kubenswrapper[4778]: I0318 10:32:31.393021 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2xqrs"] Mar 18 10:32:31 crc kubenswrapper[4778]: I0318 10:32:31.403453 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2xqrs"] Mar 18 10:32:32 crc kubenswrapper[4778]: I0318 10:32:32.197177 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85bd1932-7797-4e49-9b0c-a67b5176e03b" path="/var/lib/kubelet/pods/85bd1932-7797-4e49-9b0c-a67b5176e03b/volumes" Mar 18 10:32:34 crc kubenswrapper[4778]: I0318 10:32:34.203560 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:32:35 crc kubenswrapper[4778]: I0318 10:32:35.386022 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"81ef1098ba06a49fc09cc93fdfc6d75d1ab7de8afe5ec57fbd67f4e799fb737b"} Mar 18 10:32:35 crc kubenswrapper[4778]: I0318 10:32:35.730426 4778 scope.go:117] "RemoveContainer" containerID="53ff744bef01ec78e09ff6d04137bcd8e4bacfa9ad131e28bbc23695a24879a8" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.707159 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6h72t"] Mar 18 10:33:36 crc kubenswrapper[4778]: E0318 10:33:36.709036 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85bd1932-7797-4e49-9b0c-a67b5176e03b" containerName="extract-content" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.709113 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="85bd1932-7797-4e49-9b0c-a67b5176e03b" containerName="extract-content" Mar 18 10:33:36 crc kubenswrapper[4778]: E0318 
10:33:36.709185 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85bd1932-7797-4e49-9b0c-a67b5176e03b" containerName="registry-server" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.709271 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="85bd1932-7797-4e49-9b0c-a67b5176e03b" containerName="registry-server" Mar 18 10:33:36 crc kubenswrapper[4778]: E0318 10:33:36.709344 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85bd1932-7797-4e49-9b0c-a67b5176e03b" containerName="extract-utilities" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.709399 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="85bd1932-7797-4e49-9b0c-a67b5176e03b" containerName="extract-utilities" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.709664 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="85bd1932-7797-4e49-9b0c-a67b5176e03b" containerName="registry-server" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.710984 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.728315 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6h72t"] Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.837586 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/035e3898-3c1a-459c-9fee-a9e16ce10874-catalog-content\") pod \"certified-operators-6h72t\" (UID: \"035e3898-3c1a-459c-9fee-a9e16ce10874\") " pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.837654 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29ngl\" (UniqueName: \"kubernetes.io/projected/035e3898-3c1a-459c-9fee-a9e16ce10874-kube-api-access-29ngl\") pod \"certified-operators-6h72t\" (UID: \"035e3898-3c1a-459c-9fee-a9e16ce10874\") " pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.837848 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/035e3898-3c1a-459c-9fee-a9e16ce10874-utilities\") pod \"certified-operators-6h72t\" (UID: \"035e3898-3c1a-459c-9fee-a9e16ce10874\") " pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.939469 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/035e3898-3c1a-459c-9fee-a9e16ce10874-utilities\") pod \"certified-operators-6h72t\" (UID: \"035e3898-3c1a-459c-9fee-a9e16ce10874\") " pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.939547 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/035e3898-3c1a-459c-9fee-a9e16ce10874-catalog-content\") pod \"certified-operators-6h72t\" (UID: \"035e3898-3c1a-459c-9fee-a9e16ce10874\") " pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.939581 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29ngl\" (UniqueName: \"kubernetes.io/projected/035e3898-3c1a-459c-9fee-a9e16ce10874-kube-api-access-29ngl\") pod \"certified-operators-6h72t\" (UID: \"035e3898-3c1a-459c-9fee-a9e16ce10874\") " pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.940010 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/035e3898-3c1a-459c-9fee-a9e16ce10874-utilities\") pod \"certified-operators-6h72t\" (UID: \"035e3898-3c1a-459c-9fee-a9e16ce10874\") " pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.940084 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/035e3898-3c1a-459c-9fee-a9e16ce10874-catalog-content\") pod \"certified-operators-6h72t\" (UID: \"035e3898-3c1a-459c-9fee-a9e16ce10874\") " pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:36 crc kubenswrapper[4778]: I0318 10:33:36.962175 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29ngl\" (UniqueName: \"kubernetes.io/projected/035e3898-3c1a-459c-9fee-a9e16ce10874-kube-api-access-29ngl\") pod \"certified-operators-6h72t\" (UID: \"035e3898-3c1a-459c-9fee-a9e16ce10874\") " pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:37 crc kubenswrapper[4778]: I0318 10:33:37.033012 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:37 crc kubenswrapper[4778]: I0318 10:33:37.626187 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6h72t"] Mar 18 10:33:37 crc kubenswrapper[4778]: I0318 10:33:37.974420 4778 generic.go:334] "Generic (PLEG): container finished" podID="035e3898-3c1a-459c-9fee-a9e16ce10874" containerID="24b4de19c9270802ec9b2c94be5b8b6af7423c697a97447179aa33f3ee01212b" exitCode=0 Mar 18 10:33:37 crc kubenswrapper[4778]: I0318 10:33:37.974494 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6h72t" event={"ID":"035e3898-3c1a-459c-9fee-a9e16ce10874","Type":"ContainerDied","Data":"24b4de19c9270802ec9b2c94be5b8b6af7423c697a97447179aa33f3ee01212b"} Mar 18 10:33:37 crc kubenswrapper[4778]: I0318 10:33:37.974532 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6h72t" event={"ID":"035e3898-3c1a-459c-9fee-a9e16ce10874","Type":"ContainerStarted","Data":"fdd38a00e590f73ce57630781877646092f03c72baf6f4ae18abcb06af9c3f8c"} Mar 18 10:33:37 crc kubenswrapper[4778]: I0318 10:33:37.977385 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 10:33:38 crc kubenswrapper[4778]: I0318 10:33:38.984494 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6h72t" event={"ID":"035e3898-3c1a-459c-9fee-a9e16ce10874","Type":"ContainerStarted","Data":"eb24671469b62eb2f5f151a532add487d595c19e5428f2b47d0dae04d65eb610"} Mar 18 10:33:41 crc kubenswrapper[4778]: I0318 10:33:41.012335 4778 generic.go:334] "Generic (PLEG): container finished" podID="035e3898-3c1a-459c-9fee-a9e16ce10874" containerID="eb24671469b62eb2f5f151a532add487d595c19e5428f2b47d0dae04d65eb610" exitCode=0 Mar 18 10:33:41 crc kubenswrapper[4778]: I0318 10:33:41.012454 4778 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-6h72t" event={"ID":"035e3898-3c1a-459c-9fee-a9e16ce10874","Type":"ContainerDied","Data":"eb24671469b62eb2f5f151a532add487d595c19e5428f2b47d0dae04d65eb610"} Mar 18 10:33:43 crc kubenswrapper[4778]: I0318 10:33:43.034851 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6h72t" event={"ID":"035e3898-3c1a-459c-9fee-a9e16ce10874","Type":"ContainerStarted","Data":"6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0"} Mar 18 10:33:43 crc kubenswrapper[4778]: I0318 10:33:43.053654 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6h72t" podStartSLOduration=3.000627977 podStartE2EDuration="7.053635076s" podCreationTimestamp="2026-03-18 10:33:36 +0000 UTC" firstStartedPulling="2026-03-18 10:33:37.977075228 +0000 UTC m=+5484.551820068" lastFinishedPulling="2026-03-18 10:33:42.030082327 +0000 UTC m=+5488.604827167" observedRunningTime="2026-03-18 10:33:43.052605628 +0000 UTC m=+5489.627350498" watchObservedRunningTime="2026-03-18 10:33:43.053635076 +0000 UTC m=+5489.628379926" Mar 18 10:33:47 crc kubenswrapper[4778]: I0318 10:33:47.033254 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:47 crc kubenswrapper[4778]: I0318 10:33:47.033902 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:47 crc kubenswrapper[4778]: I0318 10:33:47.097295 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:47 crc kubenswrapper[4778]: I0318 10:33:47.157593 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:47 crc kubenswrapper[4778]: I0318 
10:33:47.698313 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6h72t"] Mar 18 10:33:49 crc kubenswrapper[4778]: I0318 10:33:49.091350 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6h72t" podUID="035e3898-3c1a-459c-9fee-a9e16ce10874" containerName="registry-server" containerID="cri-o://6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0" gracePeriod=2 Mar 18 10:33:49 crc kubenswrapper[4778]: I0318 10:33:49.693401 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:49 crc kubenswrapper[4778]: I0318 10:33:49.807533 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/035e3898-3c1a-459c-9fee-a9e16ce10874-utilities\") pod \"035e3898-3c1a-459c-9fee-a9e16ce10874\" (UID: \"035e3898-3c1a-459c-9fee-a9e16ce10874\") " Mar 18 10:33:49 crc kubenswrapper[4778]: I0318 10:33:49.807600 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29ngl\" (UniqueName: \"kubernetes.io/projected/035e3898-3c1a-459c-9fee-a9e16ce10874-kube-api-access-29ngl\") pod \"035e3898-3c1a-459c-9fee-a9e16ce10874\" (UID: \"035e3898-3c1a-459c-9fee-a9e16ce10874\") " Mar 18 10:33:49 crc kubenswrapper[4778]: I0318 10:33:49.807627 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/035e3898-3c1a-459c-9fee-a9e16ce10874-catalog-content\") pod \"035e3898-3c1a-459c-9fee-a9e16ce10874\" (UID: \"035e3898-3c1a-459c-9fee-a9e16ce10874\") " Mar 18 10:33:49 crc kubenswrapper[4778]: I0318 10:33:49.808445 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/035e3898-3c1a-459c-9fee-a9e16ce10874-utilities" (OuterVolumeSpecName: 
"utilities") pod "035e3898-3c1a-459c-9fee-a9e16ce10874" (UID: "035e3898-3c1a-459c-9fee-a9e16ce10874"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:33:49 crc kubenswrapper[4778]: I0318 10:33:49.816055 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/035e3898-3c1a-459c-9fee-a9e16ce10874-kube-api-access-29ngl" (OuterVolumeSpecName: "kube-api-access-29ngl") pod "035e3898-3c1a-459c-9fee-a9e16ce10874" (UID: "035e3898-3c1a-459c-9fee-a9e16ce10874"). InnerVolumeSpecName "kube-api-access-29ngl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:33:49 crc kubenswrapper[4778]: I0318 10:33:49.858565 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/035e3898-3c1a-459c-9fee-a9e16ce10874-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "035e3898-3c1a-459c-9fee-a9e16ce10874" (UID: "035e3898-3c1a-459c-9fee-a9e16ce10874"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:33:49 crc kubenswrapper[4778]: I0318 10:33:49.909553 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/035e3898-3c1a-459c-9fee-a9e16ce10874-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:33:49 crc kubenswrapper[4778]: I0318 10:33:49.909597 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29ngl\" (UniqueName: \"kubernetes.io/projected/035e3898-3c1a-459c-9fee-a9e16ce10874-kube-api-access-29ngl\") on node \"crc\" DevicePath \"\"" Mar 18 10:33:49 crc kubenswrapper[4778]: I0318 10:33:49.909612 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/035e3898-3c1a-459c-9fee-a9e16ce10874-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.106160 4778 generic.go:334] "Generic (PLEG): container finished" podID="035e3898-3c1a-459c-9fee-a9e16ce10874" containerID="6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0" exitCode=0 Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.106255 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6h72t" Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.106264 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6h72t" event={"ID":"035e3898-3c1a-459c-9fee-a9e16ce10874","Type":"ContainerDied","Data":"6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0"} Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.111425 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6h72t" event={"ID":"035e3898-3c1a-459c-9fee-a9e16ce10874","Type":"ContainerDied","Data":"fdd38a00e590f73ce57630781877646092f03c72baf6f4ae18abcb06af9c3f8c"} Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.111462 4778 scope.go:117] "RemoveContainer" containerID="6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0" Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.147722 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6h72t"] Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.148478 4778 scope.go:117] "RemoveContainer" containerID="eb24671469b62eb2f5f151a532add487d595c19e5428f2b47d0dae04d65eb610" Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.160167 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6h72t"] Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.173974 4778 scope.go:117] "RemoveContainer" containerID="24b4de19c9270802ec9b2c94be5b8b6af7423c697a97447179aa33f3ee01212b" Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.199994 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="035e3898-3c1a-459c-9fee-a9e16ce10874" path="/var/lib/kubelet/pods/035e3898-3c1a-459c-9fee-a9e16ce10874/volumes" Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.220930 4778 scope.go:117] "RemoveContainer" 
containerID="6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0" Mar 18 10:33:50 crc kubenswrapper[4778]: E0318 10:33:50.221493 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0\": container with ID starting with 6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0 not found: ID does not exist" containerID="6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0" Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.221546 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0"} err="failed to get container status \"6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0\": rpc error: code = NotFound desc = could not find container \"6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0\": container with ID starting with 6e2c32cdb3e311d4ae17e4ba41a2cc6e33277a94c43e6cb73dee768c6be6c3f0 not found: ID does not exist" Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.221577 4778 scope.go:117] "RemoveContainer" containerID="eb24671469b62eb2f5f151a532add487d595c19e5428f2b47d0dae04d65eb610" Mar 18 10:33:50 crc kubenswrapper[4778]: E0318 10:33:50.222129 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb24671469b62eb2f5f151a532add487d595c19e5428f2b47d0dae04d65eb610\": container with ID starting with eb24671469b62eb2f5f151a532add487d595c19e5428f2b47d0dae04d65eb610 not found: ID does not exist" containerID="eb24671469b62eb2f5f151a532add487d595c19e5428f2b47d0dae04d65eb610" Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.222231 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"eb24671469b62eb2f5f151a532add487d595c19e5428f2b47d0dae04d65eb610"} err="failed to get container status \"eb24671469b62eb2f5f151a532add487d595c19e5428f2b47d0dae04d65eb610\": rpc error: code = NotFound desc = could not find container \"eb24671469b62eb2f5f151a532add487d595c19e5428f2b47d0dae04d65eb610\": container with ID starting with eb24671469b62eb2f5f151a532add487d595c19e5428f2b47d0dae04d65eb610 not found: ID does not exist" Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.222277 4778 scope.go:117] "RemoveContainer" containerID="24b4de19c9270802ec9b2c94be5b8b6af7423c697a97447179aa33f3ee01212b" Mar 18 10:33:50 crc kubenswrapper[4778]: E0318 10:33:50.222599 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24b4de19c9270802ec9b2c94be5b8b6af7423c697a97447179aa33f3ee01212b\": container with ID starting with 24b4de19c9270802ec9b2c94be5b8b6af7423c697a97447179aa33f3ee01212b not found: ID does not exist" containerID="24b4de19c9270802ec9b2c94be5b8b6af7423c697a97447179aa33f3ee01212b" Mar 18 10:33:50 crc kubenswrapper[4778]: I0318 10:33:50.222650 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24b4de19c9270802ec9b2c94be5b8b6af7423c697a97447179aa33f3ee01212b"} err="failed to get container status \"24b4de19c9270802ec9b2c94be5b8b6af7423c697a97447179aa33f3ee01212b\": rpc error: code = NotFound desc = could not find container \"24b4de19c9270802ec9b2c94be5b8b6af7423c697a97447179aa33f3ee01212b\": container with ID starting with 24b4de19c9270802ec9b2c94be5b8b6af7423c697a97447179aa33f3ee01212b not found: ID does not exist" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.167460 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563834-6gcpp"] Mar 18 10:34:00 crc kubenswrapper[4778]: E0318 10:34:00.168528 4778 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="035e3898-3c1a-459c-9fee-a9e16ce10874" containerName="extract-utilities" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.168543 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="035e3898-3c1a-459c-9fee-a9e16ce10874" containerName="extract-utilities" Mar 18 10:34:00 crc kubenswrapper[4778]: E0318 10:34:00.168574 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="035e3898-3c1a-459c-9fee-a9e16ce10874" containerName="registry-server" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.168584 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="035e3898-3c1a-459c-9fee-a9e16ce10874" containerName="registry-server" Mar 18 10:34:00 crc kubenswrapper[4778]: E0318 10:34:00.168627 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="035e3898-3c1a-459c-9fee-a9e16ce10874" containerName="extract-content" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.168637 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="035e3898-3c1a-459c-9fee-a9e16ce10874" containerName="extract-content" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.168893 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="035e3898-3c1a-459c-9fee-a9e16ce10874" containerName="registry-server" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.169719 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563834-6gcpp" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.188106 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.188807 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.189000 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.213239 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563834-6gcpp"] Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.325004 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twbl7\" (UniqueName: \"kubernetes.io/projected/03c15483-f10b-4441-8d24-bb2bee9b47d3-kube-api-access-twbl7\") pod \"auto-csr-approver-29563834-6gcpp\" (UID: \"03c15483-f10b-4441-8d24-bb2bee9b47d3\") " pod="openshift-infra/auto-csr-approver-29563834-6gcpp" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.426986 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twbl7\" (UniqueName: \"kubernetes.io/projected/03c15483-f10b-4441-8d24-bb2bee9b47d3-kube-api-access-twbl7\") pod \"auto-csr-approver-29563834-6gcpp\" (UID: \"03c15483-f10b-4441-8d24-bb2bee9b47d3\") " pod="openshift-infra/auto-csr-approver-29563834-6gcpp" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.450105 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twbl7\" (UniqueName: \"kubernetes.io/projected/03c15483-f10b-4441-8d24-bb2bee9b47d3-kube-api-access-twbl7\") pod \"auto-csr-approver-29563834-6gcpp\" (UID: \"03c15483-f10b-4441-8d24-bb2bee9b47d3\") " 
pod="openshift-infra/auto-csr-approver-29563834-6gcpp" Mar 18 10:34:00 crc kubenswrapper[4778]: I0318 10:34:00.509319 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563834-6gcpp" Mar 18 10:34:01 crc kubenswrapper[4778]: I0318 10:34:01.010671 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563834-6gcpp"] Mar 18 10:34:01 crc kubenswrapper[4778]: I0318 10:34:01.225707 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563834-6gcpp" event={"ID":"03c15483-f10b-4441-8d24-bb2bee9b47d3","Type":"ContainerStarted","Data":"81c0dfe845c2e5038e0c59164d4e4e76a0f4a68e40905d577b613e6ea445a2ba"} Mar 18 10:34:03 crc kubenswrapper[4778]: I0318 10:34:03.244041 4778 generic.go:334] "Generic (PLEG): container finished" podID="03c15483-f10b-4441-8d24-bb2bee9b47d3" containerID="af6dff43614ee6bd06dbb1cfdbe51bab1c623028ca6850357300ea8b7c6fb33a" exitCode=0 Mar 18 10:34:03 crc kubenswrapper[4778]: I0318 10:34:03.244096 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563834-6gcpp" event={"ID":"03c15483-f10b-4441-8d24-bb2bee9b47d3","Type":"ContainerDied","Data":"af6dff43614ee6bd06dbb1cfdbe51bab1c623028ca6850357300ea8b7c6fb33a"} Mar 18 10:34:04 crc kubenswrapper[4778]: I0318 10:34:04.722507 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563834-6gcpp" Mar 18 10:34:04 crc kubenswrapper[4778]: I0318 10:34:04.820445 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twbl7\" (UniqueName: \"kubernetes.io/projected/03c15483-f10b-4441-8d24-bb2bee9b47d3-kube-api-access-twbl7\") pod \"03c15483-f10b-4441-8d24-bb2bee9b47d3\" (UID: \"03c15483-f10b-4441-8d24-bb2bee9b47d3\") " Mar 18 10:34:04 crc kubenswrapper[4778]: I0318 10:34:04.827998 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03c15483-f10b-4441-8d24-bb2bee9b47d3-kube-api-access-twbl7" (OuterVolumeSpecName: "kube-api-access-twbl7") pod "03c15483-f10b-4441-8d24-bb2bee9b47d3" (UID: "03c15483-f10b-4441-8d24-bb2bee9b47d3"). InnerVolumeSpecName "kube-api-access-twbl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:34:04 crc kubenswrapper[4778]: I0318 10:34:04.923908 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twbl7\" (UniqueName: \"kubernetes.io/projected/03c15483-f10b-4441-8d24-bb2bee9b47d3-kube-api-access-twbl7\") on node \"crc\" DevicePath \"\"" Mar 18 10:34:05 crc kubenswrapper[4778]: I0318 10:34:05.263844 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563834-6gcpp" event={"ID":"03c15483-f10b-4441-8d24-bb2bee9b47d3","Type":"ContainerDied","Data":"81c0dfe845c2e5038e0c59164d4e4e76a0f4a68e40905d577b613e6ea445a2ba"} Mar 18 10:34:05 crc kubenswrapper[4778]: I0318 10:34:05.263943 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81c0dfe845c2e5038e0c59164d4e4e76a0f4a68e40905d577b613e6ea445a2ba" Mar 18 10:34:05 crc kubenswrapper[4778]: I0318 10:34:05.263890 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563834-6gcpp" Mar 18 10:34:05 crc kubenswrapper[4778]: I0318 10:34:05.811181 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563828-bg8zw"] Mar 18 10:34:05 crc kubenswrapper[4778]: I0318 10:34:05.821136 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563828-bg8zw"] Mar 18 10:34:06 crc kubenswrapper[4778]: I0318 10:34:06.197545 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b507e48-f0a9-4938-ad91-298a6f90aad1" path="/var/lib/kubelet/pods/6b507e48-f0a9-4938-ad91-298a6f90aad1/volumes" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.609921 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zzj5b"] Mar 18 10:34:35 crc kubenswrapper[4778]: E0318 10:34:35.610845 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03c15483-f10b-4441-8d24-bb2bee9b47d3" containerName="oc" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.610859 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="03c15483-f10b-4441-8d24-bb2bee9b47d3" containerName="oc" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.611043 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="03c15483-f10b-4441-8d24-bb2bee9b47d3" containerName="oc" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.612363 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.632695 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zzj5b"] Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.747300 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/930e59dd-04aa-4030-b313-a1268b85ea06-utilities\") pod \"redhat-marketplace-zzj5b\" (UID: \"930e59dd-04aa-4030-b313-a1268b85ea06\") " pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.747411 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/930e59dd-04aa-4030-b313-a1268b85ea06-catalog-content\") pod \"redhat-marketplace-zzj5b\" (UID: \"930e59dd-04aa-4030-b313-a1268b85ea06\") " pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.747641 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvzkg\" (UniqueName: \"kubernetes.io/projected/930e59dd-04aa-4030-b313-a1268b85ea06-kube-api-access-lvzkg\") pod \"redhat-marketplace-zzj5b\" (UID: \"930e59dd-04aa-4030-b313-a1268b85ea06\") " pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.840125 4778 scope.go:117] "RemoveContainer" containerID="37abf2b00f0327218ef392055e69e5b8b65d6c2f27975137e66e86722fc34dea" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.850110 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvzkg\" (UniqueName: \"kubernetes.io/projected/930e59dd-04aa-4030-b313-a1268b85ea06-kube-api-access-lvzkg\") pod \"redhat-marketplace-zzj5b\" (UID: 
\"930e59dd-04aa-4030-b313-a1268b85ea06\") " pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.850265 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/930e59dd-04aa-4030-b313-a1268b85ea06-utilities\") pod \"redhat-marketplace-zzj5b\" (UID: \"930e59dd-04aa-4030-b313-a1268b85ea06\") " pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.850340 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/930e59dd-04aa-4030-b313-a1268b85ea06-catalog-content\") pod \"redhat-marketplace-zzj5b\" (UID: \"930e59dd-04aa-4030-b313-a1268b85ea06\") " pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.850906 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/930e59dd-04aa-4030-b313-a1268b85ea06-catalog-content\") pod \"redhat-marketplace-zzj5b\" (UID: \"930e59dd-04aa-4030-b313-a1268b85ea06\") " pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.851641 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/930e59dd-04aa-4030-b313-a1268b85ea06-utilities\") pod \"redhat-marketplace-zzj5b\" (UID: \"930e59dd-04aa-4030-b313-a1268b85ea06\") " pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.879037 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvzkg\" (UniqueName: \"kubernetes.io/projected/930e59dd-04aa-4030-b313-a1268b85ea06-kube-api-access-lvzkg\") pod \"redhat-marketplace-zzj5b\" (UID: \"930e59dd-04aa-4030-b313-a1268b85ea06\") " 
pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:35 crc kubenswrapper[4778]: I0318 10:34:35.941915 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:36 crc kubenswrapper[4778]: I0318 10:34:36.491477 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zzj5b"] Mar 18 10:34:36 crc kubenswrapper[4778]: I0318 10:34:36.544059 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzj5b" event={"ID":"930e59dd-04aa-4030-b313-a1268b85ea06","Type":"ContainerStarted","Data":"1d6fc8b17c50bba9a49271f449e1493e2b8a87fde1e532a527688e36d2719ffc"} Mar 18 10:34:37 crc kubenswrapper[4778]: I0318 10:34:37.554145 4778 generic.go:334] "Generic (PLEG): container finished" podID="930e59dd-04aa-4030-b313-a1268b85ea06" containerID="c0c64fba4a0e2ed3951b58a008a0be04598570dc31a214f2990bc2ad3080244c" exitCode=0 Mar 18 10:34:37 crc kubenswrapper[4778]: I0318 10:34:37.554184 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzj5b" event={"ID":"930e59dd-04aa-4030-b313-a1268b85ea06","Type":"ContainerDied","Data":"c0c64fba4a0e2ed3951b58a008a0be04598570dc31a214f2990bc2ad3080244c"} Mar 18 10:34:39 crc kubenswrapper[4778]: I0318 10:34:39.575794 4778 generic.go:334] "Generic (PLEG): container finished" podID="930e59dd-04aa-4030-b313-a1268b85ea06" containerID="9040a6c76d7562170178423b5dc262ba5426f98a692c4204229eafb48c6d07b6" exitCode=0 Mar 18 10:34:39 crc kubenswrapper[4778]: I0318 10:34:39.575901 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzj5b" event={"ID":"930e59dd-04aa-4030-b313-a1268b85ea06","Type":"ContainerDied","Data":"9040a6c76d7562170178423b5dc262ba5426f98a692c4204229eafb48c6d07b6"} Mar 18 10:34:40 crc kubenswrapper[4778]: I0318 10:34:40.586516 4778 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-zzj5b" event={"ID":"930e59dd-04aa-4030-b313-a1268b85ea06","Type":"ContainerStarted","Data":"f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b"} Mar 18 10:34:40 crc kubenswrapper[4778]: I0318 10:34:40.613867 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zzj5b" podStartSLOduration=2.796360323 podStartE2EDuration="5.613844764s" podCreationTimestamp="2026-03-18 10:34:35 +0000 UTC" firstStartedPulling="2026-03-18 10:34:37.556341644 +0000 UTC m=+5544.131086484" lastFinishedPulling="2026-03-18 10:34:40.373826085 +0000 UTC m=+5546.948570925" observedRunningTime="2026-03-18 10:34:40.605943939 +0000 UTC m=+5547.180688789" watchObservedRunningTime="2026-03-18 10:34:40.613844764 +0000 UTC m=+5547.188589614" Mar 18 10:34:45 crc kubenswrapper[4778]: I0318 10:34:45.942051 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:45 crc kubenswrapper[4778]: I0318 10:34:45.942568 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:46 crc kubenswrapper[4778]: I0318 10:34:46.011423 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:46 crc kubenswrapper[4778]: I0318 10:34:46.722615 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:46 crc kubenswrapper[4778]: I0318 10:34:46.770713 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zzj5b"] Mar 18 10:34:48 crc kubenswrapper[4778]: I0318 10:34:48.687705 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zzj5b" 
podUID="930e59dd-04aa-4030-b313-a1268b85ea06" containerName="registry-server" containerID="cri-o://f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b" gracePeriod=2 Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.281938 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.330157 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvzkg\" (UniqueName: \"kubernetes.io/projected/930e59dd-04aa-4030-b313-a1268b85ea06-kube-api-access-lvzkg\") pod \"930e59dd-04aa-4030-b313-a1268b85ea06\" (UID: \"930e59dd-04aa-4030-b313-a1268b85ea06\") " Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.330412 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/930e59dd-04aa-4030-b313-a1268b85ea06-utilities\") pod \"930e59dd-04aa-4030-b313-a1268b85ea06\" (UID: \"930e59dd-04aa-4030-b313-a1268b85ea06\") " Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.330459 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/930e59dd-04aa-4030-b313-a1268b85ea06-catalog-content\") pod \"930e59dd-04aa-4030-b313-a1268b85ea06\" (UID: \"930e59dd-04aa-4030-b313-a1268b85ea06\") " Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.332390 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/930e59dd-04aa-4030-b313-a1268b85ea06-utilities" (OuterVolumeSpecName: "utilities") pod "930e59dd-04aa-4030-b313-a1268b85ea06" (UID: "930e59dd-04aa-4030-b313-a1268b85ea06"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.335967 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/930e59dd-04aa-4030-b313-a1268b85ea06-kube-api-access-lvzkg" (OuterVolumeSpecName: "kube-api-access-lvzkg") pod "930e59dd-04aa-4030-b313-a1268b85ea06" (UID: "930e59dd-04aa-4030-b313-a1268b85ea06"). InnerVolumeSpecName "kube-api-access-lvzkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.360931 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/930e59dd-04aa-4030-b313-a1268b85ea06-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "930e59dd-04aa-4030-b313-a1268b85ea06" (UID: "930e59dd-04aa-4030-b313-a1268b85ea06"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.433249 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvzkg\" (UniqueName: \"kubernetes.io/projected/930e59dd-04aa-4030-b313-a1268b85ea06-kube-api-access-lvzkg\") on node \"crc\" DevicePath \"\"" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.433301 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/930e59dd-04aa-4030-b313-a1268b85ea06-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.433313 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/930e59dd-04aa-4030-b313-a1268b85ea06-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.696370 4778 generic.go:334] "Generic (PLEG): container finished" podID="930e59dd-04aa-4030-b313-a1268b85ea06" 
containerID="f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b" exitCode=0 Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.696434 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zzj5b" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.696452 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzj5b" event={"ID":"930e59dd-04aa-4030-b313-a1268b85ea06","Type":"ContainerDied","Data":"f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b"} Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.697564 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzj5b" event={"ID":"930e59dd-04aa-4030-b313-a1268b85ea06","Type":"ContainerDied","Data":"1d6fc8b17c50bba9a49271f449e1493e2b8a87fde1e532a527688e36d2719ffc"} Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.697594 4778 scope.go:117] "RemoveContainer" containerID="f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.722502 4778 scope.go:117] "RemoveContainer" containerID="9040a6c76d7562170178423b5dc262ba5426f98a692c4204229eafb48c6d07b6" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.736882 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zzj5b"] Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.749974 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zzj5b"] Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.759303 4778 scope.go:117] "RemoveContainer" containerID="c0c64fba4a0e2ed3951b58a008a0be04598570dc31a214f2990bc2ad3080244c" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.787847 4778 scope.go:117] "RemoveContainer" containerID="f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b" Mar 18 
10:34:49 crc kubenswrapper[4778]: E0318 10:34:49.788278 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b\": container with ID starting with f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b not found: ID does not exist" containerID="f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.788308 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b"} err="failed to get container status \"f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b\": rpc error: code = NotFound desc = could not find container \"f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b\": container with ID starting with f1cc46f3610ef40452257731d946b2964fbbc3242fa58248e2820f0e236cab9b not found: ID does not exist" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.788328 4778 scope.go:117] "RemoveContainer" containerID="9040a6c76d7562170178423b5dc262ba5426f98a692c4204229eafb48c6d07b6" Mar 18 10:34:49 crc kubenswrapper[4778]: E0318 10:34:49.788690 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9040a6c76d7562170178423b5dc262ba5426f98a692c4204229eafb48c6d07b6\": container with ID starting with 9040a6c76d7562170178423b5dc262ba5426f98a692c4204229eafb48c6d07b6 not found: ID does not exist" containerID="9040a6c76d7562170178423b5dc262ba5426f98a692c4204229eafb48c6d07b6" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.788714 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9040a6c76d7562170178423b5dc262ba5426f98a692c4204229eafb48c6d07b6"} err="failed to get container status 
\"9040a6c76d7562170178423b5dc262ba5426f98a692c4204229eafb48c6d07b6\": rpc error: code = NotFound desc = could not find container \"9040a6c76d7562170178423b5dc262ba5426f98a692c4204229eafb48c6d07b6\": container with ID starting with 9040a6c76d7562170178423b5dc262ba5426f98a692c4204229eafb48c6d07b6 not found: ID does not exist" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.788730 4778 scope.go:117] "RemoveContainer" containerID="c0c64fba4a0e2ed3951b58a008a0be04598570dc31a214f2990bc2ad3080244c" Mar 18 10:34:49 crc kubenswrapper[4778]: E0318 10:34:49.788950 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0c64fba4a0e2ed3951b58a008a0be04598570dc31a214f2990bc2ad3080244c\": container with ID starting with c0c64fba4a0e2ed3951b58a008a0be04598570dc31a214f2990bc2ad3080244c not found: ID does not exist" containerID="c0c64fba4a0e2ed3951b58a008a0be04598570dc31a214f2990bc2ad3080244c" Mar 18 10:34:49 crc kubenswrapper[4778]: I0318 10:34:49.788970 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c64fba4a0e2ed3951b58a008a0be04598570dc31a214f2990bc2ad3080244c"} err="failed to get container status \"c0c64fba4a0e2ed3951b58a008a0be04598570dc31a214f2990bc2ad3080244c\": rpc error: code = NotFound desc = could not find container \"c0c64fba4a0e2ed3951b58a008a0be04598570dc31a214f2990bc2ad3080244c\": container with ID starting with c0c64fba4a0e2ed3951b58a008a0be04598570dc31a214f2990bc2ad3080244c not found: ID does not exist" Mar 18 10:34:50 crc kubenswrapper[4778]: I0318 10:34:50.197811 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="930e59dd-04aa-4030-b313-a1268b85ea06" path="/var/lib/kubelet/pods/930e59dd-04aa-4030-b313-a1268b85ea06/volumes" Mar 18 10:35:00 crc kubenswrapper[4778]: I0318 10:35:00.147642 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:35:00 crc kubenswrapper[4778]: I0318 10:35:00.149549 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:35:30 crc kubenswrapper[4778]: I0318 10:35:30.147665 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:35:30 crc kubenswrapper[4778]: I0318 10:35:30.148549 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.147705 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.149361 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.149480 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.150588 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"81ef1098ba06a49fc09cc93fdfc6d75d1ab7de8afe5ec57fbd67f4e799fb737b"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.151011 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://81ef1098ba06a49fc09cc93fdfc6d75d1ab7de8afe5ec57fbd67f4e799fb737b" gracePeriod=600 Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.176797 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563836-vj8r7"] Mar 18 10:36:00 crc kubenswrapper[4778]: E0318 10:36:00.177189 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="930e59dd-04aa-4030-b313-a1268b85ea06" containerName="extract-utilities" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.177223 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="930e59dd-04aa-4030-b313-a1268b85ea06" containerName="extract-utilities" Mar 18 10:36:00 crc kubenswrapper[4778]: E0318 10:36:00.177232 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="930e59dd-04aa-4030-b313-a1268b85ea06" containerName="extract-content" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.177239 4778 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="930e59dd-04aa-4030-b313-a1268b85ea06" containerName="extract-content" Mar 18 10:36:00 crc kubenswrapper[4778]: E0318 10:36:00.177248 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="930e59dd-04aa-4030-b313-a1268b85ea06" containerName="registry-server" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.177255 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="930e59dd-04aa-4030-b313-a1268b85ea06" containerName="registry-server" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.177476 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="930e59dd-04aa-4030-b313-a1268b85ea06" containerName="registry-server" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.178065 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563836-vj8r7" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.186884 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.187117 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.191941 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.209288 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563836-vj8r7"] Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.348158 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmkl4\" (UniqueName: \"kubernetes.io/projected/e2bea522-9825-4193-9fb8-6592bcc1e2c8-kube-api-access-rmkl4\") pod \"auto-csr-approver-29563836-vj8r7\" (UID: \"e2bea522-9825-4193-9fb8-6592bcc1e2c8\") " 
pod="openshift-infra/auto-csr-approver-29563836-vj8r7" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.413495 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="81ef1098ba06a49fc09cc93fdfc6d75d1ab7de8afe5ec57fbd67f4e799fb737b" exitCode=0 Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.413585 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"81ef1098ba06a49fc09cc93fdfc6d75d1ab7de8afe5ec57fbd67f4e799fb737b"} Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.413636 4778 scope.go:117] "RemoveContainer" containerID="aa616d11eaf28ac2c986ff88856788be076b7a2e9e5daff9a6df366b33eacc36" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.450570 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmkl4\" (UniqueName: \"kubernetes.io/projected/e2bea522-9825-4193-9fb8-6592bcc1e2c8-kube-api-access-rmkl4\") pod \"auto-csr-approver-29563836-vj8r7\" (UID: \"e2bea522-9825-4193-9fb8-6592bcc1e2c8\") " pod="openshift-infra/auto-csr-approver-29563836-vj8r7" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.473225 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmkl4\" (UniqueName: \"kubernetes.io/projected/e2bea522-9825-4193-9fb8-6592bcc1e2c8-kube-api-access-rmkl4\") pod \"auto-csr-approver-29563836-vj8r7\" (UID: \"e2bea522-9825-4193-9fb8-6592bcc1e2c8\") " pod="openshift-infra/auto-csr-approver-29563836-vj8r7" Mar 18 10:36:00 crc kubenswrapper[4778]: I0318 10:36:00.512268 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563836-vj8r7" Mar 18 10:36:01 crc kubenswrapper[4778]: I0318 10:36:01.001457 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563836-vj8r7"] Mar 18 10:36:01 crc kubenswrapper[4778]: W0318 10:36:01.010279 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2bea522_9825_4193_9fb8_6592bcc1e2c8.slice/crio-fcf5a420915901661b1c58125c78978528b2f6cc296d9385b6bc42cb835e12fa WatchSource:0}: Error finding container fcf5a420915901661b1c58125c78978528b2f6cc296d9385b6bc42cb835e12fa: Status 404 returned error can't find the container with id fcf5a420915901661b1c58125c78978528b2f6cc296d9385b6bc42cb835e12fa Mar 18 10:36:01 crc kubenswrapper[4778]: I0318 10:36:01.424423 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5"} Mar 18 10:36:01 crc kubenswrapper[4778]: I0318 10:36:01.425871 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563836-vj8r7" event={"ID":"e2bea522-9825-4193-9fb8-6592bcc1e2c8","Type":"ContainerStarted","Data":"fcf5a420915901661b1c58125c78978528b2f6cc296d9385b6bc42cb835e12fa"} Mar 18 10:36:02 crc kubenswrapper[4778]: I0318 10:36:02.437307 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563836-vj8r7" event={"ID":"e2bea522-9825-4193-9fb8-6592bcc1e2c8","Type":"ContainerStarted","Data":"927eb69f67f4add2a332f661db657c63610291a16f8bad5b3ab6b4321fca9e64"} Mar 18 10:36:03 crc kubenswrapper[4778]: I0318 10:36:03.447517 4778 generic.go:334] "Generic (PLEG): container finished" podID="e2bea522-9825-4193-9fb8-6592bcc1e2c8" 
containerID="927eb69f67f4add2a332f661db657c63610291a16f8bad5b3ab6b4321fca9e64" exitCode=0 Mar 18 10:36:03 crc kubenswrapper[4778]: I0318 10:36:03.447664 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563836-vj8r7" event={"ID":"e2bea522-9825-4193-9fb8-6592bcc1e2c8","Type":"ContainerDied","Data":"927eb69f67f4add2a332f661db657c63610291a16f8bad5b3ab6b4321fca9e64"} Mar 18 10:36:04 crc kubenswrapper[4778]: I0318 10:36:04.959443 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563836-vj8r7" Mar 18 10:36:05 crc kubenswrapper[4778]: I0318 10:36:05.154949 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmkl4\" (UniqueName: \"kubernetes.io/projected/e2bea522-9825-4193-9fb8-6592bcc1e2c8-kube-api-access-rmkl4\") pod \"e2bea522-9825-4193-9fb8-6592bcc1e2c8\" (UID: \"e2bea522-9825-4193-9fb8-6592bcc1e2c8\") " Mar 18 10:36:05 crc kubenswrapper[4778]: I0318 10:36:05.169611 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2bea522-9825-4193-9fb8-6592bcc1e2c8-kube-api-access-rmkl4" (OuterVolumeSpecName: "kube-api-access-rmkl4") pod "e2bea522-9825-4193-9fb8-6592bcc1e2c8" (UID: "e2bea522-9825-4193-9fb8-6592bcc1e2c8"). InnerVolumeSpecName "kube-api-access-rmkl4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:36:05 crc kubenswrapper[4778]: I0318 10:36:05.257927 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmkl4\" (UniqueName: \"kubernetes.io/projected/e2bea522-9825-4193-9fb8-6592bcc1e2c8-kube-api-access-rmkl4\") on node \"crc\" DevicePath \"\"" Mar 18 10:36:05 crc kubenswrapper[4778]: I0318 10:36:05.485329 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563836-vj8r7" event={"ID":"e2bea522-9825-4193-9fb8-6592bcc1e2c8","Type":"ContainerDied","Data":"fcf5a420915901661b1c58125c78978528b2f6cc296d9385b6bc42cb835e12fa"} Mar 18 10:36:05 crc kubenswrapper[4778]: I0318 10:36:05.485716 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcf5a420915901661b1c58125c78978528b2f6cc296d9385b6bc42cb835e12fa" Mar 18 10:36:05 crc kubenswrapper[4778]: I0318 10:36:05.485499 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563836-vj8r7" Mar 18 10:36:05 crc kubenswrapper[4778]: I0318 10:36:05.555921 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563830-cmbbv"] Mar 18 10:36:05 crc kubenswrapper[4778]: I0318 10:36:05.571098 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563830-cmbbv"] Mar 18 10:36:06 crc kubenswrapper[4778]: I0318 10:36:06.201621 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="020a579d-1395-4039-8c3a-7454709e9af6" path="/var/lib/kubelet/pods/020a579d-1395-4039-8c3a-7454709e9af6/volumes" Mar 18 10:36:36 crc kubenswrapper[4778]: I0318 10:36:36.006235 4778 scope.go:117] "RemoveContainer" containerID="e298907ce2b631ab1e3060efd7429ad70a3f2d93551c33b2a088ad16a12f01ae" Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.148313 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.149286 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.183252 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563838-xhh59"] Mar 18 10:38:00 crc kubenswrapper[4778]: E0318 10:38:00.183937 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bea522-9825-4193-9fb8-6592bcc1e2c8" containerName="oc" Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.183965 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bea522-9825-4193-9fb8-6592bcc1e2c8" containerName="oc" Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.184284 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bea522-9825-4193-9fb8-6592bcc1e2c8" containerName="oc" Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.185382 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563838-xhh59" Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.188785 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.188848 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.188985 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.204806 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563838-xhh59"] Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.230834 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7njtv\" (UniqueName: \"kubernetes.io/projected/06d0e7b4-0fff-4364-bef0-a408acdbcdbb-kube-api-access-7njtv\") pod \"auto-csr-approver-29563838-xhh59\" (UID: \"06d0e7b4-0fff-4364-bef0-a408acdbcdbb\") " pod="openshift-infra/auto-csr-approver-29563838-xhh59" Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.332966 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7njtv\" (UniqueName: \"kubernetes.io/projected/06d0e7b4-0fff-4364-bef0-a408acdbcdbb-kube-api-access-7njtv\") pod \"auto-csr-approver-29563838-xhh59\" (UID: \"06d0e7b4-0fff-4364-bef0-a408acdbcdbb\") " pod="openshift-infra/auto-csr-approver-29563838-xhh59" Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.363821 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7njtv\" (UniqueName: \"kubernetes.io/projected/06d0e7b4-0fff-4364-bef0-a408acdbcdbb-kube-api-access-7njtv\") pod \"auto-csr-approver-29563838-xhh59\" (UID: \"06d0e7b4-0fff-4364-bef0-a408acdbcdbb\") " 
pod="openshift-infra/auto-csr-approver-29563838-xhh59" Mar 18 10:38:00 crc kubenswrapper[4778]: I0318 10:38:00.529062 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563838-xhh59" Mar 18 10:38:01 crc kubenswrapper[4778]: I0318 10:38:01.016371 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563838-xhh59"] Mar 18 10:38:01 crc kubenswrapper[4778]: W0318 10:38:01.022181 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06d0e7b4_0fff_4364_bef0_a408acdbcdbb.slice/crio-95450eddfc15632f0a41a2ff2c235011258dec14c97bc62d889c97127ad1e24a WatchSource:0}: Error finding container 95450eddfc15632f0a41a2ff2c235011258dec14c97bc62d889c97127ad1e24a: Status 404 returned error can't find the container with id 95450eddfc15632f0a41a2ff2c235011258dec14c97bc62d889c97127ad1e24a Mar 18 10:38:01 crc kubenswrapper[4778]: I0318 10:38:01.794749 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563838-xhh59" event={"ID":"06d0e7b4-0fff-4364-bef0-a408acdbcdbb","Type":"ContainerStarted","Data":"95450eddfc15632f0a41a2ff2c235011258dec14c97bc62d889c97127ad1e24a"} Mar 18 10:38:02 crc kubenswrapper[4778]: I0318 10:38:02.809754 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563838-xhh59" event={"ID":"06d0e7b4-0fff-4364-bef0-a408acdbcdbb","Type":"ContainerStarted","Data":"646650d3ca5a64cb0621f10e2cff5ed5ebd691e96f89d98ddbd2ca48d396fddb"} Mar 18 10:38:02 crc kubenswrapper[4778]: I0318 10:38:02.835697 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563838-xhh59" podStartSLOduration=1.493443228 podStartE2EDuration="2.835679986s" podCreationTimestamp="2026-03-18 10:38:00 +0000 UTC" firstStartedPulling="2026-03-18 10:38:01.02487797 +0000 UTC 
m=+5747.599622850" lastFinishedPulling="2026-03-18 10:38:02.367114778 +0000 UTC m=+5748.941859608" observedRunningTime="2026-03-18 10:38:02.832348495 +0000 UTC m=+5749.407093345" watchObservedRunningTime="2026-03-18 10:38:02.835679986 +0000 UTC m=+5749.410424846" Mar 18 10:38:03 crc kubenswrapper[4778]: I0318 10:38:03.823775 4778 generic.go:334] "Generic (PLEG): container finished" podID="06d0e7b4-0fff-4364-bef0-a408acdbcdbb" containerID="646650d3ca5a64cb0621f10e2cff5ed5ebd691e96f89d98ddbd2ca48d396fddb" exitCode=0 Mar 18 10:38:03 crc kubenswrapper[4778]: I0318 10:38:03.823940 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563838-xhh59" event={"ID":"06d0e7b4-0fff-4364-bef0-a408acdbcdbb","Type":"ContainerDied","Data":"646650d3ca5a64cb0621f10e2cff5ed5ebd691e96f89d98ddbd2ca48d396fddb"} Mar 18 10:38:05 crc kubenswrapper[4778]: I0318 10:38:05.333970 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563838-xhh59" Mar 18 10:38:05 crc kubenswrapper[4778]: I0318 10:38:05.464575 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7njtv\" (UniqueName: \"kubernetes.io/projected/06d0e7b4-0fff-4364-bef0-a408acdbcdbb-kube-api-access-7njtv\") pod \"06d0e7b4-0fff-4364-bef0-a408acdbcdbb\" (UID: \"06d0e7b4-0fff-4364-bef0-a408acdbcdbb\") " Mar 18 10:38:05 crc kubenswrapper[4778]: I0318 10:38:05.472460 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06d0e7b4-0fff-4364-bef0-a408acdbcdbb-kube-api-access-7njtv" (OuterVolumeSpecName: "kube-api-access-7njtv") pod "06d0e7b4-0fff-4364-bef0-a408acdbcdbb" (UID: "06d0e7b4-0fff-4364-bef0-a408acdbcdbb"). InnerVolumeSpecName "kube-api-access-7njtv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:38:05 crc kubenswrapper[4778]: I0318 10:38:05.566861 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7njtv\" (UniqueName: \"kubernetes.io/projected/06d0e7b4-0fff-4364-bef0-a408acdbcdbb-kube-api-access-7njtv\") on node \"crc\" DevicePath \"\"" Mar 18 10:38:05 crc kubenswrapper[4778]: I0318 10:38:05.842712 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563838-xhh59" event={"ID":"06d0e7b4-0fff-4364-bef0-a408acdbcdbb","Type":"ContainerDied","Data":"95450eddfc15632f0a41a2ff2c235011258dec14c97bc62d889c97127ad1e24a"} Mar 18 10:38:05 crc kubenswrapper[4778]: I0318 10:38:05.842758 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95450eddfc15632f0a41a2ff2c235011258dec14c97bc62d889c97127ad1e24a" Mar 18 10:38:05 crc kubenswrapper[4778]: I0318 10:38:05.842792 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563838-xhh59" Mar 18 10:38:05 crc kubenswrapper[4778]: I0318 10:38:05.908823 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563832-8vv9t"] Mar 18 10:38:05 crc kubenswrapper[4778]: I0318 10:38:05.918231 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563832-8vv9t"] Mar 18 10:38:06 crc kubenswrapper[4778]: I0318 10:38:06.205284 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19f52f16-1c49-4aa8-9e7b-10a9bf55e487" path="/var/lib/kubelet/pods/19f52f16-1c49-4aa8-9e7b-10a9bf55e487/volumes" Mar 18 10:38:30 crc kubenswrapper[4778]: I0318 10:38:30.147122 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 18 10:38:30 crc kubenswrapper[4778]: I0318 10:38:30.147844 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.496145 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zvhbg"] Mar 18 10:38:32 crc kubenswrapper[4778]: E0318 10:38:32.497472 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06d0e7b4-0fff-4364-bef0-a408acdbcdbb" containerName="oc" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.497498 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d0e7b4-0fff-4364-bef0-a408acdbcdbb" containerName="oc" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.497823 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="06d0e7b4-0fff-4364-bef0-a408acdbcdbb" containerName="oc" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.500070 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.517970 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zvhbg"] Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.642522 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/934fd461-09e2-4014-84fb-c5cdf66dd804-utilities\") pod \"redhat-operators-zvhbg\" (UID: \"934fd461-09e2-4014-84fb-c5cdf66dd804\") " pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.642852 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmn8p\" (UniqueName: \"kubernetes.io/projected/934fd461-09e2-4014-84fb-c5cdf66dd804-kube-api-access-tmn8p\") pod \"redhat-operators-zvhbg\" (UID: \"934fd461-09e2-4014-84fb-c5cdf66dd804\") " pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.642999 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/934fd461-09e2-4014-84fb-c5cdf66dd804-catalog-content\") pod \"redhat-operators-zvhbg\" (UID: \"934fd461-09e2-4014-84fb-c5cdf66dd804\") " pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.745249 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/934fd461-09e2-4014-84fb-c5cdf66dd804-utilities\") pod \"redhat-operators-zvhbg\" (UID: \"934fd461-09e2-4014-84fb-c5cdf66dd804\") " pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.745609 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-tmn8p\" (UniqueName: \"kubernetes.io/projected/934fd461-09e2-4014-84fb-c5cdf66dd804-kube-api-access-tmn8p\") pod \"redhat-operators-zvhbg\" (UID: \"934fd461-09e2-4014-84fb-c5cdf66dd804\") " pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.745706 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/934fd461-09e2-4014-84fb-c5cdf66dd804-catalog-content\") pod \"redhat-operators-zvhbg\" (UID: \"934fd461-09e2-4014-84fb-c5cdf66dd804\") " pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.745762 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/934fd461-09e2-4014-84fb-c5cdf66dd804-utilities\") pod \"redhat-operators-zvhbg\" (UID: \"934fd461-09e2-4014-84fb-c5cdf66dd804\") " pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.746271 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/934fd461-09e2-4014-84fb-c5cdf66dd804-catalog-content\") pod \"redhat-operators-zvhbg\" (UID: \"934fd461-09e2-4014-84fb-c5cdf66dd804\") " pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.780131 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmn8p\" (UniqueName: \"kubernetes.io/projected/934fd461-09e2-4014-84fb-c5cdf66dd804-kube-api-access-tmn8p\") pod \"redhat-operators-zvhbg\" (UID: \"934fd461-09e2-4014-84fb-c5cdf66dd804\") " pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:32 crc kubenswrapper[4778]: I0318 10:38:32.824935 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:33 crc kubenswrapper[4778]: I0318 10:38:33.260540 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zvhbg"] Mar 18 10:38:34 crc kubenswrapper[4778]: I0318 10:38:34.107026 4778 generic.go:334] "Generic (PLEG): container finished" podID="934fd461-09e2-4014-84fb-c5cdf66dd804" containerID="154ad4a3da08d956fccaaca4c67116d8611ece7dce543c8aaff130c23140deca" exitCode=0 Mar 18 10:38:34 crc kubenswrapper[4778]: I0318 10:38:34.107131 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvhbg" event={"ID":"934fd461-09e2-4014-84fb-c5cdf66dd804","Type":"ContainerDied","Data":"154ad4a3da08d956fccaaca4c67116d8611ece7dce543c8aaff130c23140deca"} Mar 18 10:38:34 crc kubenswrapper[4778]: I0318 10:38:34.107383 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvhbg" event={"ID":"934fd461-09e2-4014-84fb-c5cdf66dd804","Type":"ContainerStarted","Data":"99bf290ef30cb2d2aba1ab1168b5a2679f44c5fe78de34c90d88b0c1396003bf"} Mar 18 10:38:36 crc kubenswrapper[4778]: I0318 10:38:36.104334 4778 scope.go:117] "RemoveContainer" containerID="7efa6734e40475f70dcca064e44946d2776507dea2029d7bc3055a574ac8a45a" Mar 18 10:38:36 crc kubenswrapper[4778]: I0318 10:38:36.149417 4778 scope.go:117] "RemoveContainer" containerID="0bc612bd31fa753f8d2f562fbc9812f902bbf84c5c3fafa32c598ca1eddc2ac0" Mar 18 10:38:36 crc kubenswrapper[4778]: I0318 10:38:36.156848 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvhbg" event={"ID":"934fd461-09e2-4014-84fb-c5cdf66dd804","Type":"ContainerStarted","Data":"b11935d578ed05fbe284debe079452e545bb7cc9b3adb6b4ef27718909045cf6"} Mar 18 10:38:36 crc kubenswrapper[4778]: I0318 10:38:36.242410 4778 scope.go:117] "RemoveContainer" containerID="b3e45b3c111cb68776e6b0e92c1ba10a6ec66666310504f0fe918ad9a95b9a9e" 
Mar 18 10:38:36 crc kubenswrapper[4778]: I0318 10:38:36.314255 4778 scope.go:117] "RemoveContainer" containerID="d4ab5a2a9be3faa54895e3ea4c33073a1984cc4328299d0127067c77c01301e8" Mar 18 10:38:40 crc kubenswrapper[4778]: I0318 10:38:40.198729 4778 generic.go:334] "Generic (PLEG): container finished" podID="934fd461-09e2-4014-84fb-c5cdf66dd804" containerID="b11935d578ed05fbe284debe079452e545bb7cc9b3adb6b4ef27718909045cf6" exitCode=0 Mar 18 10:38:40 crc kubenswrapper[4778]: I0318 10:38:40.198811 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvhbg" event={"ID":"934fd461-09e2-4014-84fb-c5cdf66dd804","Type":"ContainerDied","Data":"b11935d578ed05fbe284debe079452e545bb7cc9b3adb6b4ef27718909045cf6"} Mar 18 10:38:40 crc kubenswrapper[4778]: I0318 10:38:40.202010 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 10:38:41 crc kubenswrapper[4778]: I0318 10:38:41.211225 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvhbg" event={"ID":"934fd461-09e2-4014-84fb-c5cdf66dd804","Type":"ContainerStarted","Data":"b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b"} Mar 18 10:38:41 crc kubenswrapper[4778]: I0318 10:38:41.288313 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zvhbg" podStartSLOduration=2.7254747999999998 podStartE2EDuration="9.288291012s" podCreationTimestamp="2026-03-18 10:38:32 +0000 UTC" firstStartedPulling="2026-03-18 10:38:34.109231159 +0000 UTC m=+5780.683975999" lastFinishedPulling="2026-03-18 10:38:40.672047361 +0000 UTC m=+5787.246792211" observedRunningTime="2026-03-18 10:38:41.239105268 +0000 UTC m=+5787.813850138" watchObservedRunningTime="2026-03-18 10:38:41.288291012 +0000 UTC m=+5787.863035852" Mar 18 10:38:42 crc kubenswrapper[4778]: I0318 10:38:42.825541 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:42 crc kubenswrapper[4778]: I0318 10:38:42.826025 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:43 crc kubenswrapper[4778]: I0318 10:38:43.874567 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zvhbg" podUID="934fd461-09e2-4014-84fb-c5cdf66dd804" containerName="registry-server" probeResult="failure" output=< Mar 18 10:38:43 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 10:38:43 crc kubenswrapper[4778]: > Mar 18 10:38:52 crc kubenswrapper[4778]: I0318 10:38:52.884706 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:52 crc kubenswrapper[4778]: I0318 10:38:52.948885 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:53 crc kubenswrapper[4778]: I0318 10:38:53.123224 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zvhbg"] Mar 18 10:38:54 crc kubenswrapper[4778]: I0318 10:38:54.315147 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zvhbg" podUID="934fd461-09e2-4014-84fb-c5cdf66dd804" containerName="registry-server" containerID="cri-o://b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b" gracePeriod=2 Mar 18 10:38:54 crc kubenswrapper[4778]: I0318 10:38:54.985598 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.099218 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/934fd461-09e2-4014-84fb-c5cdf66dd804-utilities\") pod \"934fd461-09e2-4014-84fb-c5cdf66dd804\" (UID: \"934fd461-09e2-4014-84fb-c5cdf66dd804\") " Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.099838 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/934fd461-09e2-4014-84fb-c5cdf66dd804-catalog-content\") pod \"934fd461-09e2-4014-84fb-c5cdf66dd804\" (UID: \"934fd461-09e2-4014-84fb-c5cdf66dd804\") " Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.099912 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmn8p\" (UniqueName: \"kubernetes.io/projected/934fd461-09e2-4014-84fb-c5cdf66dd804-kube-api-access-tmn8p\") pod \"934fd461-09e2-4014-84fb-c5cdf66dd804\" (UID: \"934fd461-09e2-4014-84fb-c5cdf66dd804\") " Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.100392 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/934fd461-09e2-4014-84fb-c5cdf66dd804-utilities" (OuterVolumeSpecName: "utilities") pod "934fd461-09e2-4014-84fb-c5cdf66dd804" (UID: "934fd461-09e2-4014-84fb-c5cdf66dd804"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.107774 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/934fd461-09e2-4014-84fb-c5cdf66dd804-kube-api-access-tmn8p" (OuterVolumeSpecName: "kube-api-access-tmn8p") pod "934fd461-09e2-4014-84fb-c5cdf66dd804" (UID: "934fd461-09e2-4014-84fb-c5cdf66dd804"). InnerVolumeSpecName "kube-api-access-tmn8p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.202023 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/934fd461-09e2-4014-84fb-c5cdf66dd804-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.202087 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmn8p\" (UniqueName: \"kubernetes.io/projected/934fd461-09e2-4014-84fb-c5cdf66dd804-kube-api-access-tmn8p\") on node \"crc\" DevicePath \"\"" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.253835 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/934fd461-09e2-4014-84fb-c5cdf66dd804-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "934fd461-09e2-4014-84fb-c5cdf66dd804" (UID: "934fd461-09e2-4014-84fb-c5cdf66dd804"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.303693 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/934fd461-09e2-4014-84fb-c5cdf66dd804-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.326831 4778 generic.go:334] "Generic (PLEG): container finished" podID="934fd461-09e2-4014-84fb-c5cdf66dd804" containerID="b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b" exitCode=0 Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.326880 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvhbg" event={"ID":"934fd461-09e2-4014-84fb-c5cdf66dd804","Type":"ContainerDied","Data":"b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b"} Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.326910 4778 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-zvhbg" event={"ID":"934fd461-09e2-4014-84fb-c5cdf66dd804","Type":"ContainerDied","Data":"99bf290ef30cb2d2aba1ab1168b5a2679f44c5fe78de34c90d88b0c1396003bf"} Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.326908 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zvhbg" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.326926 4778 scope.go:117] "RemoveContainer" containerID="b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.357186 4778 scope.go:117] "RemoveContainer" containerID="b11935d578ed05fbe284debe079452e545bb7cc9b3adb6b4ef27718909045cf6" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.371756 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zvhbg"] Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.382790 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zvhbg"] Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.385485 4778 scope.go:117] "RemoveContainer" containerID="154ad4a3da08d956fccaaca4c67116d8611ece7dce543c8aaff130c23140deca" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.429659 4778 scope.go:117] "RemoveContainer" containerID="b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b" Mar 18 10:38:55 crc kubenswrapper[4778]: E0318 10:38:55.431067 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b\": container with ID starting with b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b not found: ID does not exist" containerID="b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.431116 4778 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b"} err="failed to get container status \"b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b\": rpc error: code = NotFound desc = could not find container \"b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b\": container with ID starting with b1d074db0e273c56847f2d18f176c790b5b54fd2487eaeff3f058ec97bbf175b not found: ID does not exist" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.431149 4778 scope.go:117] "RemoveContainer" containerID="b11935d578ed05fbe284debe079452e545bb7cc9b3adb6b4ef27718909045cf6" Mar 18 10:38:55 crc kubenswrapper[4778]: E0318 10:38:55.432288 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b11935d578ed05fbe284debe079452e545bb7cc9b3adb6b4ef27718909045cf6\": container with ID starting with b11935d578ed05fbe284debe079452e545bb7cc9b3adb6b4ef27718909045cf6 not found: ID does not exist" containerID="b11935d578ed05fbe284debe079452e545bb7cc9b3adb6b4ef27718909045cf6" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.432328 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b11935d578ed05fbe284debe079452e545bb7cc9b3adb6b4ef27718909045cf6"} err="failed to get container status \"b11935d578ed05fbe284debe079452e545bb7cc9b3adb6b4ef27718909045cf6\": rpc error: code = NotFound desc = could not find container \"b11935d578ed05fbe284debe079452e545bb7cc9b3adb6b4ef27718909045cf6\": container with ID starting with b11935d578ed05fbe284debe079452e545bb7cc9b3adb6b4ef27718909045cf6 not found: ID does not exist" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.432348 4778 scope.go:117] "RemoveContainer" containerID="154ad4a3da08d956fccaaca4c67116d8611ece7dce543c8aaff130c23140deca" Mar 18 10:38:55 crc kubenswrapper[4778]: E0318 
10:38:55.433570 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"154ad4a3da08d956fccaaca4c67116d8611ece7dce543c8aaff130c23140deca\": container with ID starting with 154ad4a3da08d956fccaaca4c67116d8611ece7dce543c8aaff130c23140deca not found: ID does not exist" containerID="154ad4a3da08d956fccaaca4c67116d8611ece7dce543c8aaff130c23140deca" Mar 18 10:38:55 crc kubenswrapper[4778]: I0318 10:38:55.433615 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"154ad4a3da08d956fccaaca4c67116d8611ece7dce543c8aaff130c23140deca"} err="failed to get container status \"154ad4a3da08d956fccaaca4c67116d8611ece7dce543c8aaff130c23140deca\": rpc error: code = NotFound desc = could not find container \"154ad4a3da08d956fccaaca4c67116d8611ece7dce543c8aaff130c23140deca\": container with ID starting with 154ad4a3da08d956fccaaca4c67116d8611ece7dce543c8aaff130c23140deca not found: ID does not exist" Mar 18 10:38:56 crc kubenswrapper[4778]: I0318 10:38:56.198287 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="934fd461-09e2-4014-84fb-c5cdf66dd804" path="/var/lib/kubelet/pods/934fd461-09e2-4014-84fb-c5cdf66dd804/volumes" Mar 18 10:39:00 crc kubenswrapper[4778]: I0318 10:39:00.147938 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:39:00 crc kubenswrapper[4778]: I0318 10:39:00.149137 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 18 10:39:00 crc kubenswrapper[4778]: I0318 10:39:00.149242 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 10:39:00 crc kubenswrapper[4778]: I0318 10:39:00.150483 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 10:39:00 crc kubenswrapper[4778]: I0318 10:39:00.150566 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" gracePeriod=600 Mar 18 10:39:00 crc kubenswrapper[4778]: E0318 10:39:00.273920 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:39:00 crc kubenswrapper[4778]: I0318 10:39:00.377131 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" exitCode=0 Mar 18 10:39:00 crc kubenswrapper[4778]: I0318 10:39:00.377226 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" 
event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5"} Mar 18 10:39:00 crc kubenswrapper[4778]: I0318 10:39:00.377500 4778 scope.go:117] "RemoveContainer" containerID="81ef1098ba06a49fc09cc93fdfc6d75d1ab7de8afe5ec57fbd67f4e799fb737b" Mar 18 10:39:00 crc kubenswrapper[4778]: I0318 10:39:00.378132 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:39:00 crc kubenswrapper[4778]: E0318 10:39:00.378450 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:39:16 crc kubenswrapper[4778]: I0318 10:39:16.191012 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:39:16 crc kubenswrapper[4778]: E0318 10:39:16.191875 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:39:30 crc kubenswrapper[4778]: I0318 10:39:30.187406 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:39:30 crc kubenswrapper[4778]: E0318 10:39:30.188666 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:39:43 crc kubenswrapper[4778]: I0318 10:39:43.188753 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:39:43 crc kubenswrapper[4778]: E0318 10:39:43.189607 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:39:54 crc kubenswrapper[4778]: I0318 10:39:54.197516 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:39:54 crc kubenswrapper[4778]: E0318 10:39:54.198321 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.155950 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563840-fv5mm"] Mar 18 10:40:00 crc kubenswrapper[4778]: E0318 10:40:00.156856 4778 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="934fd461-09e2-4014-84fb-c5cdf66dd804" containerName="extract-utilities" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.156871 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="934fd461-09e2-4014-84fb-c5cdf66dd804" containerName="extract-utilities" Mar 18 10:40:00 crc kubenswrapper[4778]: E0318 10:40:00.156894 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="934fd461-09e2-4014-84fb-c5cdf66dd804" containerName="extract-content" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.156919 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="934fd461-09e2-4014-84fb-c5cdf66dd804" containerName="extract-content" Mar 18 10:40:00 crc kubenswrapper[4778]: E0318 10:40:00.156939 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="934fd461-09e2-4014-84fb-c5cdf66dd804" containerName="registry-server" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.156949 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="934fd461-09e2-4014-84fb-c5cdf66dd804" containerName="registry-server" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.157180 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="934fd461-09e2-4014-84fb-c5cdf66dd804" containerName="registry-server" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.158125 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563840-fv5mm" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.160216 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.160295 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.160485 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.167699 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563840-fv5mm"] Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.245782 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcr6l\" (UniqueName: \"kubernetes.io/projected/dc3d0dd7-380a-4f1c-bb78-f8df1a73362e-kube-api-access-zcr6l\") pod \"auto-csr-approver-29563840-fv5mm\" (UID: \"dc3d0dd7-380a-4f1c-bb78-f8df1a73362e\") " pod="openshift-infra/auto-csr-approver-29563840-fv5mm" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.347448 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcr6l\" (UniqueName: \"kubernetes.io/projected/dc3d0dd7-380a-4f1c-bb78-f8df1a73362e-kube-api-access-zcr6l\") pod \"auto-csr-approver-29563840-fv5mm\" (UID: \"dc3d0dd7-380a-4f1c-bb78-f8df1a73362e\") " pod="openshift-infra/auto-csr-approver-29563840-fv5mm" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.368139 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcr6l\" (UniqueName: \"kubernetes.io/projected/dc3d0dd7-380a-4f1c-bb78-f8df1a73362e-kube-api-access-zcr6l\") pod \"auto-csr-approver-29563840-fv5mm\" (UID: \"dc3d0dd7-380a-4f1c-bb78-f8df1a73362e\") " 
pod="openshift-infra/auto-csr-approver-29563840-fv5mm" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.482599 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563840-fv5mm" Mar 18 10:40:00 crc kubenswrapper[4778]: I0318 10:40:00.990122 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563840-fv5mm"] Mar 18 10:40:01 crc kubenswrapper[4778]: I0318 10:40:01.918404 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563840-fv5mm" event={"ID":"dc3d0dd7-380a-4f1c-bb78-f8df1a73362e","Type":"ContainerStarted","Data":"49451722c5ce52687662ad1e30323c0a103145961602751f1e8768d627cdc406"} Mar 18 10:40:02 crc kubenswrapper[4778]: I0318 10:40:02.926443 4778 generic.go:334] "Generic (PLEG): container finished" podID="dc3d0dd7-380a-4f1c-bb78-f8df1a73362e" containerID="105700f78835bb2b225e76573d66e982ecb74a475715ed5f6fa69ca1e19eafce" exitCode=0 Mar 18 10:40:02 crc kubenswrapper[4778]: I0318 10:40:02.928153 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563840-fv5mm" event={"ID":"dc3d0dd7-380a-4f1c-bb78-f8df1a73362e","Type":"ContainerDied","Data":"105700f78835bb2b225e76573d66e982ecb74a475715ed5f6fa69ca1e19eafce"} Mar 18 10:40:04 crc kubenswrapper[4778]: I0318 10:40:04.558827 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563840-fv5mm" Mar 18 10:40:04 crc kubenswrapper[4778]: I0318 10:40:04.643886 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcr6l\" (UniqueName: \"kubernetes.io/projected/dc3d0dd7-380a-4f1c-bb78-f8df1a73362e-kube-api-access-zcr6l\") pod \"dc3d0dd7-380a-4f1c-bb78-f8df1a73362e\" (UID: \"dc3d0dd7-380a-4f1c-bb78-f8df1a73362e\") " Mar 18 10:40:04 crc kubenswrapper[4778]: I0318 10:40:04.655561 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc3d0dd7-380a-4f1c-bb78-f8df1a73362e-kube-api-access-zcr6l" (OuterVolumeSpecName: "kube-api-access-zcr6l") pod "dc3d0dd7-380a-4f1c-bb78-f8df1a73362e" (UID: "dc3d0dd7-380a-4f1c-bb78-f8df1a73362e"). InnerVolumeSpecName "kube-api-access-zcr6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:40:04 crc kubenswrapper[4778]: I0318 10:40:04.746617 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcr6l\" (UniqueName: \"kubernetes.io/projected/dc3d0dd7-380a-4f1c-bb78-f8df1a73362e-kube-api-access-zcr6l\") on node \"crc\" DevicePath \"\"" Mar 18 10:40:04 crc kubenswrapper[4778]: I0318 10:40:04.944612 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563840-fv5mm" event={"ID":"dc3d0dd7-380a-4f1c-bb78-f8df1a73362e","Type":"ContainerDied","Data":"49451722c5ce52687662ad1e30323c0a103145961602751f1e8768d627cdc406"} Mar 18 10:40:04 crc kubenswrapper[4778]: I0318 10:40:04.944813 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563840-fv5mm" Mar 18 10:40:04 crc kubenswrapper[4778]: I0318 10:40:04.944901 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49451722c5ce52687662ad1e30323c0a103145961602751f1e8768d627cdc406" Mar 18 10:40:05 crc kubenswrapper[4778]: I0318 10:40:05.630711 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563834-6gcpp"] Mar 18 10:40:05 crc kubenswrapper[4778]: I0318 10:40:05.639514 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563834-6gcpp"] Mar 18 10:40:06 crc kubenswrapper[4778]: I0318 10:40:06.187409 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:40:06 crc kubenswrapper[4778]: E0318 10:40:06.187700 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:40:06 crc kubenswrapper[4778]: I0318 10:40:06.196419 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03c15483-f10b-4441-8d24-bb2bee9b47d3" path="/var/lib/kubelet/pods/03c15483-f10b-4441-8d24-bb2bee9b47d3/volumes" Mar 18 10:40:19 crc kubenswrapper[4778]: I0318 10:40:19.187337 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:40:19 crc kubenswrapper[4778]: E0318 10:40:19.188101 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:40:32 crc kubenswrapper[4778]: I0318 10:40:32.187809 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:40:32 crc kubenswrapper[4778]: E0318 10:40:32.188621 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:40:36 crc kubenswrapper[4778]: I0318 10:40:36.460586 4778 scope.go:117] "RemoveContainer" containerID="af6dff43614ee6bd06dbb1cfdbe51bab1c623028ca6850357300ea8b7c6fb33a" Mar 18 10:40:44 crc kubenswrapper[4778]: I0318 10:40:44.193558 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:40:44 crc kubenswrapper[4778]: E0318 10:40:44.195972 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:40:59 crc kubenswrapper[4778]: I0318 10:40:59.187634 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:40:59 crc kubenswrapper[4778]: 
E0318 10:40:59.188653 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:41:10 crc kubenswrapper[4778]: I0318 10:41:10.187743 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:41:10 crc kubenswrapper[4778]: E0318 10:41:10.188989 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:41:23 crc kubenswrapper[4778]: I0318 10:41:23.187969 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:41:23 crc kubenswrapper[4778]: E0318 10:41:23.189368 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:41:37 crc kubenswrapper[4778]: I0318 10:41:37.187051 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:41:37 crc 
kubenswrapper[4778]: E0318 10:41:37.188821 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:41:50 crc kubenswrapper[4778]: I0318 10:41:50.187024 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:41:50 crc kubenswrapper[4778]: E0318 10:41:50.188047 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.161227 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563842-ckkmp"] Mar 18 10:42:00 crc kubenswrapper[4778]: E0318 10:42:00.162723 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc3d0dd7-380a-4f1c-bb78-f8df1a73362e" containerName="oc" Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.162751 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc3d0dd7-380a-4f1c-bb78-f8df1a73362e" containerName="oc" Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.163254 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc3d0dd7-380a-4f1c-bb78-f8df1a73362e" containerName="oc" Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.164502 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563842-ckkmp" Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.166790 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.167008 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.167086 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.174428 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563842-ckkmp"] Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.181023 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf9fm\" (UniqueName: \"kubernetes.io/projected/d10faaed-ffef-4afb-9f75-262e4fccd22a-kube-api-access-jf9fm\") pod \"auto-csr-approver-29563842-ckkmp\" (UID: \"d10faaed-ffef-4afb-9f75-262e4fccd22a\") " pod="openshift-infra/auto-csr-approver-29563842-ckkmp" Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.283437 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf9fm\" (UniqueName: \"kubernetes.io/projected/d10faaed-ffef-4afb-9f75-262e4fccd22a-kube-api-access-jf9fm\") pod \"auto-csr-approver-29563842-ckkmp\" (UID: \"d10faaed-ffef-4afb-9f75-262e4fccd22a\") " pod="openshift-infra/auto-csr-approver-29563842-ckkmp" Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.306998 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf9fm\" (UniqueName: \"kubernetes.io/projected/d10faaed-ffef-4afb-9f75-262e4fccd22a-kube-api-access-jf9fm\") pod \"auto-csr-approver-29563842-ckkmp\" (UID: \"d10faaed-ffef-4afb-9f75-262e4fccd22a\") " 
pod="openshift-infra/auto-csr-approver-29563842-ckkmp" Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.487960 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563842-ckkmp" Mar 18 10:42:00 crc kubenswrapper[4778]: I0318 10:42:00.955984 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563842-ckkmp"] Mar 18 10:42:01 crc kubenswrapper[4778]: I0318 10:42:01.027038 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563842-ckkmp" event={"ID":"d10faaed-ffef-4afb-9f75-262e4fccd22a","Type":"ContainerStarted","Data":"ba2519ba60c141d970d8f26f1bfe017190303e07f1b25f56192925fa45efa340"} Mar 18 10:42:03 crc kubenswrapper[4778]: I0318 10:42:03.054556 4778 generic.go:334] "Generic (PLEG): container finished" podID="d10faaed-ffef-4afb-9f75-262e4fccd22a" containerID="cec3fa048e9699e703eb8a3404384f6f46bb6a98f37648f4a97cf2fe11dab009" exitCode=0 Mar 18 10:42:03 crc kubenswrapper[4778]: I0318 10:42:03.054925 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563842-ckkmp" event={"ID":"d10faaed-ffef-4afb-9f75-262e4fccd22a","Type":"ContainerDied","Data":"cec3fa048e9699e703eb8a3404384f6f46bb6a98f37648f4a97cf2fe11dab009"} Mar 18 10:42:04 crc kubenswrapper[4778]: I0318 10:42:04.595078 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563842-ckkmp" Mar 18 10:42:04 crc kubenswrapper[4778]: I0318 10:42:04.782043 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf9fm\" (UniqueName: \"kubernetes.io/projected/d10faaed-ffef-4afb-9f75-262e4fccd22a-kube-api-access-jf9fm\") pod \"d10faaed-ffef-4afb-9f75-262e4fccd22a\" (UID: \"d10faaed-ffef-4afb-9f75-262e4fccd22a\") " Mar 18 10:42:04 crc kubenswrapper[4778]: I0318 10:42:04.793597 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d10faaed-ffef-4afb-9f75-262e4fccd22a-kube-api-access-jf9fm" (OuterVolumeSpecName: "kube-api-access-jf9fm") pod "d10faaed-ffef-4afb-9f75-262e4fccd22a" (UID: "d10faaed-ffef-4afb-9f75-262e4fccd22a"). InnerVolumeSpecName "kube-api-access-jf9fm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:42:04 crc kubenswrapper[4778]: I0318 10:42:04.884678 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf9fm\" (UniqueName: \"kubernetes.io/projected/d10faaed-ffef-4afb-9f75-262e4fccd22a-kube-api-access-jf9fm\") on node \"crc\" DevicePath \"\"" Mar 18 10:42:05 crc kubenswrapper[4778]: I0318 10:42:05.078744 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563842-ckkmp" event={"ID":"d10faaed-ffef-4afb-9f75-262e4fccd22a","Type":"ContainerDied","Data":"ba2519ba60c141d970d8f26f1bfe017190303e07f1b25f56192925fa45efa340"} Mar 18 10:42:05 crc kubenswrapper[4778]: I0318 10:42:05.078824 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba2519ba60c141d970d8f26f1bfe017190303e07f1b25f56192925fa45efa340" Mar 18 10:42:05 crc kubenswrapper[4778]: I0318 10:42:05.078933 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563842-ckkmp" Mar 18 10:42:05 crc kubenswrapper[4778]: I0318 10:42:05.192866 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:42:05 crc kubenswrapper[4778]: E0318 10:42:05.193345 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:42:05 crc kubenswrapper[4778]: I0318 10:42:05.685102 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563836-vj8r7"] Mar 18 10:42:05 crc kubenswrapper[4778]: I0318 10:42:05.699020 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563836-vj8r7"] Mar 18 10:42:06 crc kubenswrapper[4778]: I0318 10:42:06.202554 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2bea522-9825-4193-9fb8-6592bcc1e2c8" path="/var/lib/kubelet/pods/e2bea522-9825-4193-9fb8-6592bcc1e2c8/volumes" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.633068 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jdwm5"] Mar 18 10:42:13 crc kubenswrapper[4778]: E0318 10:42:13.634024 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d10faaed-ffef-4afb-9f75-262e4fccd22a" containerName="oc" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.634038 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d10faaed-ffef-4afb-9f75-262e4fccd22a" containerName="oc" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.634238 4778 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="d10faaed-ffef-4afb-9f75-262e4fccd22a" containerName="oc" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.635590 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.659803 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-utilities\") pod \"community-operators-jdwm5\" (UID: \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\") " pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.660143 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-catalog-content\") pod \"community-operators-jdwm5\" (UID: \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\") " pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.660378 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxqvf\" (UniqueName: \"kubernetes.io/projected/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-kube-api-access-rxqvf\") pod \"community-operators-jdwm5\" (UID: \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\") " pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.665802 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jdwm5"] Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.762042 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-utilities\") pod \"community-operators-jdwm5\" (UID: 
\"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\") " pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.762173 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-catalog-content\") pod \"community-operators-jdwm5\" (UID: \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\") " pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.762214 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxqvf\" (UniqueName: \"kubernetes.io/projected/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-kube-api-access-rxqvf\") pod \"community-operators-jdwm5\" (UID: \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\") " pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.762575 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-utilities\") pod \"community-operators-jdwm5\" (UID: \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\") " pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.762632 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-catalog-content\") pod \"community-operators-jdwm5\" (UID: \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\") " pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.788098 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxqvf\" (UniqueName: \"kubernetes.io/projected/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-kube-api-access-rxqvf\") pod \"community-operators-jdwm5\" (UID: 
\"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\") " pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:13 crc kubenswrapper[4778]: I0318 10:42:13.961535 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:14 crc kubenswrapper[4778]: I0318 10:42:14.514238 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jdwm5"] Mar 18 10:42:15 crc kubenswrapper[4778]: I0318 10:42:15.179191 4778 generic.go:334] "Generic (PLEG): container finished" podID="52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" containerID="e810276274c5d13e9f83a89d4970dccd708f26446183bed277d1f0891733f845" exitCode=0 Mar 18 10:42:15 crc kubenswrapper[4778]: I0318 10:42:15.179281 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdwm5" event={"ID":"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1","Type":"ContainerDied","Data":"e810276274c5d13e9f83a89d4970dccd708f26446183bed277d1f0891733f845"} Mar 18 10:42:15 crc kubenswrapper[4778]: I0318 10:42:15.179623 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdwm5" event={"ID":"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1","Type":"ContainerStarted","Data":"3b5256ad1745bed880d51fd8ff1844d200a8ecc298d03e6cb8e30102b2353e9f"} Mar 18 10:42:16 crc kubenswrapper[4778]: I0318 10:42:16.198600 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdwm5" event={"ID":"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1","Type":"ContainerStarted","Data":"3038181f3cf21b408bb5f7edd299880a51f254baf0e7004d42e611a268d7ae51"} Mar 18 10:42:18 crc kubenswrapper[4778]: I0318 10:42:18.188108 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:42:18 crc kubenswrapper[4778]: E0318 10:42:18.188964 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:42:18 crc kubenswrapper[4778]: I0318 10:42:18.211270 4778 generic.go:334] "Generic (PLEG): container finished" podID="52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" containerID="3038181f3cf21b408bb5f7edd299880a51f254baf0e7004d42e611a268d7ae51" exitCode=0 Mar 18 10:42:18 crc kubenswrapper[4778]: I0318 10:42:18.211326 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdwm5" event={"ID":"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1","Type":"ContainerDied","Data":"3038181f3cf21b408bb5f7edd299880a51f254baf0e7004d42e611a268d7ae51"} Mar 18 10:42:19 crc kubenswrapper[4778]: I0318 10:42:19.223696 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdwm5" event={"ID":"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1","Type":"ContainerStarted","Data":"d0f31064b207c07c3a3138a9916639084cca43b6ad3411af7cd72dd9b0e743df"} Mar 18 10:42:19 crc kubenswrapper[4778]: I0318 10:42:19.259190 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jdwm5" podStartSLOduration=2.795311554 podStartE2EDuration="6.259165868s" podCreationTimestamp="2026-03-18 10:42:13 +0000 UTC" firstStartedPulling="2026-03-18 10:42:15.181097728 +0000 UTC m=+6001.755842578" lastFinishedPulling="2026-03-18 10:42:18.644952052 +0000 UTC m=+6005.219696892" observedRunningTime="2026-03-18 10:42:19.250787291 +0000 UTC m=+6005.825532141" watchObservedRunningTime="2026-03-18 10:42:19.259165868 +0000 UTC m=+6005.833910718" Mar 18 10:42:23 crc kubenswrapper[4778]: I0318 10:42:23.962643 4778 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:23 crc kubenswrapper[4778]: I0318 10:42:23.963118 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:24 crc kubenswrapper[4778]: I0318 10:42:24.020247 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:24 crc kubenswrapper[4778]: I0318 10:42:24.332123 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:24 crc kubenswrapper[4778]: I0318 10:42:24.381028 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jdwm5"] Mar 18 10:42:26 crc kubenswrapper[4778]: I0318 10:42:26.301380 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jdwm5" podUID="52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" containerName="registry-server" containerID="cri-o://d0f31064b207c07c3a3138a9916639084cca43b6ad3411af7cd72dd9b0e743df" gracePeriod=2 Mar 18 10:42:27 crc kubenswrapper[4778]: I0318 10:42:27.312996 4778 generic.go:334] "Generic (PLEG): container finished" podID="52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" containerID="d0f31064b207c07c3a3138a9916639084cca43b6ad3411af7cd72dd9b0e743df" exitCode=0 Mar 18 10:42:27 crc kubenswrapper[4778]: I0318 10:42:27.313347 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdwm5" event={"ID":"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1","Type":"ContainerDied","Data":"d0f31064b207c07c3a3138a9916639084cca43b6ad3411af7cd72dd9b0e743df"} Mar 18 10:42:27 crc kubenswrapper[4778]: I0318 10:42:27.433671 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:27 crc kubenswrapper[4778]: I0318 10:42:27.538717 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-catalog-content\") pod \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\" (UID: \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\") " Mar 18 10:42:27 crc kubenswrapper[4778]: I0318 10:42:27.538880 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-utilities\") pod \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\" (UID: \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\") " Mar 18 10:42:27 crc kubenswrapper[4778]: I0318 10:42:27.538992 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxqvf\" (UniqueName: \"kubernetes.io/projected/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-kube-api-access-rxqvf\") pod \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\" (UID: \"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1\") " Mar 18 10:42:27 crc kubenswrapper[4778]: I0318 10:42:27.539823 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-utilities" (OuterVolumeSpecName: "utilities") pod "52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" (UID: "52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:42:27 crc kubenswrapper[4778]: I0318 10:42:27.550082 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-kube-api-access-rxqvf" (OuterVolumeSpecName: "kube-api-access-rxqvf") pod "52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" (UID: "52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1"). InnerVolumeSpecName "kube-api-access-rxqvf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:42:27 crc kubenswrapper[4778]: I0318 10:42:27.596294 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" (UID: "52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:42:27 crc kubenswrapper[4778]: I0318 10:42:27.644035 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxqvf\" (UniqueName: \"kubernetes.io/projected/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-kube-api-access-rxqvf\") on node \"crc\" DevicePath \"\"" Mar 18 10:42:27 crc kubenswrapper[4778]: I0318 10:42:27.644069 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:42:27 crc kubenswrapper[4778]: I0318 10:42:27.644078 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:42:28 crc kubenswrapper[4778]: I0318 10:42:28.325041 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdwm5" event={"ID":"52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1","Type":"ContainerDied","Data":"3b5256ad1745bed880d51fd8ff1844d200a8ecc298d03e6cb8e30102b2353e9f"} Mar 18 10:42:28 crc kubenswrapper[4778]: I0318 10:42:28.325120 4778 scope.go:117] "RemoveContainer" containerID="d0f31064b207c07c3a3138a9916639084cca43b6ad3411af7cd72dd9b0e743df" Mar 18 10:42:28 crc kubenswrapper[4778]: I0318 10:42:28.327112 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jdwm5" Mar 18 10:42:28 crc kubenswrapper[4778]: I0318 10:42:28.354614 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jdwm5"] Mar 18 10:42:28 crc kubenswrapper[4778]: I0318 10:42:28.360578 4778 scope.go:117] "RemoveContainer" containerID="3038181f3cf21b408bb5f7edd299880a51f254baf0e7004d42e611a268d7ae51" Mar 18 10:42:28 crc kubenswrapper[4778]: I0318 10:42:28.368363 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jdwm5"] Mar 18 10:42:28 crc kubenswrapper[4778]: I0318 10:42:28.425383 4778 scope.go:117] "RemoveContainer" containerID="e810276274c5d13e9f83a89d4970dccd708f26446183bed277d1f0891733f845" Mar 18 10:42:30 crc kubenswrapper[4778]: I0318 10:42:30.202629 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" path="/var/lib/kubelet/pods/52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1/volumes" Mar 18 10:42:32 crc kubenswrapper[4778]: I0318 10:42:32.187357 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:42:32 crc kubenswrapper[4778]: E0318 10:42:32.187793 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:42:36 crc kubenswrapper[4778]: I0318 10:42:36.548305 4778 scope.go:117] "RemoveContainer" containerID="927eb69f67f4add2a332f661db657c63610291a16f8bad5b3ab6b4321fca9e64" Mar 18 10:42:45 crc kubenswrapper[4778]: I0318 10:42:45.187094 4778 scope.go:117] "RemoveContainer" 
containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:42:45 crc kubenswrapper[4778]: E0318 10:42:45.187980 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:42:59 crc kubenswrapper[4778]: I0318 10:42:59.187465 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:42:59 crc kubenswrapper[4778]: E0318 10:42:59.188477 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:43:10 crc kubenswrapper[4778]: I0318 10:43:10.189796 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:43:10 crc kubenswrapper[4778]: E0318 10:43:10.190855 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:43:21 crc kubenswrapper[4778]: I0318 10:43:21.187427 4778 scope.go:117] 
"RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5"
Mar 18 10:43:21 crc kubenswrapper[4778]: E0318 10:43:21.188217 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:43:34 crc kubenswrapper[4778]: I0318 10:43:34.202980 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5"
Mar 18 10:43:34 crc kubenswrapper[4778]: E0318 10:43:34.203898 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:43:49 crc kubenswrapper[4778]: I0318 10:43:49.187414 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5"
Mar 18 10:43:49 crc kubenswrapper[4778]: E0318 10:43:49.188157 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.163964 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563844-j7kb9"]
Mar 18 10:44:00 crc kubenswrapper[4778]: E0318 10:44:00.165055 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" containerName="extract-content"
Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.165072 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" containerName="extract-content"
Mar 18 10:44:00 crc kubenswrapper[4778]: E0318 10:44:00.165093 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" containerName="registry-server"
Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.165101 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" containerName="registry-server"
Mar 18 10:44:00 crc kubenswrapper[4778]: E0318 10:44:00.165129 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" containerName="extract-utilities"
Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.165137 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" containerName="extract-utilities"
Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.165427 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="52b6b01e-bb8f-4bf9-b4af-29a7ac4200e1" containerName="registry-server"
Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.166245 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563844-j7kb9"
Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.169056 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6"
Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.169314 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.169448 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.179074 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563844-j7kb9"]
Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.236724 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z8sj\" (UniqueName: \"kubernetes.io/projected/2098dac3-962e-4d75-b22f-81aadc768dc6-kube-api-access-5z8sj\") pod \"auto-csr-approver-29563844-j7kb9\" (UID: \"2098dac3-962e-4d75-b22f-81aadc768dc6\") " pod="openshift-infra/auto-csr-approver-29563844-j7kb9"
Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.339183 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z8sj\" (UniqueName: \"kubernetes.io/projected/2098dac3-962e-4d75-b22f-81aadc768dc6-kube-api-access-5z8sj\") pod \"auto-csr-approver-29563844-j7kb9\" (UID: \"2098dac3-962e-4d75-b22f-81aadc768dc6\") " pod="openshift-infra/auto-csr-approver-29563844-j7kb9"
Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.361436 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z8sj\" (UniqueName: \"kubernetes.io/projected/2098dac3-962e-4d75-b22f-81aadc768dc6-kube-api-access-5z8sj\") pod \"auto-csr-approver-29563844-j7kb9\" (UID: \"2098dac3-962e-4d75-b22f-81aadc768dc6\") " pod="openshift-infra/auto-csr-approver-29563844-j7kb9"
Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.518437 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563844-j7kb9"
Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.987957 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563844-j7kb9"]
Mar 18 10:44:00 crc kubenswrapper[4778]: I0318 10:44:00.990294 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 10:44:01 crc kubenswrapper[4778]: I0318 10:44:01.219770 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563844-j7kb9" event={"ID":"2098dac3-962e-4d75-b22f-81aadc768dc6","Type":"ContainerStarted","Data":"ce2c426af705132d69bfde3949558052b8bdd7965ad71de8653d098877649038"}
Mar 18 10:44:02 crc kubenswrapper[4778]: I0318 10:44:02.235837 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563844-j7kb9" event={"ID":"2098dac3-962e-4d75-b22f-81aadc768dc6","Type":"ContainerStarted","Data":"88b38e0fbddd0bbafde023ebaf3f6bdcd76dd7a995f8e2d8d9c48c114d683213"}
Mar 18 10:44:02 crc kubenswrapper[4778]: I0318 10:44:02.269121 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563844-j7kb9" podStartSLOduration=1.394289291 podStartE2EDuration="2.269099394s" podCreationTimestamp="2026-03-18 10:44:00 +0000 UTC" firstStartedPulling="2026-03-18 10:44:00.989946416 +0000 UTC m=+6107.564691266" lastFinishedPulling="2026-03-18 10:44:01.864756529 +0000 UTC m=+6108.439501369" observedRunningTime="2026-03-18 10:44:02.260695586 +0000 UTC m=+6108.835440466" watchObservedRunningTime="2026-03-18 10:44:02.269099394 +0000 UTC m=+6108.843844234"
Mar 18 10:44:03 crc kubenswrapper[4778]: I0318 10:44:03.187074 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5"
Mar 18 10:44:03 crc kubenswrapper[4778]: I0318 10:44:03.325635 4778 generic.go:334] "Generic (PLEG): container finished" podID="2098dac3-962e-4d75-b22f-81aadc768dc6" containerID="88b38e0fbddd0bbafde023ebaf3f6bdcd76dd7a995f8e2d8d9c48c114d683213" exitCode=0
Mar 18 10:44:03 crc kubenswrapper[4778]: I0318 10:44:03.325692 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563844-j7kb9" event={"ID":"2098dac3-962e-4d75-b22f-81aadc768dc6","Type":"ContainerDied","Data":"88b38e0fbddd0bbafde023ebaf3f6bdcd76dd7a995f8e2d8d9c48c114d683213"}
Mar 18 10:44:04 crc kubenswrapper[4778]: I0318 10:44:04.337132 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"90db09640ef7e7d8376edb316416510275b2a613d72a3ab935ed1da863b06852"}
Mar 18 10:44:04 crc kubenswrapper[4778]: I0318 10:44:04.864095 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563844-j7kb9"
Mar 18 10:44:04 crc kubenswrapper[4778]: I0318 10:44:04.948883 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5z8sj\" (UniqueName: \"kubernetes.io/projected/2098dac3-962e-4d75-b22f-81aadc768dc6-kube-api-access-5z8sj\") pod \"2098dac3-962e-4d75-b22f-81aadc768dc6\" (UID: \"2098dac3-962e-4d75-b22f-81aadc768dc6\") "
Mar 18 10:44:04 crc kubenswrapper[4778]: I0318 10:44:04.955663 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2098dac3-962e-4d75-b22f-81aadc768dc6-kube-api-access-5z8sj" (OuterVolumeSpecName: "kube-api-access-5z8sj") pod "2098dac3-962e-4d75-b22f-81aadc768dc6" (UID: "2098dac3-962e-4d75-b22f-81aadc768dc6"). InnerVolumeSpecName "kube-api-access-5z8sj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:44:05 crc kubenswrapper[4778]: I0318 10:44:05.051598 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5z8sj\" (UniqueName: \"kubernetes.io/projected/2098dac3-962e-4d75-b22f-81aadc768dc6-kube-api-access-5z8sj\") on node \"crc\" DevicePath \"\""
Mar 18 10:44:05 crc kubenswrapper[4778]: I0318 10:44:05.333917 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563838-xhh59"]
Mar 18 10:44:05 crc kubenswrapper[4778]: I0318 10:44:05.342546 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563838-xhh59"]
Mar 18 10:44:05 crc kubenswrapper[4778]: I0318 10:44:05.347067 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563844-j7kb9" event={"ID":"2098dac3-962e-4d75-b22f-81aadc768dc6","Type":"ContainerDied","Data":"ce2c426af705132d69bfde3949558052b8bdd7965ad71de8653d098877649038"}
Mar 18 10:44:05 crc kubenswrapper[4778]: I0318 10:44:05.347113 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce2c426af705132d69bfde3949558052b8bdd7965ad71de8653d098877649038"
Mar 18 10:44:05 crc kubenswrapper[4778]: I0318 10:44:05.347134 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563844-j7kb9"
Mar 18 10:44:06 crc kubenswrapper[4778]: I0318 10:44:06.196945 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06d0e7b4-0fff-4364-bef0-a408acdbcdbb" path="/var/lib/kubelet/pods/06d0e7b4-0fff-4364-bef0-a408acdbcdbb/volumes"
Mar 18 10:44:36 crc kubenswrapper[4778]: I0318 10:44:36.679184 4778 scope.go:117] "RemoveContainer" containerID="646650d3ca5a64cb0621f10e2cff5ed5ebd691e96f89d98ddbd2ca48d396fddb"
Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.171068 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct"]
Mar 18 10:45:00 crc kubenswrapper[4778]: E0318 10:45:00.172215 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2098dac3-962e-4d75-b22f-81aadc768dc6" containerName="oc"
Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.172232 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2098dac3-962e-4d75-b22f-81aadc768dc6" containerName="oc"
Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.172480 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2098dac3-962e-4d75-b22f-81aadc768dc6" containerName="oc"
Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.173252 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct"
Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.177157 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.177460 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.205308 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct"]
Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.357084 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-secret-volume\") pod \"collect-profiles-29563845-s56ct\" (UID: \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct"
Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.357334 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbf52\" (UniqueName: \"kubernetes.io/projected/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-kube-api-access-xbf52\") pod \"collect-profiles-29563845-s56ct\" (UID: \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct"
Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.357653 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-config-volume\") pod \"collect-profiles-29563845-s56ct\" (UID: \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct"
Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.461068 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbf52\" (UniqueName: \"kubernetes.io/projected/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-kube-api-access-xbf52\") pod \"collect-profiles-29563845-s56ct\" (UID: \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct"
Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.461169 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-config-volume\") pod \"collect-profiles-29563845-s56ct\" (UID: \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct"
Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.461304 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-secret-volume\") pod \"collect-profiles-29563845-s56ct\" (UID: \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct"
Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.463039 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-config-volume\") pod \"collect-profiles-29563845-s56ct\" (UID: \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct"
Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.476410 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-secret-volume\") pod \"collect-profiles-29563845-s56ct\" (UID: \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct"
Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.490816 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbf52\" (UniqueName: \"kubernetes.io/projected/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-kube-api-access-xbf52\") pod \"collect-profiles-29563845-s56ct\" (UID: \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct"
Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.503539 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct"
Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.966876 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct"]
Mar 18 10:45:00 crc kubenswrapper[4778]: I0318 10:45:00.990532 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct" event={"ID":"b11278a5-f162-4ea2-abf3-dd1176b7ef1f","Type":"ContainerStarted","Data":"d67285ff0e86fa649bded598eaf80fa023059cdc1cd55d6ed0139326637ce920"}
Mar 18 10:45:02 crc kubenswrapper[4778]: I0318 10:45:02.000395 4778 generic.go:334] "Generic (PLEG): container finished" podID="b11278a5-f162-4ea2-abf3-dd1176b7ef1f" containerID="52074f34225e99f56da69aa3b63f9abb2e62cdce5a1ef7447dd39a3f7432ef79" exitCode=0
Mar 18 10:45:02 crc kubenswrapper[4778]: I0318 10:45:02.000519 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct" event={"ID":"b11278a5-f162-4ea2-abf3-dd1176b7ef1f","Type":"ContainerDied","Data":"52074f34225e99f56da69aa3b63f9abb2e62cdce5a1ef7447dd39a3f7432ef79"}
Mar 18 10:45:03 crc kubenswrapper[4778]: I0318 10:45:03.474077 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct"
Mar 18 10:45:03 crc kubenswrapper[4778]: I0318 10:45:03.629571 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-secret-volume\") pod \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\" (UID: \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\") "
Mar 18 10:45:03 crc kubenswrapper[4778]: I0318 10:45:03.629946 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-config-volume\") pod \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\" (UID: \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\") "
Mar 18 10:45:03 crc kubenswrapper[4778]: I0318 10:45:03.630038 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbf52\" (UniqueName: \"kubernetes.io/projected/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-kube-api-access-xbf52\") pod \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\" (UID: \"b11278a5-f162-4ea2-abf3-dd1176b7ef1f\") "
Mar 18 10:45:03 crc kubenswrapper[4778]: I0318 10:45:03.630566 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-config-volume" (OuterVolumeSpecName: "config-volume") pod "b11278a5-f162-4ea2-abf3-dd1176b7ef1f" (UID: "b11278a5-f162-4ea2-abf3-dd1176b7ef1f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 10:45:03 crc kubenswrapper[4778]: I0318 10:45:03.630899 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-config-volume\") on node \"crc\" DevicePath \"\""
Mar 18 10:45:03 crc kubenswrapper[4778]: I0318 10:45:03.635903 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b11278a5-f162-4ea2-abf3-dd1176b7ef1f" (UID: "b11278a5-f162-4ea2-abf3-dd1176b7ef1f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 10:45:03 crc kubenswrapper[4778]: I0318 10:45:03.636380 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-kube-api-access-xbf52" (OuterVolumeSpecName: "kube-api-access-xbf52") pod "b11278a5-f162-4ea2-abf3-dd1176b7ef1f" (UID: "b11278a5-f162-4ea2-abf3-dd1176b7ef1f"). InnerVolumeSpecName "kube-api-access-xbf52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:45:03 crc kubenswrapper[4778]: I0318 10:45:03.731873 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbf52\" (UniqueName: \"kubernetes.io/projected/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-kube-api-access-xbf52\") on node \"crc\" DevicePath \"\""
Mar 18 10:45:03 crc kubenswrapper[4778]: I0318 10:45:03.731910 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b11278a5-f162-4ea2-abf3-dd1176b7ef1f-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 18 10:45:04 crc kubenswrapper[4778]: I0318 10:45:04.026236 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct" event={"ID":"b11278a5-f162-4ea2-abf3-dd1176b7ef1f","Type":"ContainerDied","Data":"d67285ff0e86fa649bded598eaf80fa023059cdc1cd55d6ed0139326637ce920"}
Mar 18 10:45:04 crc kubenswrapper[4778]: I0318 10:45:04.026301 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d67285ff0e86fa649bded598eaf80fa023059cdc1cd55d6ed0139326637ce920"
Mar 18 10:45:04 crc kubenswrapper[4778]: I0318 10:45:04.026391 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-s56ct"
Mar 18 10:45:04 crc kubenswrapper[4778]: I0318 10:45:04.565695 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl"]
Mar 18 10:45:04 crc kubenswrapper[4778]: I0318 10:45:04.576906 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563800-2v8nl"]
Mar 18 10:45:06 crc kubenswrapper[4778]: I0318 10:45:06.207160 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca9f1133-0fec-4eeb-8b9b-39148a035a92" path="/var/lib/kubelet/pods/ca9f1133-0fec-4eeb-8b9b-39148a035a92/volumes"
Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.623381 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v6wj7"]
Mar 18 10:45:33 crc kubenswrapper[4778]: E0318 10:45:33.624782 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b11278a5-f162-4ea2-abf3-dd1176b7ef1f" containerName="collect-profiles"
Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.624809 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b11278a5-f162-4ea2-abf3-dd1176b7ef1f" containerName="collect-profiles"
Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.625146 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b11278a5-f162-4ea2-abf3-dd1176b7ef1f" containerName="collect-profiles"
Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.628055 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6wj7"
Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.635726 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6wj7"]
Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.775253 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x4jn\" (UniqueName: \"kubernetes.io/projected/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-kube-api-access-4x4jn\") pod \"redhat-marketplace-v6wj7\" (UID: \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\") " pod="openshift-marketplace/redhat-marketplace-v6wj7"
Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.775468 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-utilities\") pod \"redhat-marketplace-v6wj7\" (UID: \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\") " pod="openshift-marketplace/redhat-marketplace-v6wj7"
Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.775499 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-catalog-content\") pod \"redhat-marketplace-v6wj7\" (UID: \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\") " pod="openshift-marketplace/redhat-marketplace-v6wj7"
Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.877533 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-utilities\") pod \"redhat-marketplace-v6wj7\" (UID: \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\") " pod="openshift-marketplace/redhat-marketplace-v6wj7"
Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.877595 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-catalog-content\") pod \"redhat-marketplace-v6wj7\" (UID: \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\") " pod="openshift-marketplace/redhat-marketplace-v6wj7"
Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.877657 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x4jn\" (UniqueName: \"kubernetes.io/projected/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-kube-api-access-4x4jn\") pod \"redhat-marketplace-v6wj7\" (UID: \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\") " pod="openshift-marketplace/redhat-marketplace-v6wj7"
Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.878273 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-utilities\") pod \"redhat-marketplace-v6wj7\" (UID: \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\") " pod="openshift-marketplace/redhat-marketplace-v6wj7"
Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.878326 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-catalog-content\") pod \"redhat-marketplace-v6wj7\" (UID: \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\") " pod="openshift-marketplace/redhat-marketplace-v6wj7"
Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.898749 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x4jn\" (UniqueName: \"kubernetes.io/projected/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-kube-api-access-4x4jn\") pod \"redhat-marketplace-v6wj7\" (UID: \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\") " pod="openshift-marketplace/redhat-marketplace-v6wj7"
Mar 18 10:45:33 crc kubenswrapper[4778]: I0318 10:45:33.961793 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6wj7"
Mar 18 10:45:34 crc kubenswrapper[4778]: W0318 10:45:34.435703 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64c0c4b8_b1b6_4912_88c7_61be4fe4b899.slice/crio-f35f951c4605b5ffd52c3be40975f85b6fb2b61cc01c221181a9065aa7ef06fc WatchSource:0}: Error finding container f35f951c4605b5ffd52c3be40975f85b6fb2b61cc01c221181a9065aa7ef06fc: Status 404 returned error can't find the container with id f35f951c4605b5ffd52c3be40975f85b6fb2b61cc01c221181a9065aa7ef06fc
Mar 18 10:45:34 crc kubenswrapper[4778]: I0318 10:45:34.439601 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6wj7"]
Mar 18 10:45:35 crc kubenswrapper[4778]: I0318 10:45:35.366890 4778 generic.go:334] "Generic (PLEG): container finished" podID="64c0c4b8-b1b6-4912-88c7-61be4fe4b899" containerID="2b5e465563f7ec910954e772a2b785eda7e5624337ef1a1f1a006082dc773d44" exitCode=0
Mar 18 10:45:35 crc kubenswrapper[4778]: I0318 10:45:35.366962 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6wj7" event={"ID":"64c0c4b8-b1b6-4912-88c7-61be4fe4b899","Type":"ContainerDied","Data":"2b5e465563f7ec910954e772a2b785eda7e5624337ef1a1f1a006082dc773d44"}
Mar 18 10:45:35 crc kubenswrapper[4778]: I0318 10:45:35.367539 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6wj7" event={"ID":"64c0c4b8-b1b6-4912-88c7-61be4fe4b899","Type":"ContainerStarted","Data":"f35f951c4605b5ffd52c3be40975f85b6fb2b61cc01c221181a9065aa7ef06fc"}
Mar 18 10:45:36 crc kubenswrapper[4778]: I0318 10:45:36.378985 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6wj7" event={"ID":"64c0c4b8-b1b6-4912-88c7-61be4fe4b899","Type":"ContainerStarted","Data":"be80897685ea21f954c4b045dcf01ba346624ef6567adee8be169c9fa1dc92f3"}
Mar 18 10:45:36 crc kubenswrapper[4778]: I0318 10:45:36.751957 4778 scope.go:117] "RemoveContainer" containerID="20eba30be4d8526eb64b11cc9e3c58803630e3554035c19c9650d8cecb2ebf82"
Mar 18 10:45:37 crc kubenswrapper[4778]: I0318 10:45:37.386543 4778 generic.go:334] "Generic (PLEG): container finished" podID="64c0c4b8-b1b6-4912-88c7-61be4fe4b899" containerID="be80897685ea21f954c4b045dcf01ba346624ef6567adee8be169c9fa1dc92f3" exitCode=0
Mar 18 10:45:37 crc kubenswrapper[4778]: I0318 10:45:37.386570 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6wj7" event={"ID":"64c0c4b8-b1b6-4912-88c7-61be4fe4b899","Type":"ContainerDied","Data":"be80897685ea21f954c4b045dcf01ba346624ef6567adee8be169c9fa1dc92f3"}
Mar 18 10:45:38 crc kubenswrapper[4778]: I0318 10:45:38.401667 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6wj7" event={"ID":"64c0c4b8-b1b6-4912-88c7-61be4fe4b899","Type":"ContainerStarted","Data":"7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810"}
Mar 18 10:45:38 crc kubenswrapper[4778]: I0318 10:45:38.427170 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v6wj7" podStartSLOduration=2.991236952 podStartE2EDuration="5.427156829s" podCreationTimestamp="2026-03-18 10:45:33 +0000 UTC" firstStartedPulling="2026-03-18 10:45:35.369626895 +0000 UTC m=+6201.944371735" lastFinishedPulling="2026-03-18 10:45:37.805546762 +0000 UTC m=+6204.380291612" observedRunningTime="2026-03-18 10:45:38.418625128 +0000 UTC m=+6204.993369978" watchObservedRunningTime="2026-03-18 10:45:38.427156829 +0000 UTC m=+6205.001901659"
Mar 18 10:45:43 crc kubenswrapper[4778]: I0318 10:45:43.962927 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v6wj7"
Mar 18 10:45:43 crc kubenswrapper[4778]: I0318 10:45:43.963599 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v6wj7"
Mar 18 10:45:44 crc kubenswrapper[4778]: I0318 10:45:44.008642 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v6wj7"
Mar 18 10:45:44 crc kubenswrapper[4778]: I0318 10:45:44.514650 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v6wj7"
Mar 18 10:45:44 crc kubenswrapper[4778]: I0318 10:45:44.578380 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6wj7"]
Mar 18 10:45:46 crc kubenswrapper[4778]: I0318 10:45:46.485152 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v6wj7" podUID="64c0c4b8-b1b6-4912-88c7-61be4fe4b899" containerName="registry-server" containerID="cri-o://7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810" gracePeriod=2
Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.434313 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6wj7"
Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.495470 4778 generic.go:334] "Generic (PLEG): container finished" podID="64c0c4b8-b1b6-4912-88c7-61be4fe4b899" containerID="7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810" exitCode=0
Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.495500 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6wj7" event={"ID":"64c0c4b8-b1b6-4912-88c7-61be4fe4b899","Type":"ContainerDied","Data":"7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810"}
Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.495533 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6wj7"
Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.495551 4778 scope.go:117] "RemoveContainer" containerID="7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810"
Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.495540 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6wj7" event={"ID":"64c0c4b8-b1b6-4912-88c7-61be4fe4b899","Type":"ContainerDied","Data":"f35f951c4605b5ffd52c3be40975f85b6fb2b61cc01c221181a9065aa7ef06fc"}
Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.513977 4778 scope.go:117] "RemoveContainer" containerID="be80897685ea21f954c4b045dcf01ba346624ef6567adee8be169c9fa1dc92f3"
Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.534345 4778 scope.go:117] "RemoveContainer" containerID="2b5e465563f7ec910954e772a2b785eda7e5624337ef1a1f1a006082dc773d44"
Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.590059 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-utilities\") pod \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\" (UID: \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\") "
Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.590225 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x4jn\" (UniqueName: \"kubernetes.io/projected/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-kube-api-access-4x4jn\") pod \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\" (UID: \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\") "
Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.590298 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-catalog-content\") pod \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\" (UID: \"64c0c4b8-b1b6-4912-88c7-61be4fe4b899\") "
Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.591442 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-utilities" (OuterVolumeSpecName: "utilities") pod "64c0c4b8-b1b6-4912-88c7-61be4fe4b899" (UID: "64c0c4b8-b1b6-4912-88c7-61be4fe4b899"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.594490 4778 scope.go:117] "RemoveContainer" containerID="7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810"
Mar 18 10:45:47 crc kubenswrapper[4778]: E0318 10:45:47.595049 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810\": container with ID starting with 7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810 not found: ID does not exist" containerID="7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810"
Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.595098 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810"} err="failed to get container status \"7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810\": rpc error: code = NotFound desc = could not find container \"7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810\": container with ID starting with 7bc510708e0980338c6ab0691a0d1dd7954fcda0f60677f20c2f8eaf98463810 not found: ID does not exist"
Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.595127 4778 scope.go:117] "RemoveContainer" containerID="be80897685ea21f954c4b045dcf01ba346624ef6567adee8be169c9fa1dc92f3"
Mar 18 10:45:47 crc kubenswrapper[4778]: E0318 10:45:47.595503 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be80897685ea21f954c4b045dcf01ba346624ef6567adee8be169c9fa1dc92f3\": container with ID starting with be80897685ea21f954c4b045dcf01ba346624ef6567adee8be169c9fa1dc92f3 not found: ID does not exist" containerID="be80897685ea21f954c4b045dcf01ba346624ef6567adee8be169c9fa1dc92f3"
Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.595546
4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be80897685ea21f954c4b045dcf01ba346624ef6567adee8be169c9fa1dc92f3"} err="failed to get container status \"be80897685ea21f954c4b045dcf01ba346624ef6567adee8be169c9fa1dc92f3\": rpc error: code = NotFound desc = could not find container \"be80897685ea21f954c4b045dcf01ba346624ef6567adee8be169c9fa1dc92f3\": container with ID starting with be80897685ea21f954c4b045dcf01ba346624ef6567adee8be169c9fa1dc92f3 not found: ID does not exist" Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.595561 4778 scope.go:117] "RemoveContainer" containerID="2b5e465563f7ec910954e772a2b785eda7e5624337ef1a1f1a006082dc773d44" Mar 18 10:45:47 crc kubenswrapper[4778]: E0318 10:45:47.596260 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b5e465563f7ec910954e772a2b785eda7e5624337ef1a1f1a006082dc773d44\": container with ID starting with 2b5e465563f7ec910954e772a2b785eda7e5624337ef1a1f1a006082dc773d44 not found: ID does not exist" containerID="2b5e465563f7ec910954e772a2b785eda7e5624337ef1a1f1a006082dc773d44" Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.596307 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b5e465563f7ec910954e772a2b785eda7e5624337ef1a1f1a006082dc773d44"} err="failed to get container status \"2b5e465563f7ec910954e772a2b785eda7e5624337ef1a1f1a006082dc773d44\": rpc error: code = NotFound desc = could not find container \"2b5e465563f7ec910954e772a2b785eda7e5624337ef1a1f1a006082dc773d44\": container with ID starting with 2b5e465563f7ec910954e772a2b785eda7e5624337ef1a1f1a006082dc773d44 not found: ID does not exist" Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.598057 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-kube-api-access-4x4jn" 
(OuterVolumeSpecName: "kube-api-access-4x4jn") pod "64c0c4b8-b1b6-4912-88c7-61be4fe4b899" (UID: "64c0c4b8-b1b6-4912-88c7-61be4fe4b899"). InnerVolumeSpecName "kube-api-access-4x4jn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.615562 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64c0c4b8-b1b6-4912-88c7-61be4fe4b899" (UID: "64c0c4b8-b1b6-4912-88c7-61be4fe4b899"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.692609 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x4jn\" (UniqueName: \"kubernetes.io/projected/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-kube-api-access-4x4jn\") on node \"crc\" DevicePath \"\"" Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.692647 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.692661 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64c0c4b8-b1b6-4912-88c7-61be4fe4b899-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.848954 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6wj7"] Mar 18 10:45:47 crc kubenswrapper[4778]: I0318 10:45:47.860675 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6wj7"] Mar 18 10:45:48 crc kubenswrapper[4778]: I0318 10:45:48.211391 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="64c0c4b8-b1b6-4912-88c7-61be4fe4b899" path="/var/lib/kubelet/pods/64c0c4b8-b1b6-4912-88c7-61be4fe4b899/volumes" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.182152 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563846-jwrl9"] Mar 18 10:46:00 crc kubenswrapper[4778]: E0318 10:46:00.183114 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64c0c4b8-b1b6-4912-88c7-61be4fe4b899" containerName="extract-content" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.183131 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c0c4b8-b1b6-4912-88c7-61be4fe4b899" containerName="extract-content" Mar 18 10:46:00 crc kubenswrapper[4778]: E0318 10:46:00.183156 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64c0c4b8-b1b6-4912-88c7-61be4fe4b899" containerName="extract-utilities" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.183163 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c0c4b8-b1b6-4912-88c7-61be4fe4b899" containerName="extract-utilities" Mar 18 10:46:00 crc kubenswrapper[4778]: E0318 10:46:00.183190 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64c0c4b8-b1b6-4912-88c7-61be4fe4b899" containerName="registry-server" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.183256 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c0c4b8-b1b6-4912-88c7-61be4fe4b899" containerName="registry-server" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.183516 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="64c0c4b8-b1b6-4912-88c7-61be4fe4b899" containerName="registry-server" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.184318 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563846-jwrl9" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.187360 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.187418 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.187924 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.200756 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563846-jwrl9"] Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.359416 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slpdj\" (UniqueName: \"kubernetes.io/projected/2ed10f0a-3d2d-483e-9532-dd1f7b38631b-kube-api-access-slpdj\") pod \"auto-csr-approver-29563846-jwrl9\" (UID: \"2ed10f0a-3d2d-483e-9532-dd1f7b38631b\") " pod="openshift-infra/auto-csr-approver-29563846-jwrl9" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.461889 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slpdj\" (UniqueName: \"kubernetes.io/projected/2ed10f0a-3d2d-483e-9532-dd1f7b38631b-kube-api-access-slpdj\") pod \"auto-csr-approver-29563846-jwrl9\" (UID: \"2ed10f0a-3d2d-483e-9532-dd1f7b38631b\") " pod="openshift-infra/auto-csr-approver-29563846-jwrl9" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.486871 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slpdj\" (UniqueName: \"kubernetes.io/projected/2ed10f0a-3d2d-483e-9532-dd1f7b38631b-kube-api-access-slpdj\") pod \"auto-csr-approver-29563846-jwrl9\" (UID: \"2ed10f0a-3d2d-483e-9532-dd1f7b38631b\") " 
pod="openshift-infra/auto-csr-approver-29563846-jwrl9" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.504944 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563846-jwrl9" Mar 18 10:46:00 crc kubenswrapper[4778]: I0318 10:46:00.969949 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563846-jwrl9"] Mar 18 10:46:01 crc kubenswrapper[4778]: I0318 10:46:01.626112 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563846-jwrl9" event={"ID":"2ed10f0a-3d2d-483e-9532-dd1f7b38631b","Type":"ContainerStarted","Data":"2eb014649fd8d480769737cbee9ea9c54962845af511b22a4fdca375f6c6cb50"} Mar 18 10:46:02 crc kubenswrapper[4778]: I0318 10:46:02.636940 4778 generic.go:334] "Generic (PLEG): container finished" podID="2ed10f0a-3d2d-483e-9532-dd1f7b38631b" containerID="a51630f7ff38c957b6d8be33f92679338164d3fd19d236304cf23699728f1e4b" exitCode=0 Mar 18 10:46:02 crc kubenswrapper[4778]: I0318 10:46:02.637012 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563846-jwrl9" event={"ID":"2ed10f0a-3d2d-483e-9532-dd1f7b38631b","Type":"ContainerDied","Data":"a51630f7ff38c957b6d8be33f92679338164d3fd19d236304cf23699728f1e4b"} Mar 18 10:46:04 crc kubenswrapper[4778]: I0318 10:46:04.028107 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563846-jwrl9" Mar 18 10:46:04 crc kubenswrapper[4778]: I0318 10:46:04.139102 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slpdj\" (UniqueName: \"kubernetes.io/projected/2ed10f0a-3d2d-483e-9532-dd1f7b38631b-kube-api-access-slpdj\") pod \"2ed10f0a-3d2d-483e-9532-dd1f7b38631b\" (UID: \"2ed10f0a-3d2d-483e-9532-dd1f7b38631b\") " Mar 18 10:46:04 crc kubenswrapper[4778]: I0318 10:46:04.161532 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ed10f0a-3d2d-483e-9532-dd1f7b38631b-kube-api-access-slpdj" (OuterVolumeSpecName: "kube-api-access-slpdj") pod "2ed10f0a-3d2d-483e-9532-dd1f7b38631b" (UID: "2ed10f0a-3d2d-483e-9532-dd1f7b38631b"). InnerVolumeSpecName "kube-api-access-slpdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:46:04 crc kubenswrapper[4778]: I0318 10:46:04.242220 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slpdj\" (UniqueName: \"kubernetes.io/projected/2ed10f0a-3d2d-483e-9532-dd1f7b38631b-kube-api-access-slpdj\") on node \"crc\" DevicePath \"\"" Mar 18 10:46:04 crc kubenswrapper[4778]: I0318 10:46:04.661742 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563846-jwrl9" event={"ID":"2ed10f0a-3d2d-483e-9532-dd1f7b38631b","Type":"ContainerDied","Data":"2eb014649fd8d480769737cbee9ea9c54962845af511b22a4fdca375f6c6cb50"} Mar 18 10:46:04 crc kubenswrapper[4778]: I0318 10:46:04.661782 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2eb014649fd8d480769737cbee9ea9c54962845af511b22a4fdca375f6c6cb50" Mar 18 10:46:04 crc kubenswrapper[4778]: I0318 10:46:04.662289 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563846-jwrl9" Mar 18 10:46:05 crc kubenswrapper[4778]: I0318 10:46:05.108626 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563840-fv5mm"] Mar 18 10:46:05 crc kubenswrapper[4778]: I0318 10:46:05.119449 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563840-fv5mm"] Mar 18 10:46:06 crc kubenswrapper[4778]: I0318 10:46:06.202389 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc3d0dd7-380a-4f1c-bb78-f8df1a73362e" path="/var/lib/kubelet/pods/dc3d0dd7-380a-4f1c-bb78-f8df1a73362e/volumes" Mar 18 10:46:30 crc kubenswrapper[4778]: I0318 10:46:30.147560 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:46:30 crc kubenswrapper[4778]: I0318 10:46:30.148225 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:46:36 crc kubenswrapper[4778]: I0318 10:46:36.807735 4778 scope.go:117] "RemoveContainer" containerID="105700f78835bb2b225e76573d66e982ecb74a475715ed5f6fa69ca1e19eafce" Mar 18 10:47:00 crc kubenswrapper[4778]: I0318 10:47:00.147116 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:47:00 crc kubenswrapper[4778]: 
I0318 10:47:00.147865 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:47:30 crc kubenswrapper[4778]: I0318 10:47:30.147603 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:47:30 crc kubenswrapper[4778]: I0318 10:47:30.148509 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:47:30 crc kubenswrapper[4778]: I0318 10:47:30.148591 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 10:47:30 crc kubenswrapper[4778]: I0318 10:47:30.149935 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"90db09640ef7e7d8376edb316416510275b2a613d72a3ab935ed1da863b06852"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 10:47:30 crc kubenswrapper[4778]: I0318 10:47:30.150054 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" 
containerName="machine-config-daemon" containerID="cri-o://90db09640ef7e7d8376edb316416510275b2a613d72a3ab935ed1da863b06852" gracePeriod=600 Mar 18 10:47:30 crc kubenswrapper[4778]: I0318 10:47:30.633867 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="90db09640ef7e7d8376edb316416510275b2a613d72a3ab935ed1da863b06852" exitCode=0 Mar 18 10:47:30 crc kubenswrapper[4778]: I0318 10:47:30.634239 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"90db09640ef7e7d8376edb316416510275b2a613d72a3ab935ed1da863b06852"} Mar 18 10:47:30 crc kubenswrapper[4778]: I0318 10:47:30.634279 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa"} Mar 18 10:47:30 crc kubenswrapper[4778]: I0318 10:47:30.634299 4778 scope.go:117] "RemoveContainer" containerID="b27a64502a0fe50c7c128e74dacfa68178e374be599d146d9c3b76abe61e0fa5" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.055762 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9z6mr"] Mar 18 10:47:33 crc kubenswrapper[4778]: E0318 10:47:33.056804 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed10f0a-3d2d-483e-9532-dd1f7b38631b" containerName="oc" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.056822 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed10f0a-3d2d-483e-9532-dd1f7b38631b" containerName="oc" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.057081 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ed10f0a-3d2d-483e-9532-dd1f7b38631b" containerName="oc" Mar 18 10:47:33 crc 
kubenswrapper[4778]: I0318 10:47:33.059161 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.084104 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9z6mr"] Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.225140 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvlt8\" (UniqueName: \"kubernetes.io/projected/6bd497db-2065-4c53-8a1e-1499f18fb717-kube-api-access-rvlt8\") pod \"certified-operators-9z6mr\" (UID: \"6bd497db-2065-4c53-8a1e-1499f18fb717\") " pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.225625 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bd497db-2065-4c53-8a1e-1499f18fb717-utilities\") pod \"certified-operators-9z6mr\" (UID: \"6bd497db-2065-4c53-8a1e-1499f18fb717\") " pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.225768 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bd497db-2065-4c53-8a1e-1499f18fb717-catalog-content\") pod \"certified-operators-9z6mr\" (UID: \"6bd497db-2065-4c53-8a1e-1499f18fb717\") " pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.327604 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bd497db-2065-4c53-8a1e-1499f18fb717-utilities\") pod \"certified-operators-9z6mr\" (UID: \"6bd497db-2065-4c53-8a1e-1499f18fb717\") " pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:33 crc 
kubenswrapper[4778]: I0318 10:47:33.327707 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bd497db-2065-4c53-8a1e-1499f18fb717-catalog-content\") pod \"certified-operators-9z6mr\" (UID: \"6bd497db-2065-4c53-8a1e-1499f18fb717\") " pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.327736 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvlt8\" (UniqueName: \"kubernetes.io/projected/6bd497db-2065-4c53-8a1e-1499f18fb717-kube-api-access-rvlt8\") pod \"certified-operators-9z6mr\" (UID: \"6bd497db-2065-4c53-8a1e-1499f18fb717\") " pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.328381 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bd497db-2065-4c53-8a1e-1499f18fb717-catalog-content\") pod \"certified-operators-9z6mr\" (UID: \"6bd497db-2065-4c53-8a1e-1499f18fb717\") " pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.328540 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bd497db-2065-4c53-8a1e-1499f18fb717-utilities\") pod \"certified-operators-9z6mr\" (UID: \"6bd497db-2065-4c53-8a1e-1499f18fb717\") " pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.352049 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvlt8\" (UniqueName: \"kubernetes.io/projected/6bd497db-2065-4c53-8a1e-1499f18fb717-kube-api-access-rvlt8\") pod \"certified-operators-9z6mr\" (UID: \"6bd497db-2065-4c53-8a1e-1499f18fb717\") " pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 
10:47:33.393350 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:33 crc kubenswrapper[4778]: I0318 10:47:33.953908 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9z6mr"] Mar 18 10:47:34 crc kubenswrapper[4778]: I0318 10:47:34.680937 4778 generic.go:334] "Generic (PLEG): container finished" podID="6bd497db-2065-4c53-8a1e-1499f18fb717" containerID="87b678c7a077ca1d39c7cc44dea6816882d37033c786db0a661443484b0219c5" exitCode=0 Mar 18 10:47:34 crc kubenswrapper[4778]: I0318 10:47:34.681280 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9z6mr" event={"ID":"6bd497db-2065-4c53-8a1e-1499f18fb717","Type":"ContainerDied","Data":"87b678c7a077ca1d39c7cc44dea6816882d37033c786db0a661443484b0219c5"} Mar 18 10:47:34 crc kubenswrapper[4778]: I0318 10:47:34.681308 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9z6mr" event={"ID":"6bd497db-2065-4c53-8a1e-1499f18fb717","Type":"ContainerStarted","Data":"bd6bd7135302f1dcdf1cf4e0ba01837911d65c1b90f886c2591d77d03440840e"} Mar 18 10:47:35 crc kubenswrapper[4778]: I0318 10:47:35.691041 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9z6mr" event={"ID":"6bd497db-2065-4c53-8a1e-1499f18fb717","Type":"ContainerStarted","Data":"2e948e69c53702790d51f6f0689df107eeaff9a9e59eccbd96b5562474ef2064"} Mar 18 10:47:35 crc kubenswrapper[4778]: I0318 10:47:35.694116 4778 generic.go:334] "Generic (PLEG): container finished" podID="757e3758-d646-4267-8c4c-b5efb0dcf709" containerID="c559ae3a1e4423e99c37d72f15f18f3cd16bc2838d62270df411dbac2afa6c1e" exitCode=0 Mar 18 10:47:35 crc kubenswrapper[4778]: I0318 10:47:35.694168 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-full" 
event={"ID":"757e3758-d646-4267-8c4c-b5efb0dcf709","Type":"ContainerDied","Data":"c559ae3a1e4423e99c37d72f15f18f3cd16bc2838d62270df411dbac2afa6c1e"} Mar 18 10:47:36 crc kubenswrapper[4778]: I0318 10:47:36.706465 4778 generic.go:334] "Generic (PLEG): container finished" podID="6bd497db-2065-4c53-8a1e-1499f18fb717" containerID="2e948e69c53702790d51f6f0689df107eeaff9a9e59eccbd96b5562474ef2064" exitCode=0 Mar 18 10:47:36 crc kubenswrapper[4778]: I0318 10:47:36.706570 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9z6mr" event={"ID":"6bd497db-2065-4c53-8a1e-1499f18fb717","Type":"ContainerDied","Data":"2e948e69c53702790d51f6f0689df107eeaff9a9e59eccbd96b5562474ef2064"} Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.542442 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s00-full" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.632144 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest-s01-single-test"] Mar 18 10:47:37 crc kubenswrapper[4778]: E0318 10:47:37.646415 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="757e3758-d646-4267-8c4c-b5efb0dcf709" containerName="tempest-tests-tempest-tests-runner" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.646536 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="757e3758-d646-4267-8c4c-b5efb0dcf709" containerName="tempest-tests-tempest-tests-runner" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.649187 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="757e3758-d646-4267-8c4c-b5efb0dcf709" containerName="tempest-tests-tempest-tests-runner" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.654030 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.670341 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s1" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.670680 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s1" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.671223 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s01-single-test"] Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.716419 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-full" event={"ID":"757e3758-d646-4267-8c4c-b5efb0dcf709","Type":"ContainerDied","Data":"dcac98cd78d62b2f03dd429a022d38c29d36c13fc170b830ecbd627ba6023d27"} Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.716474 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcac98cd78d62b2f03dd429a022d38c29d36c13fc170b830ecbd627ba6023d27" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.716731 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s00-full" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.717290 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/757e3758-d646-4267-8c4c-b5efb0dcf709-config-data\") pod \"757e3758-d646-4267-8c4c-b5efb0dcf709\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.717389 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/757e3758-d646-4267-8c4c-b5efb0dcf709-openstack-config\") pod \"757e3758-d646-4267-8c4c-b5efb0dcf709\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.717505 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ssh-key\") pod \"757e3758-d646-4267-8c4c-b5efb0dcf709\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.717597 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-openstack-config-secret\") pod \"757e3758-d646-4267-8c4c-b5efb0dcf709\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.717615 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ca-certs\") pod \"757e3758-d646-4267-8c4c-b5efb0dcf709\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.717664 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/757e3758-d646-4267-8c4c-b5efb0dcf709-test-operator-ephemeral-temporary\") pod \"757e3758-d646-4267-8c4c-b5efb0dcf709\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.717703 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c89k6\" (UniqueName: \"kubernetes.io/projected/757e3758-d646-4267-8c4c-b5efb0dcf709-kube-api-access-c89k6\") pod \"757e3758-d646-4267-8c4c-b5efb0dcf709\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.717719 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"757e3758-d646-4267-8c4c-b5efb0dcf709\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.717740 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/757e3758-d646-4267-8c4c-b5efb0dcf709-test-operator-ephemeral-workdir\") pod \"757e3758-d646-4267-8c4c-b5efb0dcf709\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.717756 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ceph\") pod \"757e3758-d646-4267-8c4c-b5efb0dcf709\" (UID: \"757e3758-d646-4267-8c4c-b5efb0dcf709\") " Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.720856 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/757e3758-d646-4267-8c4c-b5efb0dcf709-config-data" (OuterVolumeSpecName: "config-data") pod "757e3758-d646-4267-8c4c-b5efb0dcf709" (UID: 
"757e3758-d646-4267-8c4c-b5efb0dcf709"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.721076 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9z6mr" event={"ID":"6bd497db-2065-4c53-8a1e-1499f18fb717","Type":"ContainerStarted","Data":"872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142"} Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.721636 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/757e3758-d646-4267-8c4c-b5efb0dcf709-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "757e3758-d646-4267-8c4c-b5efb0dcf709" (UID: "757e3758-d646-4267-8c4c-b5efb0dcf709"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.728538 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/757e3758-d646-4267-8c4c-b5efb0dcf709-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "757e3758-d646-4267-8c4c-b5efb0dcf709" (UID: "757e3758-d646-4267-8c4c-b5efb0dcf709"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.731688 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "test-operator-logs") pod "757e3758-d646-4267-8c4c-b5efb0dcf709" (UID: "757e3758-d646-4267-8c4c-b5efb0dcf709"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.738433 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/757e3758-d646-4267-8c4c-b5efb0dcf709-kube-api-access-c89k6" (OuterVolumeSpecName: "kube-api-access-c89k6") pod "757e3758-d646-4267-8c4c-b5efb0dcf709" (UID: "757e3758-d646-4267-8c4c-b5efb0dcf709"). InnerVolumeSpecName "kube-api-access-c89k6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.739237 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ceph" (OuterVolumeSpecName: "ceph") pod "757e3758-d646-4267-8c4c-b5efb0dcf709" (UID: "757e3758-d646-4267-8c4c-b5efb0dcf709"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.750143 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "757e3758-d646-4267-8c4c-b5efb0dcf709" (UID: "757e3758-d646-4267-8c4c-b5efb0dcf709"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.757119 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "757e3758-d646-4267-8c4c-b5efb0dcf709" (UID: "757e3758-d646-4267-8c4c-b5efb0dcf709"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.772041 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/757e3758-d646-4267-8c4c-b5efb0dcf709-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "757e3758-d646-4267-8c4c-b5efb0dcf709" (UID: "757e3758-d646-4267-8c4c-b5efb0dcf709"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.774914 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "757e3758-d646-4267-8c4c-b5efb0dcf709" (UID: "757e3758-d646-4267-8c4c-b5efb0dcf709"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.819491 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx57k\" (UniqueName: \"kubernetes.io/projected/c5a7a532-f8c2-4741-9892-65047a4cb225-kube-api-access-sx57k\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.819535 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c5a7a532-f8c2-4741-9892-65047a4cb225-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.819570 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" 
(UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ca-certs\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.819756 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5a7a532-f8c2-4741-9892-65047a4cb225-config-data\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.819866 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.819906 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c5a7a532-f8c2-4741-9892-65047a4cb225-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.820017 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 
10:47:37.820069 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c5a7a532-f8c2-4741-9892-65047a4cb225-openstack-config\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.820104 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ceph\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.820142 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ssh-key\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.820404 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c89k6\" (UniqueName: \"kubernetes.io/projected/757e3758-d646-4267-8c4c-b5efb0dcf709-kube-api-access-c89k6\") on node \"crc\" DevicePath \"\"" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.820482 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/757e3758-d646-4267-8c4c-b5efb0dcf709-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.820505 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ceph\") on node 
\"crc\" DevicePath \"\"" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.820521 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/757e3758-d646-4267-8c4c-b5efb0dcf709-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.820537 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/757e3758-d646-4267-8c4c-b5efb0dcf709-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.820547 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.820556 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.820565 4778 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/757e3758-d646-4267-8c4c-b5efb0dcf709-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.820575 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/757e3758-d646-4267-8c4c-b5efb0dcf709-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.848656 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " 
pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.922006 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.922050 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c5a7a532-f8c2-4741-9892-65047a4cb225-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.922132 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c5a7a532-f8c2-4741-9892-65047a4cb225-openstack-config\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.922168 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ceph\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.922220 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ssh-key\") pod \"tempest-tests-tempest-s01-single-test\" 
(UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.922261 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx57k\" (UniqueName: \"kubernetes.io/projected/c5a7a532-f8c2-4741-9892-65047a4cb225-kube-api-access-sx57k\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.922277 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c5a7a532-f8c2-4741-9892-65047a4cb225-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.922303 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ca-certs\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.922335 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5a7a532-f8c2-4741-9892-65047a4cb225-config-data\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.922968 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/c5a7a532-f8c2-4741-9892-65047a4cb225-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.923300 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c5a7a532-f8c2-4741-9892-65047a4cb225-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.923860 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c5a7a532-f8c2-4741-9892-65047a4cb225-openstack-config\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.924082 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5a7a532-f8c2-4741-9892-65047a4cb225-config-data\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.926344 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ssh-key\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.926748 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceph\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ceph\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.927215 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ca-certs\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.929077 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.938010 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx57k\" (UniqueName: \"kubernetes.io/projected/c5a7a532-f8c2-4741-9892-65047a4cb225-kube-api-access-sx57k\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:37 crc kubenswrapper[4778]: I0318 10:47:37.991162 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:47:38 crc kubenswrapper[4778]: I0318 10:47:38.544023 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9z6mr" podStartSLOduration=2.943335798 podStartE2EDuration="5.544001582s" podCreationTimestamp="2026-03-18 10:47:33 +0000 UTC" firstStartedPulling="2026-03-18 10:47:34.683316296 +0000 UTC m=+6321.258061136" lastFinishedPulling="2026-03-18 10:47:37.28398207 +0000 UTC m=+6323.858726920" observedRunningTime="2026-03-18 10:47:37.761790391 +0000 UTC m=+6324.336535241" watchObservedRunningTime="2026-03-18 10:47:38.544001582 +0000 UTC m=+6325.118746422" Mar 18 10:47:38 crc kubenswrapper[4778]: I0318 10:47:38.547224 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s01-single-test"] Mar 18 10:47:38 crc kubenswrapper[4778]: I0318 10:47:38.730843 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-test" event={"ID":"c5a7a532-f8c2-4741-9892-65047a4cb225","Type":"ContainerStarted","Data":"1ee8cf024ce4398d1f0ed48d786bbb6b3add9e2f95a7fd4bf27b0fad0caf4251"} Mar 18 10:47:39 crc kubenswrapper[4778]: I0318 10:47:39.741232 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-test" event={"ID":"c5a7a532-f8c2-4741-9892-65047a4cb225","Type":"ContainerStarted","Data":"44ebfbf33b960c39e1ffc52c3185dc0dc1ec7c33f6f6b2ba0c1b6ca80065a482"} Mar 18 10:47:39 crc kubenswrapper[4778]: I0318 10:47:39.768090 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest-s01-single-test" podStartSLOduration=2.768072983 podStartE2EDuration="2.768072983s" podCreationTimestamp="2026-03-18 10:47:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:47:39.764717282 
+0000 UTC m=+6326.339462142" watchObservedRunningTime="2026-03-18 10:47:39.768072983 +0000 UTC m=+6326.342817823" Mar 18 10:47:43 crc kubenswrapper[4778]: I0318 10:47:43.393754 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:43 crc kubenswrapper[4778]: I0318 10:47:43.395962 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:43 crc kubenswrapper[4778]: I0318 10:47:43.454483 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:43 crc kubenswrapper[4778]: I0318 10:47:43.834777 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:43 crc kubenswrapper[4778]: I0318 10:47:43.890120 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9z6mr"] Mar 18 10:47:45 crc kubenswrapper[4778]: I0318 10:47:45.807937 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9z6mr" podUID="6bd497db-2065-4c53-8a1e-1499f18fb717" containerName="registry-server" containerID="cri-o://872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142" gracePeriod=2 Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.783763 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.825751 4778 generic.go:334] "Generic (PLEG): container finished" podID="6bd497db-2065-4c53-8a1e-1499f18fb717" containerID="872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142" exitCode=0 Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.826174 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9z6mr" event={"ID":"6bd497db-2065-4c53-8a1e-1499f18fb717","Type":"ContainerDied","Data":"872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142"} Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.826238 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9z6mr" event={"ID":"6bd497db-2065-4c53-8a1e-1499f18fb717","Type":"ContainerDied","Data":"bd6bd7135302f1dcdf1cf4e0ba01837911d65c1b90f886c2591d77d03440840e"} Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.826264 4778 scope.go:117] "RemoveContainer" containerID="872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142" Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.826330 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9z6mr" Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.867693 4778 scope.go:117] "RemoveContainer" containerID="2e948e69c53702790d51f6f0689df107eeaff9a9e59eccbd96b5562474ef2064" Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.911677 4778 scope.go:117] "RemoveContainer" containerID="87b678c7a077ca1d39c7cc44dea6816882d37033c786db0a661443484b0219c5" Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.915859 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bd497db-2065-4c53-8a1e-1499f18fb717-utilities\") pod \"6bd497db-2065-4c53-8a1e-1499f18fb717\" (UID: \"6bd497db-2065-4c53-8a1e-1499f18fb717\") " Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.915963 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bd497db-2065-4c53-8a1e-1499f18fb717-catalog-content\") pod \"6bd497db-2065-4c53-8a1e-1499f18fb717\" (UID: \"6bd497db-2065-4c53-8a1e-1499f18fb717\") " Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.916159 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvlt8\" (UniqueName: \"kubernetes.io/projected/6bd497db-2065-4c53-8a1e-1499f18fb717-kube-api-access-rvlt8\") pod \"6bd497db-2065-4c53-8a1e-1499f18fb717\" (UID: \"6bd497db-2065-4c53-8a1e-1499f18fb717\") " Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.918083 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bd497db-2065-4c53-8a1e-1499f18fb717-utilities" (OuterVolumeSpecName: "utilities") pod "6bd497db-2065-4c53-8a1e-1499f18fb717" (UID: "6bd497db-2065-4c53-8a1e-1499f18fb717"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.922130 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bd497db-2065-4c53-8a1e-1499f18fb717-kube-api-access-rvlt8" (OuterVolumeSpecName: "kube-api-access-rvlt8") pod "6bd497db-2065-4c53-8a1e-1499f18fb717" (UID: "6bd497db-2065-4c53-8a1e-1499f18fb717"). InnerVolumeSpecName "kube-api-access-rvlt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:47:46 crc kubenswrapper[4778]: I0318 10:47:46.974573 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bd497db-2065-4c53-8a1e-1499f18fb717-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6bd497db-2065-4c53-8a1e-1499f18fb717" (UID: "6bd497db-2065-4c53-8a1e-1499f18fb717"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:47:47 crc kubenswrapper[4778]: I0318 10:47:47.005543 4778 scope.go:117] "RemoveContainer" containerID="872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142" Mar 18 10:47:47 crc kubenswrapper[4778]: E0318 10:47:47.005886 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142\": container with ID starting with 872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142 not found: ID does not exist" containerID="872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142" Mar 18 10:47:47 crc kubenswrapper[4778]: I0318 10:47:47.005940 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142"} err="failed to get container status \"872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142\": rpc error: code = NotFound desc = could not find 
container \"872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142\": container with ID starting with 872e0b1de820c983d2f9c82d16076600a5833a7d89bd19a788402c3b62d2f142 not found: ID does not exist" Mar 18 10:47:47 crc kubenswrapper[4778]: I0318 10:47:47.005964 4778 scope.go:117] "RemoveContainer" containerID="2e948e69c53702790d51f6f0689df107eeaff9a9e59eccbd96b5562474ef2064" Mar 18 10:47:47 crc kubenswrapper[4778]: E0318 10:47:47.006414 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e948e69c53702790d51f6f0689df107eeaff9a9e59eccbd96b5562474ef2064\": container with ID starting with 2e948e69c53702790d51f6f0689df107eeaff9a9e59eccbd96b5562474ef2064 not found: ID does not exist" containerID="2e948e69c53702790d51f6f0689df107eeaff9a9e59eccbd96b5562474ef2064" Mar 18 10:47:47 crc kubenswrapper[4778]: I0318 10:47:47.006472 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e948e69c53702790d51f6f0689df107eeaff9a9e59eccbd96b5562474ef2064"} err="failed to get container status \"2e948e69c53702790d51f6f0689df107eeaff9a9e59eccbd96b5562474ef2064\": rpc error: code = NotFound desc = could not find container \"2e948e69c53702790d51f6f0689df107eeaff9a9e59eccbd96b5562474ef2064\": container with ID starting with 2e948e69c53702790d51f6f0689df107eeaff9a9e59eccbd96b5562474ef2064 not found: ID does not exist" Mar 18 10:47:47 crc kubenswrapper[4778]: I0318 10:47:47.006504 4778 scope.go:117] "RemoveContainer" containerID="87b678c7a077ca1d39c7cc44dea6816882d37033c786db0a661443484b0219c5" Mar 18 10:47:47 crc kubenswrapper[4778]: E0318 10:47:47.006792 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87b678c7a077ca1d39c7cc44dea6816882d37033c786db0a661443484b0219c5\": container with ID starting with 87b678c7a077ca1d39c7cc44dea6816882d37033c786db0a661443484b0219c5 not found: ID does 
not exist" containerID="87b678c7a077ca1d39c7cc44dea6816882d37033c786db0a661443484b0219c5"
Mar 18 10:47:47 crc kubenswrapper[4778]: I0318 10:47:47.006814 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87b678c7a077ca1d39c7cc44dea6816882d37033c786db0a661443484b0219c5"} err="failed to get container status \"87b678c7a077ca1d39c7cc44dea6816882d37033c786db0a661443484b0219c5\": rpc error: code = NotFound desc = could not find container \"87b678c7a077ca1d39c7cc44dea6816882d37033c786db0a661443484b0219c5\": container with ID starting with 87b678c7a077ca1d39c7cc44dea6816882d37033c786db0a661443484b0219c5 not found: ID does not exist"
Mar 18 10:47:47 crc kubenswrapper[4778]: I0318 10:47:47.019962 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bd497db-2065-4c53-8a1e-1499f18fb717-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 10:47:47 crc kubenswrapper[4778]: I0318 10:47:47.019987 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bd497db-2065-4c53-8a1e-1499f18fb717-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 10:47:47 crc kubenswrapper[4778]: I0318 10:47:47.019997 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvlt8\" (UniqueName: \"kubernetes.io/projected/6bd497db-2065-4c53-8a1e-1499f18fb717-kube-api-access-rvlt8\") on node \"crc\" DevicePath \"\""
Mar 18 10:47:47 crc kubenswrapper[4778]: I0318 10:47:47.161629 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9z6mr"]
Mar 18 10:47:47 crc kubenswrapper[4778]: I0318 10:47:47.171056 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9z6mr"]
Mar 18 10:47:48 crc kubenswrapper[4778]: I0318 10:47:48.206689 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bd497db-2065-4c53-8a1e-1499f18fb717" path="/var/lib/kubelet/pods/6bd497db-2065-4c53-8a1e-1499f18fb717/volumes"
Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.148569 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563848-f28cg"]
Mar 18 10:48:00 crc kubenswrapper[4778]: E0318 10:48:00.149290 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bd497db-2065-4c53-8a1e-1499f18fb717" containerName="registry-server"
Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.149302 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd497db-2065-4c53-8a1e-1499f18fb717" containerName="registry-server"
Mar 18 10:48:00 crc kubenswrapper[4778]: E0318 10:48:00.149328 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bd497db-2065-4c53-8a1e-1499f18fb717" containerName="extract-content"
Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.149335 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd497db-2065-4c53-8a1e-1499f18fb717" containerName="extract-content"
Mar 18 10:48:00 crc kubenswrapper[4778]: E0318 10:48:00.149355 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bd497db-2065-4c53-8a1e-1499f18fb717" containerName="extract-utilities"
Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.149361 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd497db-2065-4c53-8a1e-1499f18fb717" containerName="extract-utilities"
Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.149529 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bd497db-2065-4c53-8a1e-1499f18fb717" containerName="registry-server"
Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.150102 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563848-f28cg"
Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.170039 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6"
Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.170560 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.170992 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.174503 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563848-f28cg"]
Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.294073 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g7w4\" (UniqueName: \"kubernetes.io/projected/49327474-2bad-4ebc-b955-bf9dd1268c5e-kube-api-access-2g7w4\") pod \"auto-csr-approver-29563848-f28cg\" (UID: \"49327474-2bad-4ebc-b955-bf9dd1268c5e\") " pod="openshift-infra/auto-csr-approver-29563848-f28cg"
Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.397267 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g7w4\" (UniqueName: \"kubernetes.io/projected/49327474-2bad-4ebc-b955-bf9dd1268c5e-kube-api-access-2g7w4\") pod \"auto-csr-approver-29563848-f28cg\" (UID: \"49327474-2bad-4ebc-b955-bf9dd1268c5e\") " pod="openshift-infra/auto-csr-approver-29563848-f28cg"
Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.432242 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g7w4\" (UniqueName: \"kubernetes.io/projected/49327474-2bad-4ebc-b955-bf9dd1268c5e-kube-api-access-2g7w4\") pod \"auto-csr-approver-29563848-f28cg\" (UID: \"49327474-2bad-4ebc-b955-bf9dd1268c5e\") " pod="openshift-infra/auto-csr-approver-29563848-f28cg"
Mar 18 10:48:00 crc kubenswrapper[4778]: I0318 10:48:00.467963 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563848-f28cg"
Mar 18 10:48:01 crc kubenswrapper[4778]: I0318 10:48:01.006901 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563848-f28cg"]
Mar 18 10:48:01 crc kubenswrapper[4778]: I0318 10:48:01.983125 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563848-f28cg" event={"ID":"49327474-2bad-4ebc-b955-bf9dd1268c5e","Type":"ContainerStarted","Data":"9dd2005bf3609ca5c74b14f29aa73689791f9e708f340d3d96513f2c3aef7ae9"}
Mar 18 10:48:02 crc kubenswrapper[4778]: I0318 10:48:02.993333 4778 generic.go:334] "Generic (PLEG): container finished" podID="49327474-2bad-4ebc-b955-bf9dd1268c5e" containerID="f8a6db8312e875033fdd1f73d7983e85a274f3fbde6864a2faaec123c194e5c8" exitCode=0
Mar 18 10:48:02 crc kubenswrapper[4778]: I0318 10:48:02.993402 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563848-f28cg" event={"ID":"49327474-2bad-4ebc-b955-bf9dd1268c5e","Type":"ContainerDied","Data":"f8a6db8312e875033fdd1f73d7983e85a274f3fbde6864a2faaec123c194e5c8"}
Mar 18 10:48:04 crc kubenswrapper[4778]: I0318 10:48:04.380538 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563848-f28cg"
Mar 18 10:48:04 crc kubenswrapper[4778]: I0318 10:48:04.486165 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2g7w4\" (UniqueName: \"kubernetes.io/projected/49327474-2bad-4ebc-b955-bf9dd1268c5e-kube-api-access-2g7w4\") pod \"49327474-2bad-4ebc-b955-bf9dd1268c5e\" (UID: \"49327474-2bad-4ebc-b955-bf9dd1268c5e\") "
Mar 18 10:48:04 crc kubenswrapper[4778]: I0318 10:48:04.500345 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49327474-2bad-4ebc-b955-bf9dd1268c5e-kube-api-access-2g7w4" (OuterVolumeSpecName: "kube-api-access-2g7w4") pod "49327474-2bad-4ebc-b955-bf9dd1268c5e" (UID: "49327474-2bad-4ebc-b955-bf9dd1268c5e"). InnerVolumeSpecName "kube-api-access-2g7w4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:48:04 crc kubenswrapper[4778]: I0318 10:48:04.590692 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2g7w4\" (UniqueName: \"kubernetes.io/projected/49327474-2bad-4ebc-b955-bf9dd1268c5e-kube-api-access-2g7w4\") on node \"crc\" DevicePath \"\""
Mar 18 10:48:05 crc kubenswrapper[4778]: I0318 10:48:05.017268 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563848-f28cg" event={"ID":"49327474-2bad-4ebc-b955-bf9dd1268c5e","Type":"ContainerDied","Data":"9dd2005bf3609ca5c74b14f29aa73689791f9e708f340d3d96513f2c3aef7ae9"}
Mar 18 10:48:05 crc kubenswrapper[4778]: I0318 10:48:05.017576 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9dd2005bf3609ca5c74b14f29aa73689791f9e708f340d3d96513f2c3aef7ae9"
Mar 18 10:48:05 crc kubenswrapper[4778]: I0318 10:48:05.017369 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563848-f28cg"
Mar 18 10:48:05 crc kubenswrapper[4778]: I0318 10:48:05.456677 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563842-ckkmp"]
Mar 18 10:48:05 crc kubenswrapper[4778]: I0318 10:48:05.464025 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563842-ckkmp"]
Mar 18 10:48:06 crc kubenswrapper[4778]: I0318 10:48:06.198711 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d10faaed-ffef-4afb-9f75-262e4fccd22a" path="/var/lib/kubelet/pods/d10faaed-ffef-4afb-9f75-262e4fccd22a/volumes"
Mar 18 10:48:36 crc kubenswrapper[4778]: I0318 10:48:36.914648 4778 scope.go:117] "RemoveContainer" containerID="cec3fa048e9699e703eb8a3404384f6f46bb6a98f37648f4a97cf2fe11dab009"
Mar 18 10:49:05 crc kubenswrapper[4778]: I0318 10:49:05.988538 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pgmtp"]
Mar 18 10:49:05 crc kubenswrapper[4778]: E0318 10:49:05.989706 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49327474-2bad-4ebc-b955-bf9dd1268c5e" containerName="oc"
Mar 18 10:49:05 crc kubenswrapper[4778]: I0318 10:49:05.989731 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="49327474-2bad-4ebc-b955-bf9dd1268c5e" containerName="oc"
Mar 18 10:49:05 crc kubenswrapper[4778]: I0318 10:49:05.990063 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="49327474-2bad-4ebc-b955-bf9dd1268c5e" containerName="oc"
Mar 18 10:49:05 crc kubenswrapper[4778]: I0318 10:49:05.992486 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pgmtp"
Mar 18 10:49:06 crc kubenswrapper[4778]: I0318 10:49:06.011131 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pgmtp"]
Mar 18 10:49:06 crc kubenswrapper[4778]: I0318 10:49:06.044367 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cdec5ae-a923-4018-9a0b-400916a4273f-catalog-content\") pod \"redhat-operators-pgmtp\" (UID: \"0cdec5ae-a923-4018-9a0b-400916a4273f\") " pod="openshift-marketplace/redhat-operators-pgmtp"
Mar 18 10:49:06 crc kubenswrapper[4778]: I0318 10:49:06.044873 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jspvx\" (UniqueName: \"kubernetes.io/projected/0cdec5ae-a923-4018-9a0b-400916a4273f-kube-api-access-jspvx\") pod \"redhat-operators-pgmtp\" (UID: \"0cdec5ae-a923-4018-9a0b-400916a4273f\") " pod="openshift-marketplace/redhat-operators-pgmtp"
Mar 18 10:49:06 crc kubenswrapper[4778]: I0318 10:49:06.044938 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cdec5ae-a923-4018-9a0b-400916a4273f-utilities\") pod \"redhat-operators-pgmtp\" (UID: \"0cdec5ae-a923-4018-9a0b-400916a4273f\") " pod="openshift-marketplace/redhat-operators-pgmtp"
Mar 18 10:49:06 crc kubenswrapper[4778]: I0318 10:49:06.146345 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jspvx\" (UniqueName: \"kubernetes.io/projected/0cdec5ae-a923-4018-9a0b-400916a4273f-kube-api-access-jspvx\") pod \"redhat-operators-pgmtp\" (UID: \"0cdec5ae-a923-4018-9a0b-400916a4273f\") " pod="openshift-marketplace/redhat-operators-pgmtp"
Mar 18 10:49:06 crc kubenswrapper[4778]: I0318 10:49:06.146387 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cdec5ae-a923-4018-9a0b-400916a4273f-utilities\") pod \"redhat-operators-pgmtp\" (UID: \"0cdec5ae-a923-4018-9a0b-400916a4273f\") " pod="openshift-marketplace/redhat-operators-pgmtp"
Mar 18 10:49:06 crc kubenswrapper[4778]: I0318 10:49:06.146423 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cdec5ae-a923-4018-9a0b-400916a4273f-catalog-content\") pod \"redhat-operators-pgmtp\" (UID: \"0cdec5ae-a923-4018-9a0b-400916a4273f\") " pod="openshift-marketplace/redhat-operators-pgmtp"
Mar 18 10:49:06 crc kubenswrapper[4778]: I0318 10:49:06.147025 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cdec5ae-a923-4018-9a0b-400916a4273f-catalog-content\") pod \"redhat-operators-pgmtp\" (UID: \"0cdec5ae-a923-4018-9a0b-400916a4273f\") " pod="openshift-marketplace/redhat-operators-pgmtp"
Mar 18 10:49:06 crc kubenswrapper[4778]: I0318 10:49:06.147066 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cdec5ae-a923-4018-9a0b-400916a4273f-utilities\") pod \"redhat-operators-pgmtp\" (UID: \"0cdec5ae-a923-4018-9a0b-400916a4273f\") " pod="openshift-marketplace/redhat-operators-pgmtp"
Mar 18 10:49:06 crc kubenswrapper[4778]: I0318 10:49:06.170168 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jspvx\" (UniqueName: \"kubernetes.io/projected/0cdec5ae-a923-4018-9a0b-400916a4273f-kube-api-access-jspvx\") pod \"redhat-operators-pgmtp\" (UID: \"0cdec5ae-a923-4018-9a0b-400916a4273f\") " pod="openshift-marketplace/redhat-operators-pgmtp"
Mar 18 10:49:06 crc kubenswrapper[4778]: I0318 10:49:06.331950 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pgmtp"
Mar 18 10:49:06 crc kubenswrapper[4778]: I0318 10:49:06.790291 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pgmtp"]
Mar 18 10:49:07 crc kubenswrapper[4778]: I0318 10:49:07.779185 4778 generic.go:334] "Generic (PLEG): container finished" podID="0cdec5ae-a923-4018-9a0b-400916a4273f" containerID="66fb44ae04997b57ed998c5ea8572af6eb05cac07977a9100b22d0162c2799e9" exitCode=0
Mar 18 10:49:07 crc kubenswrapper[4778]: I0318 10:49:07.779268 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgmtp" event={"ID":"0cdec5ae-a923-4018-9a0b-400916a4273f","Type":"ContainerDied","Data":"66fb44ae04997b57ed998c5ea8572af6eb05cac07977a9100b22d0162c2799e9"}
Mar 18 10:49:07 crc kubenswrapper[4778]: I0318 10:49:07.782475 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgmtp" event={"ID":"0cdec5ae-a923-4018-9a0b-400916a4273f","Type":"ContainerStarted","Data":"1aec8bdc1619f777aa2c0574af940f8616b13917329f5392e60aba7ef8b31165"}
Mar 18 10:49:07 crc kubenswrapper[4778]: I0318 10:49:07.781652 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 10:49:09 crc kubenswrapper[4778]: I0318 10:49:09.806467 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgmtp" event={"ID":"0cdec5ae-a923-4018-9a0b-400916a4273f","Type":"ContainerStarted","Data":"9e245490522d2df11dde8c972afb270f5aa8175c32b64f35cf2e87d85f334a50"}
Mar 18 10:49:10 crc kubenswrapper[4778]: I0318 10:49:10.819991 4778 generic.go:334] "Generic (PLEG): container finished" podID="0cdec5ae-a923-4018-9a0b-400916a4273f" containerID="9e245490522d2df11dde8c972afb270f5aa8175c32b64f35cf2e87d85f334a50" exitCode=0
Mar 18 10:49:10 crc kubenswrapper[4778]: I0318 10:49:10.820053 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgmtp" event={"ID":"0cdec5ae-a923-4018-9a0b-400916a4273f","Type":"ContainerDied","Data":"9e245490522d2df11dde8c972afb270f5aa8175c32b64f35cf2e87d85f334a50"}
Mar 18 10:49:11 crc kubenswrapper[4778]: I0318 10:49:11.833953 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgmtp" event={"ID":"0cdec5ae-a923-4018-9a0b-400916a4273f","Type":"ContainerStarted","Data":"ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b"}
Mar 18 10:49:11 crc kubenswrapper[4778]: I0318 10:49:11.863711 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pgmtp" podStartSLOduration=3.290509345 podStartE2EDuration="6.863688217s" podCreationTimestamp="2026-03-18 10:49:05 +0000 UTC" firstStartedPulling="2026-03-18 10:49:07.781374108 +0000 UTC m=+6414.356118958" lastFinishedPulling="2026-03-18 10:49:11.35455299 +0000 UTC m=+6417.929297830" observedRunningTime="2026-03-18 10:49:11.858090246 +0000 UTC m=+6418.432835076" watchObservedRunningTime="2026-03-18 10:49:11.863688217 +0000 UTC m=+6418.438433067"
Mar 18 10:49:16 crc kubenswrapper[4778]: I0318 10:49:16.333105 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pgmtp"
Mar 18 10:49:16 crc kubenswrapper[4778]: I0318 10:49:16.333689 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pgmtp"
Mar 18 10:49:17 crc kubenswrapper[4778]: I0318 10:49:17.391295 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pgmtp" podUID="0cdec5ae-a923-4018-9a0b-400916a4273f" containerName="registry-server" probeResult="failure" output=<
Mar 18 10:49:17 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s
Mar 18 10:49:17 crc kubenswrapper[4778]: >
Mar 18 10:49:26 crc kubenswrapper[4778]: I0318 10:49:26.402249 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pgmtp"
Mar 18 10:49:26 crc kubenswrapper[4778]: I0318 10:49:26.507418 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pgmtp"
Mar 18 10:49:26 crc kubenswrapper[4778]: I0318 10:49:26.652627 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pgmtp"]
Mar 18 10:49:27 crc kubenswrapper[4778]: I0318 10:49:27.989664 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pgmtp" podUID="0cdec5ae-a923-4018-9a0b-400916a4273f" containerName="registry-server" containerID="cri-o://ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b" gracePeriod=2
Mar 18 10:49:28 crc kubenswrapper[4778]: I0318 10:49:28.531613 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pgmtp"
Mar 18 10:49:28 crc kubenswrapper[4778]: I0318 10:49:28.638251 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cdec5ae-a923-4018-9a0b-400916a4273f-utilities\") pod \"0cdec5ae-a923-4018-9a0b-400916a4273f\" (UID: \"0cdec5ae-a923-4018-9a0b-400916a4273f\") "
Mar 18 10:49:28 crc kubenswrapper[4778]: I0318 10:49:28.638374 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cdec5ae-a923-4018-9a0b-400916a4273f-catalog-content\") pod \"0cdec5ae-a923-4018-9a0b-400916a4273f\" (UID: \"0cdec5ae-a923-4018-9a0b-400916a4273f\") "
Mar 18 10:49:28 crc kubenswrapper[4778]: I0318 10:49:28.638613 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jspvx\" (UniqueName: \"kubernetes.io/projected/0cdec5ae-a923-4018-9a0b-400916a4273f-kube-api-access-jspvx\") pod \"0cdec5ae-a923-4018-9a0b-400916a4273f\" (UID: \"0cdec5ae-a923-4018-9a0b-400916a4273f\") "
Mar 18 10:49:28 crc kubenswrapper[4778]: I0318 10:49:28.639153 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cdec5ae-a923-4018-9a0b-400916a4273f-utilities" (OuterVolumeSpecName: "utilities") pod "0cdec5ae-a923-4018-9a0b-400916a4273f" (UID: "0cdec5ae-a923-4018-9a0b-400916a4273f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 10:49:28 crc kubenswrapper[4778]: I0318 10:49:28.643943 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cdec5ae-a923-4018-9a0b-400916a4273f-kube-api-access-jspvx" (OuterVolumeSpecName: "kube-api-access-jspvx") pod "0cdec5ae-a923-4018-9a0b-400916a4273f" (UID: "0cdec5ae-a923-4018-9a0b-400916a4273f"). InnerVolumeSpecName "kube-api-access-jspvx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:49:28 crc kubenswrapper[4778]: I0318 10:49:28.740861 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jspvx\" (UniqueName: \"kubernetes.io/projected/0cdec5ae-a923-4018-9a0b-400916a4273f-kube-api-access-jspvx\") on node \"crc\" DevicePath \"\""
Mar 18 10:49:28 crc kubenswrapper[4778]: I0318 10:49:28.741109 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cdec5ae-a923-4018-9a0b-400916a4273f-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 10:49:28 crc kubenswrapper[4778]: I0318 10:49:28.799970 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cdec5ae-a923-4018-9a0b-400916a4273f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0cdec5ae-a923-4018-9a0b-400916a4273f" (UID: "0cdec5ae-a923-4018-9a0b-400916a4273f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 10:49:28 crc kubenswrapper[4778]: I0318 10:49:28.842558 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cdec5ae-a923-4018-9a0b-400916a4273f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.006406 4778 generic.go:334] "Generic (PLEG): container finished" podID="0cdec5ae-a923-4018-9a0b-400916a4273f" containerID="ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b" exitCode=0
Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.006444 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgmtp" event={"ID":"0cdec5ae-a923-4018-9a0b-400916a4273f","Type":"ContainerDied","Data":"ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b"}
Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.006473 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pgmtp" event={"ID":"0cdec5ae-a923-4018-9a0b-400916a4273f","Type":"ContainerDied","Data":"1aec8bdc1619f777aa2c0574af940f8616b13917329f5392e60aba7ef8b31165"}
Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.006490 4778 scope.go:117] "RemoveContainer" containerID="ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b"
Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.007401 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pgmtp"
Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.049990 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pgmtp"]
Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.051754 4778 scope.go:117] "RemoveContainer" containerID="9e245490522d2df11dde8c972afb270f5aa8175c32b64f35cf2e87d85f334a50"
Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.061057 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pgmtp"]
Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.077956 4778 scope.go:117] "RemoveContainer" containerID="66fb44ae04997b57ed998c5ea8572af6eb05cac07977a9100b22d0162c2799e9"
Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.136339 4778 scope.go:117] "RemoveContainer" containerID="ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b"
Mar 18 10:49:29 crc kubenswrapper[4778]: E0318 10:49:29.136761 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b\": container with ID starting with ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b not found: ID does not exist" containerID="ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b"
Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.136817 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b"} err="failed to get container status \"ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b\": rpc error: code = NotFound desc = could not find container \"ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b\": container with ID starting with ab59f41e27cc012f3d06a8cc6b804124a371f529453d17f78514e207af8c3a0b not found: ID does not exist"
Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.136850 4778 scope.go:117] "RemoveContainer" containerID="9e245490522d2df11dde8c972afb270f5aa8175c32b64f35cf2e87d85f334a50"
Mar 18 10:49:29 crc kubenswrapper[4778]: E0318 10:49:29.137230 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e245490522d2df11dde8c972afb270f5aa8175c32b64f35cf2e87d85f334a50\": container with ID starting with 9e245490522d2df11dde8c972afb270f5aa8175c32b64f35cf2e87d85f334a50 not found: ID does not exist" containerID="9e245490522d2df11dde8c972afb270f5aa8175c32b64f35cf2e87d85f334a50"
Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.137262 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e245490522d2df11dde8c972afb270f5aa8175c32b64f35cf2e87d85f334a50"} err="failed to get container status \"9e245490522d2df11dde8c972afb270f5aa8175c32b64f35cf2e87d85f334a50\": rpc error: code = NotFound desc = could not find container \"9e245490522d2df11dde8c972afb270f5aa8175c32b64f35cf2e87d85f334a50\": container with ID starting with 9e245490522d2df11dde8c972afb270f5aa8175c32b64f35cf2e87d85f334a50 not found: ID does not exist"
Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.137300 4778 scope.go:117] "RemoveContainer" containerID="66fb44ae04997b57ed998c5ea8572af6eb05cac07977a9100b22d0162c2799e9"
Mar 18 10:49:29 crc kubenswrapper[4778]: E0318 10:49:29.137551 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66fb44ae04997b57ed998c5ea8572af6eb05cac07977a9100b22d0162c2799e9\": container with ID starting with 66fb44ae04997b57ed998c5ea8572af6eb05cac07977a9100b22d0162c2799e9 not found: ID does not exist" containerID="66fb44ae04997b57ed998c5ea8572af6eb05cac07977a9100b22d0162c2799e9"
Mar 18 10:49:29 crc kubenswrapper[4778]: I0318 10:49:29.137579 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66fb44ae04997b57ed998c5ea8572af6eb05cac07977a9100b22d0162c2799e9"} err="failed to get container status \"66fb44ae04997b57ed998c5ea8572af6eb05cac07977a9100b22d0162c2799e9\": rpc error: code = NotFound desc = could not find container \"66fb44ae04997b57ed998c5ea8572af6eb05cac07977a9100b22d0162c2799e9\": container with ID starting with 66fb44ae04997b57ed998c5ea8572af6eb05cac07977a9100b22d0162c2799e9 not found: ID does not exist"
Mar 18 10:49:30 crc kubenswrapper[4778]: I0318 10:49:30.147418 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 10:49:30 crc kubenswrapper[4778]: I0318 10:49:30.147770 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 10:49:30 crc kubenswrapper[4778]: I0318 10:49:30.197396 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cdec5ae-a923-4018-9a0b-400916a4273f" path="/var/lib/kubelet/pods/0cdec5ae-a923-4018-9a0b-400916a4273f/volumes"
Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.147193 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563850-jrcnr"]
Mar 18 10:50:00 crc kubenswrapper[4778]: E0318 10:50:00.148066 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cdec5ae-a923-4018-9a0b-400916a4273f" containerName="extract-content"
Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.148079 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cdec5ae-a923-4018-9a0b-400916a4273f" containerName="extract-content"
Mar 18 10:50:00 crc kubenswrapper[4778]: E0318 10:50:00.148097 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cdec5ae-a923-4018-9a0b-400916a4273f" containerName="registry-server"
Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.148103 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cdec5ae-a923-4018-9a0b-400916a4273f" containerName="registry-server"
Mar 18 10:50:00 crc kubenswrapper[4778]: E0318 10:50:00.148116 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cdec5ae-a923-4018-9a0b-400916a4273f" containerName="extract-utilities"
Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.148123 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cdec5ae-a923-4018-9a0b-400916a4273f" containerName="extract-utilities"
Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.148328 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cdec5ae-a923-4018-9a0b-400916a4273f" containerName="registry-server"
Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.149063 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563850-jrcnr"
Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.147653 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.149612 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.152937 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.154004 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.154262 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6"
Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.169841 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563850-jrcnr"]
Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.207504 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwmkn\" (UniqueName: \"kubernetes.io/projected/fc499d31-e373-413b-8a38-1fa69f007f2f-kube-api-access-bwmkn\") pod \"auto-csr-approver-29563850-jrcnr\" (UID: \"fc499d31-e373-413b-8a38-1fa69f007f2f\") " pod="openshift-infra/auto-csr-approver-29563850-jrcnr"
Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.310872 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwmkn\" (UniqueName: \"kubernetes.io/projected/fc499d31-e373-413b-8a38-1fa69f007f2f-kube-api-access-bwmkn\") pod \"auto-csr-approver-29563850-jrcnr\" (UID: \"fc499d31-e373-413b-8a38-1fa69f007f2f\") " pod="openshift-infra/auto-csr-approver-29563850-jrcnr"
Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.328830 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwmkn\" (UniqueName: \"kubernetes.io/projected/fc499d31-e373-413b-8a38-1fa69f007f2f-kube-api-access-bwmkn\") pod \"auto-csr-approver-29563850-jrcnr\" (UID: \"fc499d31-e373-413b-8a38-1fa69f007f2f\") " pod="openshift-infra/auto-csr-approver-29563850-jrcnr"
Mar 18 10:50:00 crc kubenswrapper[4778]: I0318 10:50:00.472938 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563850-jrcnr"
Mar 18 10:50:01 crc kubenswrapper[4778]: I0318 10:50:01.015985 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563850-jrcnr"]
Mar 18 10:50:01 crc kubenswrapper[4778]: W0318 10:50:01.023414 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc499d31_e373_413b_8a38_1fa69f007f2f.slice/crio-3f2a027b89707a6aaf4202a17985fbf462d8b0b22415b14007ece0b2c1942cf3 WatchSource:0}: Error finding container 3f2a027b89707a6aaf4202a17985fbf462d8b0b22415b14007ece0b2c1942cf3: Status 404 returned error can't find the container with id 3f2a027b89707a6aaf4202a17985fbf462d8b0b22415b14007ece0b2c1942cf3
Mar 18 10:50:01 crc kubenswrapper[4778]: I0318 10:50:01.353766 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563850-jrcnr" event={"ID":"fc499d31-e373-413b-8a38-1fa69f007f2f","Type":"ContainerStarted","Data":"3f2a027b89707a6aaf4202a17985fbf462d8b0b22415b14007ece0b2c1942cf3"}
Mar 18 10:50:03 crc kubenswrapper[4778]: I0318 10:50:03.378681 4778 generic.go:334] "Generic (PLEG): container finished" podID="fc499d31-e373-413b-8a38-1fa69f007f2f" containerID="f14aa13ab7520a46eb0a2d95277ca80a41dbb7bc18bd9145e42c1b01a855cabf" exitCode=0
Mar 18 10:50:03 crc kubenswrapper[4778]: I0318 10:50:03.378754 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563850-jrcnr" event={"ID":"fc499d31-e373-413b-8a38-1fa69f007f2f","Type":"ContainerDied","Data":"f14aa13ab7520a46eb0a2d95277ca80a41dbb7bc18bd9145e42c1b01a855cabf"}
Mar 18 10:50:04 crc kubenswrapper[4778]: I0318 10:50:04.877453 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563850-jrcnr"
Mar 18 10:50:05 crc kubenswrapper[4778]: I0318 10:50:05.071898 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwmkn\" (UniqueName: \"kubernetes.io/projected/fc499d31-e373-413b-8a38-1fa69f007f2f-kube-api-access-bwmkn\") pod \"fc499d31-e373-413b-8a38-1fa69f007f2f\" (UID: \"fc499d31-e373-413b-8a38-1fa69f007f2f\") "
Mar 18 10:50:05 crc kubenswrapper[4778]: I0318 10:50:05.079383 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc499d31-e373-413b-8a38-1fa69f007f2f-kube-api-access-bwmkn" (OuterVolumeSpecName: "kube-api-access-bwmkn") pod "fc499d31-e373-413b-8a38-1fa69f007f2f" (UID: "fc499d31-e373-413b-8a38-1fa69f007f2f"). InnerVolumeSpecName "kube-api-access-bwmkn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:50:05 crc kubenswrapper[4778]: I0318 10:50:05.174170 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwmkn\" (UniqueName: \"kubernetes.io/projected/fc499d31-e373-413b-8a38-1fa69f007f2f-kube-api-access-bwmkn\") on node \"crc\" DevicePath \"\""
Mar 18 10:50:05 crc kubenswrapper[4778]: I0318 10:50:05.403261 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563850-jrcnr" event={"ID":"fc499d31-e373-413b-8a38-1fa69f007f2f","Type":"ContainerDied","Data":"3f2a027b89707a6aaf4202a17985fbf462d8b0b22415b14007ece0b2c1942cf3"}
Mar 18 10:50:05 crc kubenswrapper[4778]: I0318 10:50:05.403314 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f2a027b89707a6aaf4202a17985fbf462d8b0b22415b14007ece0b2c1942cf3"
Mar 18 10:50:05 crc kubenswrapper[4778]: I0318 10:50:05.403392 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563850-jrcnr"
Mar 18 10:50:05 crc kubenswrapper[4778]: I0318 10:50:05.973777 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563844-j7kb9"]
Mar 18 10:50:05 crc kubenswrapper[4778]: I0318 10:50:05.983256 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563844-j7kb9"]
Mar 18 10:50:06 crc kubenswrapper[4778]: I0318 10:50:06.202971 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2098dac3-962e-4d75-b22f-81aadc768dc6" path="/var/lib/kubelet/pods/2098dac3-962e-4d75-b22f-81aadc768dc6/volumes"
Mar 18 10:50:30 crc kubenswrapper[4778]: I0318 10:50:30.146956 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 10:50:30 crc kubenswrapper[4778]: I0318 10:50:30.147507 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 10:50:30 crc kubenswrapper[4778]: I0318 10:50:30.147558 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7"
Mar 18 10:50:30 crc kubenswrapper[4778]: I0318 10:50:30.148425 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 10:50:30 crc kubenswrapper[4778]: I0318 10:50:30.148495 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" gracePeriod=600
Mar 18 10:50:30 crc kubenswrapper[4778]: E0318 10:50:30.269274 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 10:50:30 crc kubenswrapper[4778]: 
I0318 10:50:30.677287 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" exitCode=0 Mar 18 10:50:30 crc kubenswrapper[4778]: I0318 10:50:30.677363 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa"} Mar 18 10:50:30 crc kubenswrapper[4778]: I0318 10:50:30.677509 4778 scope.go:117] "RemoveContainer" containerID="90db09640ef7e7d8376edb316416510275b2a613d72a3ab935ed1da863b06852" Mar 18 10:50:30 crc kubenswrapper[4778]: I0318 10:50:30.678890 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:50:30 crc kubenswrapper[4778]: E0318 10:50:30.679339 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:50:37 crc kubenswrapper[4778]: I0318 10:50:37.063244 4778 scope.go:117] "RemoveContainer" containerID="88b38e0fbddd0bbafde023ebaf3f6bdcd76dd7a995f8e2d8d9c48c114d683213" Mar 18 10:50:42 crc kubenswrapper[4778]: I0318 10:50:42.188069 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:50:42 crc kubenswrapper[4778]: E0318 10:50:42.189384 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:50:56 crc kubenswrapper[4778]: I0318 10:50:56.187413 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:50:56 crc kubenswrapper[4778]: E0318 10:50:56.188813 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:51:07 crc kubenswrapper[4778]: I0318 10:51:07.188423 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:51:07 crc kubenswrapper[4778]: E0318 10:51:07.189718 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:51:19 crc kubenswrapper[4778]: I0318 10:51:19.188068 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:51:19 crc kubenswrapper[4778]: E0318 10:51:19.189078 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:51:32 crc kubenswrapper[4778]: I0318 10:51:32.187379 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:51:32 crc kubenswrapper[4778]: E0318 10:51:32.189226 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:51:47 crc kubenswrapper[4778]: I0318 10:51:47.188758 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:51:47 crc kubenswrapper[4778]: E0318 10:51:47.189744 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:52:00 crc kubenswrapper[4778]: I0318 10:52:00.145325 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563852-h8wf4"] Mar 18 10:52:00 crc kubenswrapper[4778]: E0318 10:52:00.146547 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc499d31-e373-413b-8a38-1fa69f007f2f" containerName="oc" Mar 18 10:52:00 crc 
kubenswrapper[4778]: I0318 10:52:00.146563 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc499d31-e373-413b-8a38-1fa69f007f2f" containerName="oc" Mar 18 10:52:00 crc kubenswrapper[4778]: I0318 10:52:00.146773 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc499d31-e373-413b-8a38-1fa69f007f2f" containerName="oc" Mar 18 10:52:00 crc kubenswrapper[4778]: I0318 10:52:00.147681 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563852-h8wf4" Mar 18 10:52:00 crc kubenswrapper[4778]: I0318 10:52:00.149519 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:52:00 crc kubenswrapper[4778]: I0318 10:52:00.150090 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:52:00 crc kubenswrapper[4778]: I0318 10:52:00.151498 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:52:00 crc kubenswrapper[4778]: I0318 10:52:00.155574 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563852-h8wf4"] Mar 18 10:52:00 crc kubenswrapper[4778]: I0318 10:52:00.239912 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh6d4\" (UniqueName: \"kubernetes.io/projected/902826c1-406d-4d16-8655-4a85ff4a3205-kube-api-access-bh6d4\") pod \"auto-csr-approver-29563852-h8wf4\" (UID: \"902826c1-406d-4d16-8655-4a85ff4a3205\") " pod="openshift-infra/auto-csr-approver-29563852-h8wf4" Mar 18 10:52:00 crc kubenswrapper[4778]: I0318 10:52:00.344176 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh6d4\" (UniqueName: \"kubernetes.io/projected/902826c1-406d-4d16-8655-4a85ff4a3205-kube-api-access-bh6d4\") pod \"auto-csr-approver-29563852-h8wf4\" 
(UID: \"902826c1-406d-4d16-8655-4a85ff4a3205\") " pod="openshift-infra/auto-csr-approver-29563852-h8wf4" Mar 18 10:52:00 crc kubenswrapper[4778]: I0318 10:52:00.367263 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh6d4\" (UniqueName: \"kubernetes.io/projected/902826c1-406d-4d16-8655-4a85ff4a3205-kube-api-access-bh6d4\") pod \"auto-csr-approver-29563852-h8wf4\" (UID: \"902826c1-406d-4d16-8655-4a85ff4a3205\") " pod="openshift-infra/auto-csr-approver-29563852-h8wf4" Mar 18 10:52:00 crc kubenswrapper[4778]: I0318 10:52:00.469956 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563852-h8wf4" Mar 18 10:52:00 crc kubenswrapper[4778]: I0318 10:52:00.926103 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563852-h8wf4"] Mar 18 10:52:01 crc kubenswrapper[4778]: I0318 10:52:01.613558 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563852-h8wf4" event={"ID":"902826c1-406d-4d16-8655-4a85ff4a3205","Type":"ContainerStarted","Data":"5edfd3af9b234546d9080f358abf5f5b13e6c14f3ab7072698599281d474283c"} Mar 18 10:52:02 crc kubenswrapper[4778]: I0318 10:52:02.187053 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:52:02 crc kubenswrapper[4778]: E0318 10:52:02.187563 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:52:02 crc kubenswrapper[4778]: I0318 10:52:02.628657 4778 generic.go:334] "Generic (PLEG): container finished" 
podID="902826c1-406d-4d16-8655-4a85ff4a3205" containerID="96707a6d26a4e59376b5ccb6d995399c8158c2cfc047ca02c91e8c3ceb00d6d6" exitCode=0 Mar 18 10:52:02 crc kubenswrapper[4778]: I0318 10:52:02.628742 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563852-h8wf4" event={"ID":"902826c1-406d-4d16-8655-4a85ff4a3205","Type":"ContainerDied","Data":"96707a6d26a4e59376b5ccb6d995399c8158c2cfc047ca02c91e8c3ceb00d6d6"} Mar 18 10:52:04 crc kubenswrapper[4778]: I0318 10:52:04.025721 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563852-h8wf4" Mar 18 10:52:04 crc kubenswrapper[4778]: I0318 10:52:04.118429 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh6d4\" (UniqueName: \"kubernetes.io/projected/902826c1-406d-4d16-8655-4a85ff4a3205-kube-api-access-bh6d4\") pod \"902826c1-406d-4d16-8655-4a85ff4a3205\" (UID: \"902826c1-406d-4d16-8655-4a85ff4a3205\") " Mar 18 10:52:04 crc kubenswrapper[4778]: I0318 10:52:04.128048 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/902826c1-406d-4d16-8655-4a85ff4a3205-kube-api-access-bh6d4" (OuterVolumeSpecName: "kube-api-access-bh6d4") pod "902826c1-406d-4d16-8655-4a85ff4a3205" (UID: "902826c1-406d-4d16-8655-4a85ff4a3205"). InnerVolumeSpecName "kube-api-access-bh6d4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:52:04 crc kubenswrapper[4778]: I0318 10:52:04.221713 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bh6d4\" (UniqueName: \"kubernetes.io/projected/902826c1-406d-4d16-8655-4a85ff4a3205-kube-api-access-bh6d4\") on node \"crc\" DevicePath \"\"" Mar 18 10:52:04 crc kubenswrapper[4778]: I0318 10:52:04.649778 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563852-h8wf4" event={"ID":"902826c1-406d-4d16-8655-4a85ff4a3205","Type":"ContainerDied","Data":"5edfd3af9b234546d9080f358abf5f5b13e6c14f3ab7072698599281d474283c"} Mar 18 10:52:04 crc kubenswrapper[4778]: I0318 10:52:04.649818 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5edfd3af9b234546d9080f358abf5f5b13e6c14f3ab7072698599281d474283c" Mar 18 10:52:04 crc kubenswrapper[4778]: I0318 10:52:04.649872 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563852-h8wf4" Mar 18 10:52:05 crc kubenswrapper[4778]: I0318 10:52:05.112396 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563846-jwrl9"] Mar 18 10:52:05 crc kubenswrapper[4778]: I0318 10:52:05.124786 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563846-jwrl9"] Mar 18 10:52:05 crc kubenswrapper[4778]: I0318 10:52:05.663491 4778 generic.go:334] "Generic (PLEG): container finished" podID="c5a7a532-f8c2-4741-9892-65047a4cb225" containerID="44ebfbf33b960c39e1ffc52c3185dc0dc1ec7c33f6f6b2ba0c1b6ca80065a482" exitCode=0 Mar 18 10:52:05 crc kubenswrapper[4778]: I0318 10:52:05.663533 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-test" 
event={"ID":"c5a7a532-f8c2-4741-9892-65047a4cb225","Type":"ContainerDied","Data":"44ebfbf33b960c39e1ffc52c3185dc0dc1ec7c33f6f6b2ba0c1b6ca80065a482"} Mar 18 10:52:06 crc kubenswrapper[4778]: I0318 10:52:06.197806 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ed10f0a-3d2d-483e-9532-dd1f7b38631b" path="/var/lib/kubelet/pods/2ed10f0a-3d2d-483e-9532-dd1f7b38631b/volumes" Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.086906 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.205360 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c5a7a532-f8c2-4741-9892-65047a4cb225-test-operator-ephemeral-temporary\") pod \"c5a7a532-f8c2-4741-9892-65047a4cb225\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.205539 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ca-certs\") pod \"c5a7a532-f8c2-4741-9892-65047a4cb225\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.205602 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ceph\") pod \"c5a7a532-f8c2-4741-9892-65047a4cb225\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.205644 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5a7a532-f8c2-4741-9892-65047a4cb225-config-data\") pod \"c5a7a532-f8c2-4741-9892-65047a4cb225\" (UID: 
\"c5a7a532-f8c2-4741-9892-65047a4cb225\") " Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.205693 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"c5a7a532-f8c2-4741-9892-65047a4cb225\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.205715 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx57k\" (UniqueName: \"kubernetes.io/projected/c5a7a532-f8c2-4741-9892-65047a4cb225-kube-api-access-sx57k\") pod \"c5a7a532-f8c2-4741-9892-65047a4cb225\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.205762 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c5a7a532-f8c2-4741-9892-65047a4cb225-openstack-config\") pod \"c5a7a532-f8c2-4741-9892-65047a4cb225\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.205847 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c5a7a532-f8c2-4741-9892-65047a4cb225-test-operator-ephemeral-workdir\") pod \"c5a7a532-f8c2-4741-9892-65047a4cb225\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.205870 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-openstack-config-secret\") pod \"c5a7a532-f8c2-4741-9892-65047a4cb225\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.205889 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ssh-key\") pod \"c5a7a532-f8c2-4741-9892-65047a4cb225\" (UID: \"c5a7a532-f8c2-4741-9892-65047a4cb225\") " Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.206264 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5a7a532-f8c2-4741-9892-65047a4cb225-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "c5a7a532-f8c2-4741-9892-65047a4cb225" (UID: "c5a7a532-f8c2-4741-9892-65047a4cb225"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.206380 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5a7a532-f8c2-4741-9892-65047a4cb225-config-data" (OuterVolumeSpecName: "config-data") pod "c5a7a532-f8c2-4741-9892-65047a4cb225" (UID: "c5a7a532-f8c2-4741-9892-65047a4cb225"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.206865 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5a7a532-f8c2-4741-9892-65047a4cb225-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.206889 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c5a7a532-f8c2-4741-9892-65047a4cb225-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.211987 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "test-operator-logs") pod "c5a7a532-f8c2-4741-9892-65047a4cb225" (UID: "c5a7a532-f8c2-4741-9892-65047a4cb225"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.212177 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ceph" (OuterVolumeSpecName: "ceph") pod "c5a7a532-f8c2-4741-9892-65047a4cb225" (UID: "c5a7a532-f8c2-4741-9892-65047a4cb225"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.212595 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5a7a532-f8c2-4741-9892-65047a4cb225-kube-api-access-sx57k" (OuterVolumeSpecName: "kube-api-access-sx57k") pod "c5a7a532-f8c2-4741-9892-65047a4cb225" (UID: "c5a7a532-f8c2-4741-9892-65047a4cb225"). InnerVolumeSpecName "kube-api-access-sx57k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.223760 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5a7a532-f8c2-4741-9892-65047a4cb225-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "c5a7a532-f8c2-4741-9892-65047a4cb225" (UID: "c5a7a532-f8c2-4741-9892-65047a4cb225"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.234970 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c5a7a532-f8c2-4741-9892-65047a4cb225" (UID: "c5a7a532-f8c2-4741-9892-65047a4cb225"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.235470 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "c5a7a532-f8c2-4741-9892-65047a4cb225" (UID: "c5a7a532-f8c2-4741-9892-65047a4cb225"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.237355 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c5a7a532-f8c2-4741-9892-65047a4cb225" (UID: "c5a7a532-f8c2-4741-9892-65047a4cb225"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.264365 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5a7a532-f8c2-4741-9892-65047a4cb225-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c5a7a532-f8c2-4741-9892-65047a4cb225" (UID: "c5a7a532-f8c2-4741-9892-65047a4cb225"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.309240 4778 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.309280 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.309326 4778 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.309342 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx57k\" (UniqueName: \"kubernetes.io/projected/c5a7a532-f8c2-4741-9892-65047a4cb225-kube-api-access-sx57k\") on node \"crc\" DevicePath \"\"" Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.309358 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c5a7a532-f8c2-4741-9892-65047a4cb225-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.319910 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/c5a7a532-f8c2-4741-9892-65047a4cb225-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.319927 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.319943 4778 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5a7a532-f8c2-4741-9892-65047a4cb225-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.334427 4778 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.422235 4778 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.681626 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-test" event={"ID":"c5a7a532-f8c2-4741-9892-65047a4cb225","Type":"ContainerDied","Data":"1ee8cf024ce4398d1f0ed48d786bbb6b3add9e2f95a7fd4bf27b0fad0caf4251"} Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.681708 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ee8cf024ce4398d1f0ed48d786bbb6b3add9e2f95a7fd4bf27b0fad0caf4251" Mar 18 10:52:07 crc kubenswrapper[4778]: I0318 10:52:07.681720 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-test" Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.093936 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 18 10:52:17 crc kubenswrapper[4778]: E0318 10:52:17.095079 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a7a532-f8c2-4741-9892-65047a4cb225" containerName="tempest-tests-tempest-tests-runner" Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.095100 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a7a532-f8c2-4741-9892-65047a4cb225" containerName="tempest-tests-tempest-tests-runner" Mar 18 10:52:17 crc kubenswrapper[4778]: E0318 10:52:17.095135 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="902826c1-406d-4d16-8655-4a85ff4a3205" containerName="oc" Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.095150 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="902826c1-406d-4d16-8655-4a85ff4a3205" containerName="oc" Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.095506 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="902826c1-406d-4d16-8655-4a85ff4a3205" containerName="oc" Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.095548 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5a7a532-f8c2-4741-9892-65047a4cb225" containerName="tempest-tests-tempest-tests-runner" Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.096512 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.099709 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-htxt6" Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.112690 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.187663 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:52:17 crc kubenswrapper[4778]: E0318 10:52:17.187966 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.266102 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-562p4\" (UniqueName: \"kubernetes.io/projected/fb176b71-d782-4b0d-963f-94acef50cf11-kube-api-access-562p4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fb176b71-d782-4b0d-963f-94acef50cf11\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.266159 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fb176b71-d782-4b0d-963f-94acef50cf11\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.368033 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-562p4\" (UniqueName: \"kubernetes.io/projected/fb176b71-d782-4b0d-963f-94acef50cf11-kube-api-access-562p4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fb176b71-d782-4b0d-963f-94acef50cf11\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.368327 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fb176b71-d782-4b0d-963f-94acef50cf11\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.368783 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fb176b71-d782-4b0d-963f-94acef50cf11\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.396758 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-562p4\" (UniqueName: \"kubernetes.io/projected/fb176b71-d782-4b0d-963f-94acef50cf11-kube-api-access-562p4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fb176b71-d782-4b0d-963f-94acef50cf11\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.404305 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fb176b71-d782-4b0d-963f-94acef50cf11\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.428698 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 10:52:17 crc kubenswrapper[4778]: I0318 10:52:17.891926 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 18 10:52:17 crc kubenswrapper[4778]: W0318 10:52:17.901223 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb176b71_d782_4b0d_963f_94acef50cf11.slice/crio-653d5a825531d8590579aa864537696a66078d42eb7ce52cda9b0eef0c268fd9 WatchSource:0}: Error finding container 653d5a825531d8590579aa864537696a66078d42eb7ce52cda9b0eef0c268fd9: Status 404 returned error can't find the container with id 653d5a825531d8590579aa864537696a66078d42eb7ce52cda9b0eef0c268fd9 Mar 18 10:52:18 crc kubenswrapper[4778]: I0318 10:52:18.797395 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"fb176b71-d782-4b0d-963f-94acef50cf11","Type":"ContainerStarted","Data":"653d5a825531d8590579aa864537696a66078d42eb7ce52cda9b0eef0c268fd9"} Mar 18 10:52:19 crc kubenswrapper[4778]: I0318 10:52:19.811583 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"fb176b71-d782-4b0d-963f-94acef50cf11","Type":"ContainerStarted","Data":"22db3cee620bf797616b27ba0cc8fb619ec2da9044209f42c698e36d77066088"} Mar 18 10:52:19 crc kubenswrapper[4778]: I0318 10:52:19.841878 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.912294358 podStartE2EDuration="2.841859875s" podCreationTimestamp="2026-03-18 10:52:17 +0000 UTC" firstStartedPulling="2026-03-18 10:52:17.904493526 +0000 UTC m=+6604.479238366" lastFinishedPulling="2026-03-18 10:52:18.834059043 +0000 UTC m=+6605.408803883" observedRunningTime="2026-03-18 10:52:19.831922956 +0000 UTC m=+6606.406667836" watchObservedRunningTime="2026-03-18 10:52:19.841859875 +0000 UTC m=+6606.416604715" Mar 18 10:52:31 crc kubenswrapper[4778]: I0318 10:52:31.187755 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:52:31 crc kubenswrapper[4778]: E0318 10:52:31.188932 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:52:37 crc kubenswrapper[4778]: I0318 10:52:37.155323 4778 scope.go:117] "RemoveContainer" containerID="a51630f7ff38c957b6d8be33f92679338164d3fd19d236304cf23699728f1e4b" Mar 18 10:52:38 crc kubenswrapper[4778]: I0318 10:52:38.893946 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tobiko-tests-tobiko-s00-podified-functional"] Mar 18 10:52:38 crc kubenswrapper[4778]: I0318 10:52:38.897050 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:38 crc kubenswrapper[4778]: I0318 10:52:38.901104 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tobiko-tests-tobiko-tobiko-config-0" Mar 18 10:52:38 crc kubenswrapper[4778]: I0318 10:52:38.901256 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"tobiko-secret" Mar 18 10:52:38 crc kubenswrapper[4778]: I0318 10:52:38.901689 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tobiko-tests-tobiko-tobiko-private-key-0" Mar 18 10:52:38 crc kubenswrapper[4778]: I0318 10:52:38.901748 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"test-operator-clouds-config" Mar 18 10:52:38 crc kubenswrapper[4778]: I0318 10:52:38.901759 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tobiko-tests-tobiko-tobiko-public-key-0" Mar 18 10:52:38 crc kubenswrapper[4778]: I0318 10:52:38.906922 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tobiko-tests-tobiko-s00-podified-functional"] Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.050017 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-public-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.050092 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxhtn\" (UniqueName: \"kubernetes.io/projected/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-kube-api-access-dxhtn\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " 
pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.050124 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-private-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.050181 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.050421 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.050464 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-kubeconfig\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.050539 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-ca-certs\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.050598 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.050692 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-openstack-config-secret\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.050900 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.050956 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: 
\"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.051039 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-ceph\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.152489 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-openstack-config-secret\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.152604 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.152641 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.152720 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceph\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-ceph\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.152764 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-public-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.152820 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxhtn\" (UniqueName: \"kubernetes.io/projected/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-kube-api-access-dxhtn\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.152864 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-private-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.152903 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: 
I0318 10:52:39.152939 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-kubeconfig\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.152962 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.152986 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-ca-certs\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.153019 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.153688 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " 
pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.153729 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-public-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.154666 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.154691 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.155773 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.156321 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-private-key\" (UniqueName: 
\"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-private-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.156626 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.159385 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-kubeconfig\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.159635 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-ceph\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.167051 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-ca-certs\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.169356 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-openstack-config-secret\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.171024 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxhtn\" (UniqueName: \"kubernetes.io/projected/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-kube-api-access-dxhtn\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.180813 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.262749 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:52:39 crc kubenswrapper[4778]: I0318 10:52:39.879042 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tobiko-tests-tobiko-s00-podified-functional"] Mar 18 10:52:40 crc kubenswrapper[4778]: I0318 10:52:40.052535 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" event={"ID":"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557","Type":"ContainerStarted","Data":"aebcfed3451b3e3d8b172a76c2bb743ffd3f47051e3238bd0316471042774306"} Mar 18 10:52:44 crc kubenswrapper[4778]: I0318 10:52:44.196078 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:52:44 crc kubenswrapper[4778]: E0318 10:52:44.197067 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.600644 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6s7qj"] Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.603457 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.613079 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6s7qj"] Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.675668 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3bf4465-218c-43ec-84d3-9881b5d329ea-utilities\") pod \"community-operators-6s7qj\" (UID: \"c3bf4465-218c-43ec-84d3-9881b5d329ea\") " pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.675736 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2m7j\" (UniqueName: \"kubernetes.io/projected/c3bf4465-218c-43ec-84d3-9881b5d329ea-kube-api-access-x2m7j\") pod \"community-operators-6s7qj\" (UID: \"c3bf4465-218c-43ec-84d3-9881b5d329ea\") " pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.675835 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3bf4465-218c-43ec-84d3-9881b5d329ea-catalog-content\") pod \"community-operators-6s7qj\" (UID: \"c3bf4465-218c-43ec-84d3-9881b5d329ea\") " pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.777705 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3bf4465-218c-43ec-84d3-9881b5d329ea-utilities\") pod \"community-operators-6s7qj\" (UID: \"c3bf4465-218c-43ec-84d3-9881b5d329ea\") " pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.777792 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-x2m7j\" (UniqueName: \"kubernetes.io/projected/c3bf4465-218c-43ec-84d3-9881b5d329ea-kube-api-access-x2m7j\") pod \"community-operators-6s7qj\" (UID: \"c3bf4465-218c-43ec-84d3-9881b5d329ea\") " pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.777819 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3bf4465-218c-43ec-84d3-9881b5d329ea-catalog-content\") pod \"community-operators-6s7qj\" (UID: \"c3bf4465-218c-43ec-84d3-9881b5d329ea\") " pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.778480 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3bf4465-218c-43ec-84d3-9881b5d329ea-catalog-content\") pod \"community-operators-6s7qj\" (UID: \"c3bf4465-218c-43ec-84d3-9881b5d329ea\") " pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.778746 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3bf4465-218c-43ec-84d3-9881b5d329ea-utilities\") pod \"community-operators-6s7qj\" (UID: \"c3bf4465-218c-43ec-84d3-9881b5d329ea\") " pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.799222 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2m7j\" (UniqueName: \"kubernetes.io/projected/c3bf4465-218c-43ec-84d3-9881b5d329ea-kube-api-access-x2m7j\") pod \"community-operators-6s7qj\" (UID: \"c3bf4465-218c-43ec-84d3-9881b5d329ea\") " pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:52:49 crc kubenswrapper[4778]: I0318 10:52:49.936427 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:52:54 crc kubenswrapper[4778]: I0318 10:52:54.165710 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6s7qj"] Mar 18 10:52:54 crc kubenswrapper[4778]: W0318 10:52:54.180567 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3bf4465_218c_43ec_84d3_9881b5d329ea.slice/crio-d13bbae561ed6205993db40c0a89f43d0f77fc3e9bdbd8f834db82cc8dcd4c54 WatchSource:0}: Error finding container d13bbae561ed6205993db40c0a89f43d0f77fc3e9bdbd8f834db82cc8dcd4c54: Status 404 returned error can't find the container with id d13bbae561ed6205993db40c0a89f43d0f77fc3e9bdbd8f834db82cc8dcd4c54 Mar 18 10:52:55 crc kubenswrapper[4778]: I0318 10:52:55.195852 4778 generic.go:334] "Generic (PLEG): container finished" podID="c3bf4465-218c-43ec-84d3-9881b5d329ea" containerID="7ec8c8089ffc91b932d845a3e0e7dfae94cfe46d5c0928340b6d5f0a55ec38eb" exitCode=0 Mar 18 10:52:55 crc kubenswrapper[4778]: I0318 10:52:55.197680 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s7qj" event={"ID":"c3bf4465-218c-43ec-84d3-9881b5d329ea","Type":"ContainerDied","Data":"7ec8c8089ffc91b932d845a3e0e7dfae94cfe46d5c0928340b6d5f0a55ec38eb"} Mar 18 10:52:55 crc kubenswrapper[4778]: I0318 10:52:55.197972 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s7qj" event={"ID":"c3bf4465-218c-43ec-84d3-9881b5d329ea","Type":"ContainerStarted","Data":"d13bbae561ed6205993db40c0a89f43d0f77fc3e9bdbd8f834db82cc8dcd4c54"} Mar 18 10:52:55 crc kubenswrapper[4778]: I0318 10:52:55.197988 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" 
event={"ID":"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557","Type":"ContainerStarted","Data":"ba5823dd6f9a4d25d340574d43a268718dced1548735d14a1482f6123dc8e01d"} Mar 18 10:52:55 crc kubenswrapper[4778]: I0318 10:52:55.254115 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" podStartSLOduration=4.3420062999999995 podStartE2EDuration="18.254093787s" podCreationTimestamp="2026-03-18 10:52:37 +0000 UTC" firstStartedPulling="2026-03-18 10:52:39.889090731 +0000 UTC m=+6626.463835591" lastFinishedPulling="2026-03-18 10:52:53.801178198 +0000 UTC m=+6640.375923078" observedRunningTime="2026-03-18 10:52:55.242600936 +0000 UTC m=+6641.817345826" watchObservedRunningTime="2026-03-18 10:52:55.254093787 +0000 UTC m=+6641.828838637" Mar 18 10:52:56 crc kubenswrapper[4778]: I0318 10:52:56.212997 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s7qj" event={"ID":"c3bf4465-218c-43ec-84d3-9881b5d329ea","Type":"ContainerStarted","Data":"3421c36aaa0e598f9c75efb09ca4e53ccc67889db6712d726161d5d919610908"} Mar 18 10:52:57 crc kubenswrapper[4778]: I0318 10:52:57.220765 4778 generic.go:334] "Generic (PLEG): container finished" podID="c3bf4465-218c-43ec-84d3-9881b5d329ea" containerID="3421c36aaa0e598f9c75efb09ca4e53ccc67889db6712d726161d5d919610908" exitCode=0 Mar 18 10:52:57 crc kubenswrapper[4778]: I0318 10:52:57.220814 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s7qj" event={"ID":"c3bf4465-218c-43ec-84d3-9881b5d329ea","Type":"ContainerDied","Data":"3421c36aaa0e598f9c75efb09ca4e53ccc67889db6712d726161d5d919610908"} Mar 18 10:52:58 crc kubenswrapper[4778]: I0318 10:52:58.187891 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:52:58 crc kubenswrapper[4778]: E0318 10:52:58.188516 4778 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:53:00 crc kubenswrapper[4778]: I0318 10:53:00.254324 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s7qj" event={"ID":"c3bf4465-218c-43ec-84d3-9881b5d329ea","Type":"ContainerStarted","Data":"e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff"} Mar 18 10:53:00 crc kubenswrapper[4778]: I0318 10:53:00.273019 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6s7qj" podStartSLOduration=7.133258259 podStartE2EDuration="11.273002993s" podCreationTimestamp="2026-03-18 10:52:49 +0000 UTC" firstStartedPulling="2026-03-18 10:52:55.19876377 +0000 UTC m=+6641.773508610" lastFinishedPulling="2026-03-18 10:52:59.338508504 +0000 UTC m=+6645.913253344" observedRunningTime="2026-03-18 10:53:00.270462245 +0000 UTC m=+6646.845207115" watchObservedRunningTime="2026-03-18 10:53:00.273002993 +0000 UTC m=+6646.847747833" Mar 18 10:53:09 crc kubenswrapper[4778]: I0318 10:53:09.963874 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:53:09 crc kubenswrapper[4778]: I0318 10:53:09.965088 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:53:10 crc kubenswrapper[4778]: I0318 10:53:10.038299 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:53:10 crc kubenswrapper[4778]: I0318 10:53:10.187046 4778 scope.go:117] 
"RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:53:10 crc kubenswrapper[4778]: E0318 10:53:10.187364 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:53:10 crc kubenswrapper[4778]: I0318 10:53:10.432769 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:53:10 crc kubenswrapper[4778]: I0318 10:53:10.500142 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6s7qj"] Mar 18 10:53:12 crc kubenswrapper[4778]: I0318 10:53:12.381284 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6s7qj" podUID="c3bf4465-218c-43ec-84d3-9881b5d329ea" containerName="registry-server" containerID="cri-o://e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff" gracePeriod=2 Mar 18 10:53:12 crc kubenswrapper[4778]: I0318 10:53:12.841056 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.031254 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m7j\" (UniqueName: \"kubernetes.io/projected/c3bf4465-218c-43ec-84d3-9881b5d329ea-kube-api-access-x2m7j\") pod \"c3bf4465-218c-43ec-84d3-9881b5d329ea\" (UID: \"c3bf4465-218c-43ec-84d3-9881b5d329ea\") " Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.031498 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3bf4465-218c-43ec-84d3-9881b5d329ea-catalog-content\") pod \"c3bf4465-218c-43ec-84d3-9881b5d329ea\" (UID: \"c3bf4465-218c-43ec-84d3-9881b5d329ea\") " Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.031535 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3bf4465-218c-43ec-84d3-9881b5d329ea-utilities\") pod \"c3bf4465-218c-43ec-84d3-9881b5d329ea\" (UID: \"c3bf4465-218c-43ec-84d3-9881b5d329ea\") " Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.032359 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3bf4465-218c-43ec-84d3-9881b5d329ea-utilities" (OuterVolumeSpecName: "utilities") pod "c3bf4465-218c-43ec-84d3-9881b5d329ea" (UID: "c3bf4465-218c-43ec-84d3-9881b5d329ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.045943 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3bf4465-218c-43ec-84d3-9881b5d329ea-kube-api-access-x2m7j" (OuterVolumeSpecName: "kube-api-access-x2m7j") pod "c3bf4465-218c-43ec-84d3-9881b5d329ea" (UID: "c3bf4465-218c-43ec-84d3-9881b5d329ea"). InnerVolumeSpecName "kube-api-access-x2m7j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.115861 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3bf4465-218c-43ec-84d3-9881b5d329ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3bf4465-218c-43ec-84d3-9881b5d329ea" (UID: "c3bf4465-218c-43ec-84d3-9881b5d329ea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.133942 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3bf4465-218c-43ec-84d3-9881b5d329ea-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.133978 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3bf4465-218c-43ec-84d3-9881b5d329ea-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.133989 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m7j\" (UniqueName: \"kubernetes.io/projected/c3bf4465-218c-43ec-84d3-9881b5d329ea-kube-api-access-x2m7j\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.396755 4778 generic.go:334] "Generic (PLEG): container finished" podID="c3bf4465-218c-43ec-84d3-9881b5d329ea" containerID="e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff" exitCode=0 Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.396825 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6s7qj" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.396851 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s7qj" event={"ID":"c3bf4465-218c-43ec-84d3-9881b5d329ea","Type":"ContainerDied","Data":"e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff"} Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.397439 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s7qj" event={"ID":"c3bf4465-218c-43ec-84d3-9881b5d329ea","Type":"ContainerDied","Data":"d13bbae561ed6205993db40c0a89f43d0f77fc3e9bdbd8f834db82cc8dcd4c54"} Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.397556 4778 scope.go:117] "RemoveContainer" containerID="e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.427972 4778 scope.go:117] "RemoveContainer" containerID="3421c36aaa0e598f9c75efb09ca4e53ccc67889db6712d726161d5d919610908" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.441530 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6s7qj"] Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.451645 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6s7qj"] Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.459164 4778 scope.go:117] "RemoveContainer" containerID="7ec8c8089ffc91b932d845a3e0e7dfae94cfe46d5c0928340b6d5f0a55ec38eb" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.511810 4778 scope.go:117] "RemoveContainer" containerID="e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff" Mar 18 10:53:13 crc kubenswrapper[4778]: E0318 10:53:13.512274 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff\": container with ID starting with e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff not found: ID does not exist" containerID="e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.512375 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff"} err="failed to get container status \"e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff\": rpc error: code = NotFound desc = could not find container \"e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff\": container with ID starting with e67b7a419b9e9edd0b697bbd7f29b359f734d401e7cc3b20d025dd429363e9ff not found: ID does not exist" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.512399 4778 scope.go:117] "RemoveContainer" containerID="3421c36aaa0e598f9c75efb09ca4e53ccc67889db6712d726161d5d919610908" Mar 18 10:53:13 crc kubenswrapper[4778]: E0318 10:53:13.512735 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3421c36aaa0e598f9c75efb09ca4e53ccc67889db6712d726161d5d919610908\": container with ID starting with 3421c36aaa0e598f9c75efb09ca4e53ccc67889db6712d726161d5d919610908 not found: ID does not exist" containerID="3421c36aaa0e598f9c75efb09ca4e53ccc67889db6712d726161d5d919610908" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.512780 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3421c36aaa0e598f9c75efb09ca4e53ccc67889db6712d726161d5d919610908"} err="failed to get container status \"3421c36aaa0e598f9c75efb09ca4e53ccc67889db6712d726161d5d919610908\": rpc error: code = NotFound desc = could not find container \"3421c36aaa0e598f9c75efb09ca4e53ccc67889db6712d726161d5d919610908\": container with ID 
starting with 3421c36aaa0e598f9c75efb09ca4e53ccc67889db6712d726161d5d919610908 not found: ID does not exist" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.512807 4778 scope.go:117] "RemoveContainer" containerID="7ec8c8089ffc91b932d845a3e0e7dfae94cfe46d5c0928340b6d5f0a55ec38eb" Mar 18 10:53:13 crc kubenswrapper[4778]: E0318 10:53:13.513283 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ec8c8089ffc91b932d845a3e0e7dfae94cfe46d5c0928340b6d5f0a55ec38eb\": container with ID starting with 7ec8c8089ffc91b932d845a3e0e7dfae94cfe46d5c0928340b6d5f0a55ec38eb not found: ID does not exist" containerID="7ec8c8089ffc91b932d845a3e0e7dfae94cfe46d5c0928340b6d5f0a55ec38eb" Mar 18 10:53:13 crc kubenswrapper[4778]: I0318 10:53:13.513308 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ec8c8089ffc91b932d845a3e0e7dfae94cfe46d5c0928340b6d5f0a55ec38eb"} err="failed to get container status \"7ec8c8089ffc91b932d845a3e0e7dfae94cfe46d5c0928340b6d5f0a55ec38eb\": rpc error: code = NotFound desc = could not find container \"7ec8c8089ffc91b932d845a3e0e7dfae94cfe46d5c0928340b6d5f0a55ec38eb\": container with ID starting with 7ec8c8089ffc91b932d845a3e0e7dfae94cfe46d5c0928340b6d5f0a55ec38eb not found: ID does not exist" Mar 18 10:53:14 crc kubenswrapper[4778]: I0318 10:53:14.210739 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3bf4465-218c-43ec-84d3-9881b5d329ea" path="/var/lib/kubelet/pods/c3bf4465-218c-43ec-84d3-9881b5d329ea/volumes" Mar 18 10:53:24 crc kubenswrapper[4778]: I0318 10:53:24.203311 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:53:24 crc kubenswrapper[4778]: E0318 10:53:24.204769 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:53:35 crc kubenswrapper[4778]: I0318 10:53:35.188104 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:53:35 crc kubenswrapper[4778]: E0318 10:53:35.189576 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:53:47 crc kubenswrapper[4778]: I0318 10:53:47.188671 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:53:47 crc kubenswrapper[4778]: E0318 10:53:47.189523 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:53:55 crc kubenswrapper[4778]: I0318 10:53:55.833632 4778 generic.go:334] "Generic (PLEG): container finished" podID="5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" containerID="ba5823dd6f9a4d25d340574d43a268718dced1548735d14a1482f6123dc8e01d" exitCode=0 Mar 18 10:53:55 crc kubenswrapper[4778]: I0318 10:53:55.834410 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tobiko-tests-tobiko-s00-podified-functional" event={"ID":"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557","Type":"ContainerDied","Data":"ba5823dd6f9a4d25d340574d43a268718dced1548735d14a1482f6123dc8e01d"} Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.307506 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.397143 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tobiko-tests-tobiko-s01-sanity"] Mar 18 10:53:57 crc kubenswrapper[4778]: E0318 10:53:57.397504 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" containerName="tobiko-tests-tobiko" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.397525 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" containerName="tobiko-tests-tobiko" Mar 18 10:53:57 crc kubenswrapper[4778]: E0318 10:53:57.397558 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3bf4465-218c-43ec-84d3-9881b5d329ea" containerName="registry-server" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.397566 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3bf4465-218c-43ec-84d3-9881b5d329ea" containerName="registry-server" Mar 18 10:53:57 crc kubenswrapper[4778]: E0318 10:53:57.397580 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3bf4465-218c-43ec-84d3-9881b5d329ea" containerName="extract-utilities" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.397588 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3bf4465-218c-43ec-84d3-9881b5d329ea" containerName="extract-utilities" Mar 18 10:53:57 crc kubenswrapper[4778]: E0318 10:53:57.397604 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3bf4465-218c-43ec-84d3-9881b5d329ea" containerName="extract-content" Mar 18 
10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.397610 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3bf4465-218c-43ec-84d3-9881b5d329ea" containerName="extract-content" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.397780 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" containerName="tobiko-tests-tobiko" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.397812 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3bf4465-218c-43ec-84d3-9881b5d329ea" containerName="registry-server" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.398584 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.401146 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tobiko-tests-tobiko-tobiko-config-1" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.402052 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tobiko-tests-tobiko-tobiko-private-key-1" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.403014 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tobiko-tests-tobiko-tobiko-public-key-1" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.412931 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tobiko-tests-tobiko-s01-sanity"] Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.489264 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-clouds-config\") pod \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.489380 4778 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-kubeconfig\") pod \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.489403 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-ceph\") pod \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.489436 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-ephemeral-workdir\") pod \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.489575 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-ca-certs\") pod \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.489608 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-public-key\") pod \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.489640 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " Mar 
18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.489665 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-config\") pod \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.489705 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-ephemeral-temporary\") pod \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.489742 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-private-key\") pod \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.489771 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxhtn\" (UniqueName: \"kubernetes.io/projected/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-kube-api-access-dxhtn\") pod \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.489872 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-openstack-config-secret\") pod \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\" (UID: \"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557\") " Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.491034 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" (UID: "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.494434 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-ceph" (OuterVolumeSpecName: "ceph") pod "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" (UID: "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.494852 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-kube-api-access-dxhtn" (OuterVolumeSpecName: "kube-api-access-dxhtn") pod "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" (UID: "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557"). InnerVolumeSpecName "kube-api-access-dxhtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.499264 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "test-operator-logs") pod "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" (UID: "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.515944 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-config" (OuterVolumeSpecName: "tobiko-config") pod "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" (UID: "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557"). 
InnerVolumeSpecName "tobiko-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.517893 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" (UID: "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.524213 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-private-key" (OuterVolumeSpecName: "tobiko-private-key") pod "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" (UID: "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557"). InnerVolumeSpecName "tobiko-private-key". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.532445 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-kubeconfig" (OuterVolumeSpecName: "kubeconfig") pod "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" (UID: "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557"). InnerVolumeSpecName "kubeconfig". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.542872 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-public-key" (OuterVolumeSpecName: "tobiko-public-key") pod "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" (UID: "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557"). InnerVolumeSpecName "tobiko-public-key". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.558988 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" (UID: "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.574813 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-clouds-config" (OuterVolumeSpecName: "test-operator-clouds-config") pod "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" (UID: "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557"). InnerVolumeSpecName "test-operator-clouds-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592446 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592491 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-ca-certs\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592513 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" 
(UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-openstack-config-secret\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592531 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592623 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592662 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592685 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592706 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-public-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592757 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-private-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592781 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-ceph\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592812 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swmpg\" (UniqueName: \"kubernetes.io/projected/bd565818-8912-47ba-881f-f88011fa9b46-kube-api-access-swmpg\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592844 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-kubeconfig\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc 
kubenswrapper[4778]: I0318 10:53:57.592917 4778 reconciler_common.go:293] "Volume detached for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-kubeconfig\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.592993 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.593031 4778 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.593046 4778 reconciler_common.go:293] "Volume detached for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-public-key\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.593057 4778 reconciler_common.go:293] "Volume detached for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.593068 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.593078 4778 reconciler_common.go:293] "Volume detached for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-tobiko-private-key\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.593088 4778 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-dxhtn\" (UniqueName: \"kubernetes.io/projected/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-kube-api-access-dxhtn\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.593105 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.593115 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-clouds-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.627080 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.695009 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.695354 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-ca-certs\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.695474 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.695481 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-openstack-config-secret\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.695524 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.695578 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.695624 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc 
kubenswrapper[4778]: I0318 10:53:57.695642 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-public-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.695676 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-private-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.695705 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-ceph\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.695742 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swmpg\" (UniqueName: \"kubernetes.io/projected/bd565818-8912-47ba-881f-f88011fa9b46-kube-api-access-swmpg\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.695766 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-kubeconfig\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 
10:53:57.697060 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-public-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.697726 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.698301 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-private-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.698621 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.699008 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: 
I0318 10:53:57.699490 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-kubeconfig\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.699876 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-openstack-config-secret\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.700859 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-ceph\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.701081 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-ca-certs\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.712744 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swmpg\" (UniqueName: \"kubernetes.io/projected/bd565818-8912-47ba-881f-f88011fa9b46-kube-api-access-swmpg\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.723030 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.857176 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" event={"ID":"5c0d8cb1-d7bc-4694-ac54-e0a9f8312557","Type":"ContainerDied","Data":"aebcfed3451b3e3d8b172a76c2bb743ffd3f47051e3238bd0316471042774306"} Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.857598 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aebcfed3451b3e3d8b172a76c2bb743ffd3f47051e3238bd0316471042774306" Mar 18 10:53:57 crc kubenswrapper[4778]: I0318 10:53:57.857652 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Mar 18 10:53:58 crc kubenswrapper[4778]: I0318 10:53:58.190408 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:53:58 crc kubenswrapper[4778]: E0318 10:53:58.190733 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:53:58 crc kubenswrapper[4778]: I0318 10:53:58.249525 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tobiko-tests-tobiko-s01-sanity"] Mar 18 10:53:58 crc kubenswrapper[4778]: I0318 10:53:58.825248 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557" 
(UID: "5c0d8cb1-d7bc-4694-ac54-e0a9f8312557"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:53:58 crc kubenswrapper[4778]: I0318 10:53:58.826000 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5c0d8cb1-d7bc-4694-ac54-e0a9f8312557-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:58 crc kubenswrapper[4778]: I0318 10:53:58.868737 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s01-sanity" event={"ID":"bd565818-8912-47ba-881f-f88011fa9b46","Type":"ContainerStarted","Data":"9010a2462d1e9f7fe8f1670549c3a224043e894c401167ef3df32c9556413256"} Mar 18 10:53:59 crc kubenswrapper[4778]: I0318 10:53:59.880607 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s01-sanity" event={"ID":"bd565818-8912-47ba-881f-f88011fa9b46","Type":"ContainerStarted","Data":"878a250da8e9fd68a2017bd74707da9dc4870b9273766f35be6449c7f483e262"} Mar 18 10:53:59 crc kubenswrapper[4778]: I0318 10:53:59.910767 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tobiko-tests-tobiko-s01-sanity" podStartSLOduration=2.910746937 podStartE2EDuration="2.910746937s" podCreationTimestamp="2026-03-18 10:53:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:53:59.901457066 +0000 UTC m=+6706.476201916" watchObservedRunningTime="2026-03-18 10:53:59.910746937 +0000 UTC m=+6706.485491787" Mar 18 10:54:00 crc kubenswrapper[4778]: I0318 10:54:00.132936 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563854-qqn9z"] Mar 18 10:54:00 crc kubenswrapper[4778]: I0318 10:54:00.134465 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563854-qqn9z" Mar 18 10:54:00 crc kubenswrapper[4778]: I0318 10:54:00.138578 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:54:00 crc kubenswrapper[4778]: I0318 10:54:00.138750 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:54:00 crc kubenswrapper[4778]: I0318 10:54:00.139383 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:54:00 crc kubenswrapper[4778]: I0318 10:54:00.154679 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563854-qqn9z"] Mar 18 10:54:00 crc kubenswrapper[4778]: I0318 10:54:00.257694 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqwq8\" (UniqueName: \"kubernetes.io/projected/f6103ea7-c41a-40d2-ae16-15f066c955b9-kube-api-access-zqwq8\") pod \"auto-csr-approver-29563854-qqn9z\" (UID: \"f6103ea7-c41a-40d2-ae16-15f066c955b9\") " pod="openshift-infra/auto-csr-approver-29563854-qqn9z" Mar 18 10:54:00 crc kubenswrapper[4778]: I0318 10:54:00.359829 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqwq8\" (UniqueName: \"kubernetes.io/projected/f6103ea7-c41a-40d2-ae16-15f066c955b9-kube-api-access-zqwq8\") pod \"auto-csr-approver-29563854-qqn9z\" (UID: \"f6103ea7-c41a-40d2-ae16-15f066c955b9\") " pod="openshift-infra/auto-csr-approver-29563854-qqn9z" Mar 18 10:54:00 crc kubenswrapper[4778]: I0318 10:54:00.393141 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqwq8\" (UniqueName: \"kubernetes.io/projected/f6103ea7-c41a-40d2-ae16-15f066c955b9-kube-api-access-zqwq8\") pod \"auto-csr-approver-29563854-qqn9z\" (UID: \"f6103ea7-c41a-40d2-ae16-15f066c955b9\") " 
pod="openshift-infra/auto-csr-approver-29563854-qqn9z" Mar 18 10:54:00 crc kubenswrapper[4778]: I0318 10:54:00.452715 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563854-qqn9z" Mar 18 10:54:00 crc kubenswrapper[4778]: I0318 10:54:00.955649 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563854-qqn9z"] Mar 18 10:54:00 crc kubenswrapper[4778]: W0318 10:54:00.960289 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6103ea7_c41a_40d2_ae16_15f066c955b9.slice/crio-785ee8cb881253f21a70135f3063962bfdc3def3aaa5ad76eb532fe55d73a987 WatchSource:0}: Error finding container 785ee8cb881253f21a70135f3063962bfdc3def3aaa5ad76eb532fe55d73a987: Status 404 returned error can't find the container with id 785ee8cb881253f21a70135f3063962bfdc3def3aaa5ad76eb532fe55d73a987 Mar 18 10:54:01 crc kubenswrapper[4778]: I0318 10:54:01.899177 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563854-qqn9z" event={"ID":"f6103ea7-c41a-40d2-ae16-15f066c955b9","Type":"ContainerStarted","Data":"785ee8cb881253f21a70135f3063962bfdc3def3aaa5ad76eb532fe55d73a987"} Mar 18 10:54:02 crc kubenswrapper[4778]: I0318 10:54:02.913436 4778 generic.go:334] "Generic (PLEG): container finished" podID="f6103ea7-c41a-40d2-ae16-15f066c955b9" containerID="20d0876a2852471421fd6830f32cec9b8955b6abdc480edeb7e2a46c81a72c97" exitCode=0 Mar 18 10:54:02 crc kubenswrapper[4778]: I0318 10:54:02.913514 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563854-qqn9z" event={"ID":"f6103ea7-c41a-40d2-ae16-15f066c955b9","Type":"ContainerDied","Data":"20d0876a2852471421fd6830f32cec9b8955b6abdc480edeb7e2a46c81a72c97"} Mar 18 10:54:04 crc kubenswrapper[4778]: I0318 10:54:04.306869 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563854-qqn9z" Mar 18 10:54:04 crc kubenswrapper[4778]: I0318 10:54:04.447557 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqwq8\" (UniqueName: \"kubernetes.io/projected/f6103ea7-c41a-40d2-ae16-15f066c955b9-kube-api-access-zqwq8\") pod \"f6103ea7-c41a-40d2-ae16-15f066c955b9\" (UID: \"f6103ea7-c41a-40d2-ae16-15f066c955b9\") " Mar 18 10:54:04 crc kubenswrapper[4778]: I0318 10:54:04.453740 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6103ea7-c41a-40d2-ae16-15f066c955b9-kube-api-access-zqwq8" (OuterVolumeSpecName: "kube-api-access-zqwq8") pod "f6103ea7-c41a-40d2-ae16-15f066c955b9" (UID: "f6103ea7-c41a-40d2-ae16-15f066c955b9"). InnerVolumeSpecName "kube-api-access-zqwq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:54:04 crc kubenswrapper[4778]: I0318 10:54:04.549823 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqwq8\" (UniqueName: \"kubernetes.io/projected/f6103ea7-c41a-40d2-ae16-15f066c955b9-kube-api-access-zqwq8\") on node \"crc\" DevicePath \"\"" Mar 18 10:54:04 crc kubenswrapper[4778]: I0318 10:54:04.951483 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563854-qqn9z" event={"ID":"f6103ea7-c41a-40d2-ae16-15f066c955b9","Type":"ContainerDied","Data":"785ee8cb881253f21a70135f3063962bfdc3def3aaa5ad76eb532fe55d73a987"} Mar 18 10:54:04 crc kubenswrapper[4778]: I0318 10:54:04.951546 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563854-qqn9z" Mar 18 10:54:04 crc kubenswrapper[4778]: I0318 10:54:04.951561 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="785ee8cb881253f21a70135f3063962bfdc3def3aaa5ad76eb532fe55d73a987" Mar 18 10:54:05 crc kubenswrapper[4778]: I0318 10:54:05.414034 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563848-f28cg"] Mar 18 10:54:05 crc kubenswrapper[4778]: I0318 10:54:05.421673 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563848-f28cg"] Mar 18 10:54:06 crc kubenswrapper[4778]: I0318 10:54:06.204262 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49327474-2bad-4ebc-b955-bf9dd1268c5e" path="/var/lib/kubelet/pods/49327474-2bad-4ebc-b955-bf9dd1268c5e/volumes" Mar 18 10:54:11 crc kubenswrapper[4778]: I0318 10:54:11.187916 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:54:11 crc kubenswrapper[4778]: E0318 10:54:11.188592 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:54:23 crc kubenswrapper[4778]: I0318 10:54:23.187365 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:54:23 crc kubenswrapper[4778]: E0318 10:54:23.188717 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:54:36 crc kubenswrapper[4778]: I0318 10:54:36.186748 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:54:36 crc kubenswrapper[4778]: E0318 10:54:36.187441 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:54:37 crc kubenswrapper[4778]: I0318 10:54:37.303776 4778 scope.go:117] "RemoveContainer" containerID="f8a6db8312e875033fdd1f73d7983e85a274f3fbde6864a2faaec123c194e5c8" Mar 18 10:54:47 crc kubenswrapper[4778]: I0318 10:54:47.187963 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:54:47 crc kubenswrapper[4778]: E0318 10:54:47.189212 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:55:01 crc kubenswrapper[4778]: I0318 10:55:01.187689 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:55:01 crc kubenswrapper[4778]: 
E0318 10:55:01.188763 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:55:16 crc kubenswrapper[4778]: I0318 10:55:16.188009 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:55:16 crc kubenswrapper[4778]: E0318 10:55:16.188923 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 10:55:19 crc kubenswrapper[4778]: I0318 10:55:19.758113 4778 generic.go:334] "Generic (PLEG): container finished" podID="bd565818-8912-47ba-881f-f88011fa9b46" containerID="878a250da8e9fd68a2017bd74707da9dc4870b9273766f35be6449c7f483e262" exitCode=0 Mar 18 10:55:19 crc kubenswrapper[4778]: I0318 10:55:19.758164 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s01-sanity" event={"ID":"bd565818-8912-47ba-881f-f88011fa9b46","Type":"ContainerDied","Data":"878a250da8e9fd68a2017bd74707da9dc4870b9273766f35be6449c7f483e262"} Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.277354 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.478940 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"bd565818-8912-47ba-881f-f88011fa9b46\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.479099 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swmpg\" (UniqueName: \"kubernetes.io/projected/bd565818-8912-47ba-881f-f88011fa9b46-kube-api-access-swmpg\") pod \"bd565818-8912-47ba-881f-f88011fa9b46\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.479160 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-config\") pod \"bd565818-8912-47ba-881f-f88011fa9b46\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.479250 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-ephemeral-workdir\") pod \"bd565818-8912-47ba-881f-f88011fa9b46\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.479303 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-kubeconfig\") pod \"bd565818-8912-47ba-881f-f88011fa9b46\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.479416 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-private-key\") pod \"bd565818-8912-47ba-881f-f88011fa9b46\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.479524 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-ca-certs\") pod \"bd565818-8912-47ba-881f-f88011fa9b46\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.479682 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-clouds-config\") pod \"bd565818-8912-47ba-881f-f88011fa9b46\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.479848 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-ephemeral-temporary\") pod \"bd565818-8912-47ba-881f-f88011fa9b46\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.479934 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-openstack-config-secret\") pod \"bd565818-8912-47ba-881f-f88011fa9b46\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.479991 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-public-key\") pod 
\"bd565818-8912-47ba-881f-f88011fa9b46\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.480037 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-ceph\") pod \"bd565818-8912-47ba-881f-f88011fa9b46\" (UID: \"bd565818-8912-47ba-881f-f88011fa9b46\") " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.484260 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "bd565818-8912-47ba-881f-f88011fa9b46" (UID: "bd565818-8912-47ba-881f-f88011fa9b46"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.485771 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "test-operator-logs") pod "bd565818-8912-47ba-881f-f88011fa9b46" (UID: "bd565818-8912-47ba-881f-f88011fa9b46"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.487361 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd565818-8912-47ba-881f-f88011fa9b46-kube-api-access-swmpg" (OuterVolumeSpecName: "kube-api-access-swmpg") pod "bd565818-8912-47ba-881f-f88011fa9b46" (UID: "bd565818-8912-47ba-881f-f88011fa9b46"). InnerVolumeSpecName "kube-api-access-swmpg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.489577 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-ceph" (OuterVolumeSpecName: "ceph") pod "bd565818-8912-47ba-881f-f88011fa9b46" (UID: "bd565818-8912-47ba-881f-f88011fa9b46"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.513574 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-config" (OuterVolumeSpecName: "tobiko-config") pod "bd565818-8912-47ba-881f-f88011fa9b46" (UID: "bd565818-8912-47ba-881f-f88011fa9b46"). InnerVolumeSpecName "tobiko-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.533293 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-public-key" (OuterVolumeSpecName: "tobiko-public-key") pod "bd565818-8912-47ba-881f-f88011fa9b46" (UID: "bd565818-8912-47ba-881f-f88011fa9b46"). InnerVolumeSpecName "tobiko-public-key". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.538895 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-private-key" (OuterVolumeSpecName: "tobiko-private-key") pod "bd565818-8912-47ba-881f-f88011fa9b46" (UID: "bd565818-8912-47ba-881f-f88011fa9b46"). InnerVolumeSpecName "tobiko-private-key". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.541930 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-kubeconfig" (OuterVolumeSpecName: "kubeconfig") pod "bd565818-8912-47ba-881f-f88011fa9b46" (UID: "bd565818-8912-47ba-881f-f88011fa9b46"). InnerVolumeSpecName "kubeconfig". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.550444 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "bd565818-8912-47ba-881f-f88011fa9b46" (UID: "bd565818-8912-47ba-881f-f88011fa9b46"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.551414 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "bd565818-8912-47ba-881f-f88011fa9b46" (UID: "bd565818-8912-47ba-881f-f88011fa9b46"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.584081 4778 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.584131 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swmpg\" (UniqueName: \"kubernetes.io/projected/bd565818-8912-47ba-881f-f88011fa9b46-kube-api-access-swmpg\") on node \"crc\" DevicePath \"\"" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.584150 4778 reconciler_common.go:293] "Volume detached for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.584166 4778 reconciler_common.go:293] "Volume detached for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-kubeconfig\") on node \"crc\" DevicePath \"\"" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.584180 4778 reconciler_common.go:293] "Volume detached for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-private-key\") on node \"crc\" DevicePath \"\"" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.584194 4778 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.584241 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 
10:55:21.584261 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.584279 4778 reconciler_common.go:293] "Volume detached for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-tobiko-public-key\") on node \"crc\" DevicePath \"\"" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.584296 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bd565818-8912-47ba-881f-f88011fa9b46-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.591317 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-clouds-config" (OuterVolumeSpecName: "test-operator-clouds-config") pod "bd565818-8912-47ba-881f-f88011fa9b46" (UID: "bd565818-8912-47ba-881f-f88011fa9b46"). InnerVolumeSpecName "test-operator-clouds-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.627275 4778 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.687114 4778 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.687193 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-clouds-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.780377 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s01-sanity" event={"ID":"bd565818-8912-47ba-881f-f88011fa9b46","Type":"ContainerDied","Data":"9010a2462d1e9f7fe8f1670549c3a224043e894c401167ef3df32c9556413256"} Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.780432 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9010a2462d1e9f7fe8f1670549c3a224043e894c401167ef3df32c9556413256" Mar 18 10:55:21 crc kubenswrapper[4778]: I0318 10:55:21.780464 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tobiko-tests-tobiko-s01-sanity" Mar 18 10:55:22 crc kubenswrapper[4778]: I0318 10:55:22.882012 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "bd565818-8912-47ba-881f-f88011fa9b46" (UID: "bd565818-8912-47ba-881f-f88011fa9b46"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:55:22 crc kubenswrapper[4778]: I0318 10:55:22.914737 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bd565818-8912-47ba-881f-f88011fa9b46-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 18 10:55:31 crc kubenswrapper[4778]: I0318 10:55:31.187805 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:55:31 crc kubenswrapper[4778]: I0318 10:55:31.743003 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko"] Mar 18 10:55:31 crc kubenswrapper[4778]: E0318 10:55:31.744155 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6103ea7-c41a-40d2-ae16-15f066c955b9" containerName="oc" Mar 18 10:55:31 crc kubenswrapper[4778]: I0318 10:55:31.744168 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6103ea7-c41a-40d2-ae16-15f066c955b9" containerName="oc" Mar 18 10:55:31 crc kubenswrapper[4778]: E0318 10:55:31.744228 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd565818-8912-47ba-881f-f88011fa9b46" containerName="tobiko-tests-tobiko" Mar 18 10:55:31 crc kubenswrapper[4778]: I0318 10:55:31.744236 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd565818-8912-47ba-881f-f88011fa9b46" containerName="tobiko-tests-tobiko" Mar 18 10:55:31 crc kubenswrapper[4778]: I0318 10:55:31.744454 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6103ea7-c41a-40d2-ae16-15f066c955b9" containerName="oc" Mar 18 10:55:31 crc kubenswrapper[4778]: I0318 10:55:31.744468 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd565818-8912-47ba-881f-f88011fa9b46" containerName="tobiko-tests-tobiko" Mar 18 10:55:31 crc kubenswrapper[4778]: I0318 10:55:31.745266 4778 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Mar 18 10:55:31 crc kubenswrapper[4778]: I0318 10:55:31.756420 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko"] Mar 18 10:55:31 crc kubenswrapper[4778]: I0318 10:55:31.900015 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"3a30fa73502544b7ae8a345c4023efae179b70a66379eb3e45aa01314f418265"} Mar 18 10:55:31 crc kubenswrapper[4778]: I0318 10:55:31.935032 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfch6\" (UniqueName: \"kubernetes.io/projected/4e028d5e-666c-497c-949e-97860410ad74-kube-api-access-pfch6\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"4e028d5e-666c-497c-949e-97860410ad74\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Mar 18 10:55:31 crc kubenswrapper[4778]: I0318 10:55:31.935111 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"4e028d5e-666c-497c-949e-97860410ad74\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Mar 18 10:55:32 crc kubenswrapper[4778]: I0318 10:55:32.037503 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfch6\" (UniqueName: \"kubernetes.io/projected/4e028d5e-666c-497c-949e-97860410ad74-kube-api-access-pfch6\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"4e028d5e-666c-497c-949e-97860410ad74\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Mar 18 10:55:32 crc kubenswrapper[4778]: 
I0318 10:55:32.037605 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"4e028d5e-666c-497c-949e-97860410ad74\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Mar 18 10:55:32 crc kubenswrapper[4778]: I0318 10:55:32.039259 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"4e028d5e-666c-497c-949e-97860410ad74\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Mar 18 10:55:32 crc kubenswrapper[4778]: I0318 10:55:32.069948 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfch6\" (UniqueName: \"kubernetes.io/projected/4e028d5e-666c-497c-949e-97860410ad74-kube-api-access-pfch6\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"4e028d5e-666c-497c-949e-97860410ad74\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Mar 18 10:55:32 crc kubenswrapper[4778]: I0318 10:55:32.077477 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"4e028d5e-666c-497c-949e-97860410ad74\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Mar 18 10:55:32 crc kubenswrapper[4778]: I0318 10:55:32.117389 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Mar 18 10:55:32 crc kubenswrapper[4778]: I0318 10:55:32.602525 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko"] Mar 18 10:55:32 crc kubenswrapper[4778]: I0318 10:55:32.604989 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 10:55:32 crc kubenswrapper[4778]: I0318 10:55:32.915781 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" event={"ID":"4e028d5e-666c-497c-949e-97860410ad74","Type":"ContainerStarted","Data":"e74038fb5b8acc694c4975a889116cd0644a3950ac18ae2c472be442026ead89"} Mar 18 10:55:33 crc kubenswrapper[4778]: I0318 10:55:33.927512 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" event={"ID":"4e028d5e-666c-497c-949e-97860410ad74","Type":"ContainerStarted","Data":"a738e69a3512731c93e7c4933c55a03b89c1c13e6428ba10eec5561e907a7643"} Mar 18 10:55:33 crc kubenswrapper[4778]: I0318 10:55:33.952334 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" podStartSLOduration=2.514090416 podStartE2EDuration="2.952294575s" podCreationTimestamp="2026-03-18 10:55:31 +0000 UTC" firstStartedPulling="2026-03-18 10:55:32.60463772 +0000 UTC m=+6799.179382580" lastFinishedPulling="2026-03-18 10:55:33.042841879 +0000 UTC m=+6799.617586739" observedRunningTime="2026-03-18 10:55:33.946789176 +0000 UTC m=+6800.521534056" watchObservedRunningTime="2026-03-18 10:55:33.952294575 +0000 UTC m=+6800.527039505" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.657051 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ansibletest-ansibletest"] Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.663016 4778 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.669636 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.669892 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.674051 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ansibletest-ansibletest"] Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.773071 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-test-operator-ephemeral-workdir\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.773115 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-workload-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.773149 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-ca-certs\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.773229 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-test-operator-ephemeral-temporary\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.773272 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j742h\" (UniqueName: \"kubernetes.io/projected/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-kube-api-access-j742h\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.773309 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-openstack-config\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.773416 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-openstack-config-secret\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.773444 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-ceph\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.773488 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-compute-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.773534 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.875162 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-compute-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.875261 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.875312 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-test-operator-ephemeral-workdir\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.875349 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-workload-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.875379 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-ca-certs\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.875414 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-test-operator-ephemeral-temporary\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.875444 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j742h\" (UniqueName: \"kubernetes.io/projected/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-kube-api-access-j742h\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.875486 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-openstack-config\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.875509 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-openstack-config-secret\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.875535 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-ceph\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.876153 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-test-operator-ephemeral-temporary\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.876570 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-test-operator-ephemeral-workdir\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.877379 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.877411 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-openstack-config\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.890344 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-workload-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.890419 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-ceph\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.891059 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-ca-certs\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.896787 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-openstack-config-secret\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.904879 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-compute-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: 
\"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.907754 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j742h\" (UniqueName: \"kubernetes.io/projected/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-kube-api-access-j742h\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.917701 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ansibletest-ansibletest\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " pod="openstack/ansibletest-ansibletest" Mar 18 10:55:45 crc kubenswrapper[4778]: I0318 10:55:45.980076 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ansibletest-ansibletest" Mar 18 10:55:46 crc kubenswrapper[4778]: I0318 10:55:46.457476 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ansibletest-ansibletest"] Mar 18 10:55:47 crc kubenswrapper[4778]: I0318 10:55:47.113931 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ansibletest-ansibletest" event={"ID":"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9","Type":"ContainerStarted","Data":"ec28f4abe6951258d35c7175f6d4f29db741687b55b7cbd44e65672620fd1045"} Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.186910 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5jcs8"] Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.190086 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.244336 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jcs8"] Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.338377 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b921080-6bfb-4a4d-b453-d5e2370a7558-catalog-content\") pod \"redhat-marketplace-5jcs8\" (UID: \"2b921080-6bfb-4a4d-b453-d5e2370a7558\") " pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.338507 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mmnl\" (UniqueName: \"kubernetes.io/projected/2b921080-6bfb-4a4d-b453-d5e2370a7558-kube-api-access-9mmnl\") pod \"redhat-marketplace-5jcs8\" (UID: \"2b921080-6bfb-4a4d-b453-d5e2370a7558\") " pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.338614 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b921080-6bfb-4a4d-b453-d5e2370a7558-utilities\") pod \"redhat-marketplace-5jcs8\" (UID: \"2b921080-6bfb-4a4d-b453-d5e2370a7558\") " pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.440346 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b921080-6bfb-4a4d-b453-d5e2370a7558-utilities\") pod \"redhat-marketplace-5jcs8\" (UID: \"2b921080-6bfb-4a4d-b453-d5e2370a7558\") " pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.440517 4778 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b921080-6bfb-4a4d-b453-d5e2370a7558-catalog-content\") pod \"redhat-marketplace-5jcs8\" (UID: \"2b921080-6bfb-4a4d-b453-d5e2370a7558\") " pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.440709 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mmnl\" (UniqueName: \"kubernetes.io/projected/2b921080-6bfb-4a4d-b453-d5e2370a7558-kube-api-access-9mmnl\") pod \"redhat-marketplace-5jcs8\" (UID: \"2b921080-6bfb-4a4d-b453-d5e2370a7558\") " pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.440831 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b921080-6bfb-4a4d-b453-d5e2370a7558-utilities\") pod \"redhat-marketplace-5jcs8\" (UID: \"2b921080-6bfb-4a4d-b453-d5e2370a7558\") " pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.440959 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b921080-6bfb-4a4d-b453-d5e2370a7558-catalog-content\") pod \"redhat-marketplace-5jcs8\" (UID: \"2b921080-6bfb-4a4d-b453-d5e2370a7558\") " pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.464176 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mmnl\" (UniqueName: \"kubernetes.io/projected/2b921080-6bfb-4a4d-b453-d5e2370a7558-kube-api-access-9mmnl\") pod \"redhat-marketplace-5jcs8\" (UID: \"2b921080-6bfb-4a4d-b453-d5e2370a7558\") " pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:55:52 crc kubenswrapper[4778]: I0318 10:55:52.529846 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:56:00 crc kubenswrapper[4778]: I0318 10:56:00.146760 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563856-kvmq4"] Mar 18 10:56:00 crc kubenswrapper[4778]: I0318 10:56:00.148872 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563856-kvmq4" Mar 18 10:56:00 crc kubenswrapper[4778]: I0318 10:56:00.151220 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:56:00 crc kubenswrapper[4778]: I0318 10:56:00.151224 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:56:00 crc kubenswrapper[4778]: I0318 10:56:00.151766 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:56:00 crc kubenswrapper[4778]: I0318 10:56:00.159362 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563856-kvmq4"] Mar 18 10:56:00 crc kubenswrapper[4778]: I0318 10:56:00.322214 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94vhq\" (UniqueName: \"kubernetes.io/projected/0b69a324-153a-4262-92ea-62c8b9d5928e-kube-api-access-94vhq\") pod \"auto-csr-approver-29563856-kvmq4\" (UID: \"0b69a324-153a-4262-92ea-62c8b9d5928e\") " pod="openshift-infra/auto-csr-approver-29563856-kvmq4" Mar 18 10:56:00 crc kubenswrapper[4778]: I0318 10:56:00.424075 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94vhq\" (UniqueName: \"kubernetes.io/projected/0b69a324-153a-4262-92ea-62c8b9d5928e-kube-api-access-94vhq\") pod \"auto-csr-approver-29563856-kvmq4\" (UID: \"0b69a324-153a-4262-92ea-62c8b9d5928e\") " pod="openshift-infra/auto-csr-approver-29563856-kvmq4" 
Mar 18 10:56:00 crc kubenswrapper[4778]: I0318 10:56:00.446119 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94vhq\" (UniqueName: \"kubernetes.io/projected/0b69a324-153a-4262-92ea-62c8b9d5928e-kube-api-access-94vhq\") pod \"auto-csr-approver-29563856-kvmq4\" (UID: \"0b69a324-153a-4262-92ea-62c8b9d5928e\") " pod="openshift-infra/auto-csr-approver-29563856-kvmq4" Mar 18 10:56:00 crc kubenswrapper[4778]: I0318 10:56:00.475367 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563856-kvmq4" Mar 18 10:56:03 crc kubenswrapper[4778]: E0318 10:56:03.247811 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ansible-tests:current-podified" Mar 18 10:56:03 crc kubenswrapper[4778]: E0318 10:56:03.248225 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 10:56:03 crc kubenswrapper[4778]: container &Container{Name:ansibletest-ansibletest,Image:quay.io/podified-antelope-centos9/openstack-ansible-tests:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:POD_ANSIBLE_EXTRA_VARS,Value:-e manual_run=false,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_FILE_EXTRA_VARS,Value:--- Mar 18 10:56:03 crc kubenswrapper[4778]: foo: bar Mar 18 10:56:03 crc kubenswrapper[4778]: ,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_GIT_BRANCH,Value:,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_GIT_REPO,Value:https://github.com/ansible/test-playbooks,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_INVENTORY,Value:localhost ansible_connection=local ansible_python_interpreter=python3 Mar 18 10:56:03 crc kubenswrapper[4778]: 
,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_PLAYBOOK,Value:./debug.yml,ValueFrom:nil,},EnvVar{Name:POD_DEBUG,Value:false,ValueFrom:nil,},EnvVar{Name:POD_INSTALL_COLLECTIONS,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{4 0} {} 4 DecimalSI},memory: {{4294967296 0} {} 4Gi BinarySI},},Requests:ResourceList{cpu: {{2 0} {} 2 DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/ansible,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/AnsibleTests/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/ansible/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/var/lib/ansible/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ca-bundle.trust.crt,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:compute-ssh-secret,ReadOnly:true,MountPath:/var/lib/ansible/.ssh/compute_id,SubPath:ssh-privatekey,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil
,},VolumeMount{Name:workload-ssh-secret,ReadOnly:true,MountPath:/var/lib/ansible/test_keypair.key,SubPath:ssh-privatekey,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ceph,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j742h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN NET_RAW],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*227,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*227,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ansibletest-ansibletest_openstack(1fb58f5e-1c8b-45e2-bf86-b81af58b66a9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Mar 18 10:56:03 crc kubenswrapper[4778]: > logger="UnhandledError" Mar 18 10:56:03 crc kubenswrapper[4778]: E0318 10:56:03.249659 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ansibletest-ansibletest\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ansibletest-ansibletest" podUID="1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" Mar 18 10:56:03 crc kubenswrapper[4778]: E0318 10:56:03.278612 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ansibletest-ansibletest\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/podified-antelope-centos9/openstack-ansible-tests:current-podified\\\"\"" pod="openstack/ansibletest-ansibletest" podUID="1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" Mar 18 10:56:03 crc kubenswrapper[4778]: I0318 10:56:03.750939 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jcs8"] Mar 18 10:56:03 crc kubenswrapper[4778]: I0318 10:56:03.775697 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563856-kvmq4"] Mar 18 10:56:04 crc kubenswrapper[4778]: I0318 10:56:04.284454 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563856-kvmq4" event={"ID":"0b69a324-153a-4262-92ea-62c8b9d5928e","Type":"ContainerStarted","Data":"1853e87f28a852e912f97a478758c1a4fbf38aebb6e3dd56004bc47fae74654b"} Mar 18 10:56:04 crc kubenswrapper[4778]: I0318 10:56:04.286362 4778 generic.go:334] "Generic (PLEG): container finished" podID="2b921080-6bfb-4a4d-b453-d5e2370a7558" containerID="92fab7a0e0684bdc6017d3c271db309e4a75794fc5c152b60ba7644b6b023eef" exitCode=0 Mar 18 10:56:04 crc kubenswrapper[4778]: I0318 10:56:04.286393 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jcs8" event={"ID":"2b921080-6bfb-4a4d-b453-d5e2370a7558","Type":"ContainerDied","Data":"92fab7a0e0684bdc6017d3c271db309e4a75794fc5c152b60ba7644b6b023eef"} Mar 18 10:56:04 crc kubenswrapper[4778]: I0318 10:56:04.286416 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jcs8" event={"ID":"2b921080-6bfb-4a4d-b453-d5e2370a7558","Type":"ContainerStarted","Data":"cc671246021c63a76c31044da6da18f15794dcb492863055a35ffee65d602e4f"} Mar 18 10:56:05 crc kubenswrapper[4778]: I0318 10:56:05.296920 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jcs8" 
event={"ID":"2b921080-6bfb-4a4d-b453-d5e2370a7558","Type":"ContainerStarted","Data":"eed9656b32fd07359ec6cd9c3a1e1005914ba25fb28dfbf0d3058b1259270899"} Mar 18 10:56:05 crc kubenswrapper[4778]: I0318 10:56:05.298886 4778 generic.go:334] "Generic (PLEG): container finished" podID="0b69a324-153a-4262-92ea-62c8b9d5928e" containerID="82c47033c6d17fb0d1f1f077c5ae48584be4ec251f8c624e7bed8591ae05dffd" exitCode=0 Mar 18 10:56:05 crc kubenswrapper[4778]: I0318 10:56:05.298925 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563856-kvmq4" event={"ID":"0b69a324-153a-4262-92ea-62c8b9d5928e","Type":"ContainerDied","Data":"82c47033c6d17fb0d1f1f077c5ae48584be4ec251f8c624e7bed8591ae05dffd"} Mar 18 10:56:06 crc kubenswrapper[4778]: I0318 10:56:06.309842 4778 generic.go:334] "Generic (PLEG): container finished" podID="2b921080-6bfb-4a4d-b453-d5e2370a7558" containerID="eed9656b32fd07359ec6cd9c3a1e1005914ba25fb28dfbf0d3058b1259270899" exitCode=0 Mar 18 10:56:06 crc kubenswrapper[4778]: I0318 10:56:06.309895 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jcs8" event={"ID":"2b921080-6bfb-4a4d-b453-d5e2370a7558","Type":"ContainerDied","Data":"eed9656b32fd07359ec6cd9c3a1e1005914ba25fb28dfbf0d3058b1259270899"} Mar 18 10:56:06 crc kubenswrapper[4778]: I0318 10:56:06.694954 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563856-kvmq4" Mar 18 10:56:06 crc kubenswrapper[4778]: I0318 10:56:06.767925 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94vhq\" (UniqueName: \"kubernetes.io/projected/0b69a324-153a-4262-92ea-62c8b9d5928e-kube-api-access-94vhq\") pod \"0b69a324-153a-4262-92ea-62c8b9d5928e\" (UID: \"0b69a324-153a-4262-92ea-62c8b9d5928e\") " Mar 18 10:56:06 crc kubenswrapper[4778]: I0318 10:56:06.778738 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b69a324-153a-4262-92ea-62c8b9d5928e-kube-api-access-94vhq" (OuterVolumeSpecName: "kube-api-access-94vhq") pod "0b69a324-153a-4262-92ea-62c8b9d5928e" (UID: "0b69a324-153a-4262-92ea-62c8b9d5928e"). InnerVolumeSpecName "kube-api-access-94vhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:56:06 crc kubenswrapper[4778]: I0318 10:56:06.871614 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94vhq\" (UniqueName: \"kubernetes.io/projected/0b69a324-153a-4262-92ea-62c8b9d5928e-kube-api-access-94vhq\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:07 crc kubenswrapper[4778]: I0318 10:56:07.320243 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563856-kvmq4" event={"ID":"0b69a324-153a-4262-92ea-62c8b9d5928e","Type":"ContainerDied","Data":"1853e87f28a852e912f97a478758c1a4fbf38aebb6e3dd56004bc47fae74654b"} Mar 18 10:56:07 crc kubenswrapper[4778]: I0318 10:56:07.320281 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1853e87f28a852e912f97a478758c1a4fbf38aebb6e3dd56004bc47fae74654b" Mar 18 10:56:07 crc kubenswrapper[4778]: I0318 10:56:07.320329 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563856-kvmq4" Mar 18 10:56:07 crc kubenswrapper[4778]: I0318 10:56:07.770746 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563850-jrcnr"] Mar 18 10:56:07 crc kubenswrapper[4778]: I0318 10:56:07.779419 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563850-jrcnr"] Mar 18 10:56:08 crc kubenswrapper[4778]: I0318 10:56:08.197046 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc499d31-e373-413b-8a38-1fa69f007f2f" path="/var/lib/kubelet/pods/fc499d31-e373-413b-8a38-1fa69f007f2f/volumes" Mar 18 10:56:09 crc kubenswrapper[4778]: I0318 10:56:09.337582 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jcs8" event={"ID":"2b921080-6bfb-4a4d-b453-d5e2370a7558","Type":"ContainerStarted","Data":"855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494"} Mar 18 10:56:09 crc kubenswrapper[4778]: I0318 10:56:09.356089 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5jcs8" podStartSLOduration=13.229774904 podStartE2EDuration="17.356067792s" podCreationTimestamp="2026-03-18 10:55:52 +0000 UTC" firstStartedPulling="2026-03-18 10:56:04.288515405 +0000 UTC m=+6830.863260255" lastFinishedPulling="2026-03-18 10:56:08.414808303 +0000 UTC m=+6834.989553143" observedRunningTime="2026-03-18 10:56:09.353691577 +0000 UTC m=+6835.928436427" watchObservedRunningTime="2026-03-18 10:56:09.356067792 +0000 UTC m=+6835.930812632" Mar 18 10:56:12 crc kubenswrapper[4778]: I0318 10:56:12.531651 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:56:12 crc kubenswrapper[4778]: I0318 10:56:12.531979 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:56:12 crc kubenswrapper[4778]: I0318 10:56:12.607217 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:56:13 crc kubenswrapper[4778]: I0318 10:56:13.433433 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:56:13 crc kubenswrapper[4778]: I0318 10:56:13.480577 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jcs8"] Mar 18 10:56:15 crc kubenswrapper[4778]: I0318 10:56:15.389166 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5jcs8" podUID="2b921080-6bfb-4a4d-b453-d5e2370a7558" containerName="registry-server" containerID="cri-o://855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494" gracePeriod=2 Mar 18 10:56:15 crc kubenswrapper[4778]: I0318 10:56:15.879104 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:56:15 crc kubenswrapper[4778]: I0318 10:56:15.979173 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mmnl\" (UniqueName: \"kubernetes.io/projected/2b921080-6bfb-4a4d-b453-d5e2370a7558-kube-api-access-9mmnl\") pod \"2b921080-6bfb-4a4d-b453-d5e2370a7558\" (UID: \"2b921080-6bfb-4a4d-b453-d5e2370a7558\") " Mar 18 10:56:15 crc kubenswrapper[4778]: I0318 10:56:15.979404 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b921080-6bfb-4a4d-b453-d5e2370a7558-catalog-content\") pod \"2b921080-6bfb-4a4d-b453-d5e2370a7558\" (UID: \"2b921080-6bfb-4a4d-b453-d5e2370a7558\") " Mar 18 10:56:15 crc kubenswrapper[4778]: I0318 10:56:15.979444 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b921080-6bfb-4a4d-b453-d5e2370a7558-utilities\") pod \"2b921080-6bfb-4a4d-b453-d5e2370a7558\" (UID: \"2b921080-6bfb-4a4d-b453-d5e2370a7558\") " Mar 18 10:56:15 crc kubenswrapper[4778]: I0318 10:56:15.980813 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b921080-6bfb-4a4d-b453-d5e2370a7558-utilities" (OuterVolumeSpecName: "utilities") pod "2b921080-6bfb-4a4d-b453-d5e2370a7558" (UID: "2b921080-6bfb-4a4d-b453-d5e2370a7558"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:56:15 crc kubenswrapper[4778]: I0318 10:56:15.989173 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b921080-6bfb-4a4d-b453-d5e2370a7558-kube-api-access-9mmnl" (OuterVolumeSpecName: "kube-api-access-9mmnl") pod "2b921080-6bfb-4a4d-b453-d5e2370a7558" (UID: "2b921080-6bfb-4a4d-b453-d5e2370a7558"). InnerVolumeSpecName "kube-api-access-9mmnl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.011460 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b921080-6bfb-4a4d-b453-d5e2370a7558-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b921080-6bfb-4a4d-b453-d5e2370a7558" (UID: "2b921080-6bfb-4a4d-b453-d5e2370a7558"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.082813 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mmnl\" (UniqueName: \"kubernetes.io/projected/2b921080-6bfb-4a4d-b453-d5e2370a7558-kube-api-access-9mmnl\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.082855 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b921080-6bfb-4a4d-b453-d5e2370a7558-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.082870 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b921080-6bfb-4a4d-b453-d5e2370a7558-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.403504 4778 generic.go:334] "Generic (PLEG): container finished" podID="2b921080-6bfb-4a4d-b453-d5e2370a7558" containerID="855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494" exitCode=0 Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.403577 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jcs8" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.403605 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jcs8" event={"ID":"2b921080-6bfb-4a4d-b453-d5e2370a7558","Type":"ContainerDied","Data":"855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494"} Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.403954 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jcs8" event={"ID":"2b921080-6bfb-4a4d-b453-d5e2370a7558","Type":"ContainerDied","Data":"cc671246021c63a76c31044da6da18f15794dcb492863055a35ffee65d602e4f"} Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.403984 4778 scope.go:117] "RemoveContainer" containerID="855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.433112 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jcs8"] Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.438168 4778 scope.go:117] "RemoveContainer" containerID="eed9656b32fd07359ec6cd9c3a1e1005914ba25fb28dfbf0d3058b1259270899" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.443756 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jcs8"] Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.463277 4778 scope.go:117] "RemoveContainer" containerID="92fab7a0e0684bdc6017d3c271db309e4a75794fc5c152b60ba7644b6b023eef" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.527794 4778 scope.go:117] "RemoveContainer" containerID="855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494" Mar 18 10:56:16 crc kubenswrapper[4778]: E0318 10:56:16.528294 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494\": container with ID starting with 855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494 not found: ID does not exist" containerID="855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.528328 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494"} err="failed to get container status \"855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494\": rpc error: code = NotFound desc = could not find container \"855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494\": container with ID starting with 855e824eb89a9d487684c014c514d781fb6794a0a03c78bee1f0b6d008d78494 not found: ID does not exist" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.528351 4778 scope.go:117] "RemoveContainer" containerID="eed9656b32fd07359ec6cd9c3a1e1005914ba25fb28dfbf0d3058b1259270899" Mar 18 10:56:16 crc kubenswrapper[4778]: E0318 10:56:16.528679 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eed9656b32fd07359ec6cd9c3a1e1005914ba25fb28dfbf0d3058b1259270899\": container with ID starting with eed9656b32fd07359ec6cd9c3a1e1005914ba25fb28dfbf0d3058b1259270899 not found: ID does not exist" containerID="eed9656b32fd07359ec6cd9c3a1e1005914ba25fb28dfbf0d3058b1259270899" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.528703 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eed9656b32fd07359ec6cd9c3a1e1005914ba25fb28dfbf0d3058b1259270899"} err="failed to get container status \"eed9656b32fd07359ec6cd9c3a1e1005914ba25fb28dfbf0d3058b1259270899\": rpc error: code = NotFound desc = could not find container \"eed9656b32fd07359ec6cd9c3a1e1005914ba25fb28dfbf0d3058b1259270899\": container with ID 
starting with eed9656b32fd07359ec6cd9c3a1e1005914ba25fb28dfbf0d3058b1259270899 not found: ID does not exist" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.528717 4778 scope.go:117] "RemoveContainer" containerID="92fab7a0e0684bdc6017d3c271db309e4a75794fc5c152b60ba7644b6b023eef" Mar 18 10:56:16 crc kubenswrapper[4778]: E0318 10:56:16.529034 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92fab7a0e0684bdc6017d3c271db309e4a75794fc5c152b60ba7644b6b023eef\": container with ID starting with 92fab7a0e0684bdc6017d3c271db309e4a75794fc5c152b60ba7644b6b023eef not found: ID does not exist" containerID="92fab7a0e0684bdc6017d3c271db309e4a75794fc5c152b60ba7644b6b023eef" Mar 18 10:56:16 crc kubenswrapper[4778]: I0318 10:56:16.529061 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92fab7a0e0684bdc6017d3c271db309e4a75794fc5c152b60ba7644b6b023eef"} err="failed to get container status \"92fab7a0e0684bdc6017d3c271db309e4a75794fc5c152b60ba7644b6b023eef\": rpc error: code = NotFound desc = could not find container \"92fab7a0e0684bdc6017d3c271db309e4a75794fc5c152b60ba7644b6b023eef\": container with ID starting with 92fab7a0e0684bdc6017d3c271db309e4a75794fc5c152b60ba7644b6b023eef not found: ID does not exist" Mar 18 10:56:18 crc kubenswrapper[4778]: I0318 10:56:18.197804 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b921080-6bfb-4a4d-b453-d5e2370a7558" path="/var/lib/kubelet/pods/2b921080-6bfb-4a4d-b453-d5e2370a7558/volumes" Mar 18 10:56:18 crc kubenswrapper[4778]: I0318 10:56:18.427698 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ansibletest-ansibletest" event={"ID":"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9","Type":"ContainerStarted","Data":"16b1f9a1400b5530a46aaeb11db97cc9f9066213e3702e8bb6c8ab6c4b6e6715"} Mar 18 10:56:18 crc kubenswrapper[4778]: I0318 10:56:18.452344 4778 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ansibletest-ansibletest" podStartSLOduration=4.028479033 podStartE2EDuration="34.45231919s" podCreationTimestamp="2026-03-18 10:55:44 +0000 UTC" firstStartedPulling="2026-03-18 10:55:46.462265297 +0000 UTC m=+6813.037010137" lastFinishedPulling="2026-03-18 10:56:16.886105454 +0000 UTC m=+6843.460850294" observedRunningTime="2026-03-18 10:56:18.443888781 +0000 UTC m=+6845.018633631" watchObservedRunningTime="2026-03-18 10:56:18.45231919 +0000 UTC m=+6845.027064030" Mar 18 10:56:19 crc kubenswrapper[4778]: I0318 10:56:19.444082 4778 generic.go:334] "Generic (PLEG): container finished" podID="1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" containerID="16b1f9a1400b5530a46aaeb11db97cc9f9066213e3702e8bb6c8ab6c4b6e6715" exitCode=0 Mar 18 10:56:19 crc kubenswrapper[4778]: I0318 10:56:19.444187 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ansibletest-ansibletest" event={"ID":"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9","Type":"ContainerDied","Data":"16b1f9a1400b5530a46aaeb11db97cc9f9066213e3702e8bb6c8ab6c4b6e6715"} Mar 18 10:56:20 crc kubenswrapper[4778]: I0318 10:56:20.915736 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ansibletest-ansibletest" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.096456 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j742h\" (UniqueName: \"kubernetes.io/projected/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-kube-api-access-j742h\") pod \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.096541 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-workload-ssh-secret\") pod \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.096569 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-ceph\") pod \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.096618 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-ca-certs\") pod \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.096670 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-openstack-config-secret\") pod \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.096707 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-compute-ssh-secret\") pod \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.096805 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-test-operator-ephemeral-temporary\") pod \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.096847 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.096877 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-openstack-config\") pod \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.096905 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-test-operator-ephemeral-workdir\") pod \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\" (UID: \"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9\") " Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.098076 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod 
"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" (UID: "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.112727 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" (UID: "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.113715 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-ceph" (OuterVolumeSpecName: "ceph") pod "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" (UID: "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.118924 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" (UID: "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.125354 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-kube-api-access-j742h" (OuterVolumeSpecName: "kube-api-access-j742h") pod "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" (UID: "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9"). InnerVolumeSpecName "kube-api-access-j742h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.133914 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" (UID: "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.141388 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-compute-ssh-secret" (OuterVolumeSpecName: "compute-ssh-secret") pod "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" (UID: "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9"). InnerVolumeSpecName "compute-ssh-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.162612 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" (UID: "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.162992 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-workload-ssh-secret" (OuterVolumeSpecName: "workload-ssh-secret") pod "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" (UID: "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9"). InnerVolumeSpecName "workload-ssh-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.172892 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" (UID: "1fb58f5e-1c8b-45e2-bf86-b81af58b66a9"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.199181 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.199530 4778 reconciler_common.go:293] "Volume detached for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-compute-ssh-secret\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.199638 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.199778 4778 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.199880 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.199974 4778 reconciler_common.go:293] "Volume detached 
for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.200055 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j742h\" (UniqueName: \"kubernetes.io/projected/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-kube-api-access-j742h\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.200150 4778 reconciler_common.go:293] "Volume detached for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-workload-ssh-secret\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.200278 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.200369 4778 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1fb58f5e-1c8b-45e2-bf86-b81af58b66a9-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.219946 4778 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.302798 4778 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.464094 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ansibletest-ansibletest" 
event={"ID":"1fb58f5e-1c8b-45e2-bf86-b81af58b66a9","Type":"ContainerDied","Data":"ec28f4abe6951258d35c7175f6d4f29db741687b55b7cbd44e65672620fd1045"} Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.464137 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec28f4abe6951258d35c7175f6d4f29db741687b55b7cbd44e65672620fd1045" Mar 18 10:56:21 crc kubenswrapper[4778]: I0318 10:56:21.464481 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.011218 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest"] Mar 18 10:56:30 crc kubenswrapper[4778]: E0318 10:56:30.012409 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b921080-6bfb-4a4d-b453-d5e2370a7558" containerName="extract-content" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.012433 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b921080-6bfb-4a4d-b453-d5e2370a7558" containerName="extract-content" Mar 18 10:56:30 crc kubenswrapper[4778]: E0318 10:56:30.012462 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b921080-6bfb-4a4d-b453-d5e2370a7558" containerName="extract-utilities" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.012474 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b921080-6bfb-4a4d-b453-d5e2370a7558" containerName="extract-utilities" Mar 18 10:56:30 crc kubenswrapper[4778]: E0318 10:56:30.012507 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b921080-6bfb-4a4d-b453-d5e2370a7558" containerName="registry-server" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.012517 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b921080-6bfb-4a4d-b453-d5e2370a7558" containerName="registry-server" Mar 18 10:56:30 crc kubenswrapper[4778]: E0318 
10:56:30.012539 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b69a324-153a-4262-92ea-62c8b9d5928e" containerName="oc" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.012550 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b69a324-153a-4262-92ea-62c8b9d5928e" containerName="oc" Mar 18 10:56:30 crc kubenswrapper[4778]: E0318 10:56:30.012569 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" containerName="ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.012582 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" containerName="ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.012902 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b69a324-153a-4262-92ea-62c8b9d5928e" containerName="oc" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.012933 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fb58f5e-1c8b-45e2-bf86-b81af58b66a9" containerName="ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.012953 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b921080-6bfb-4a4d-b453-d5e2370a7558" containerName="registry-server" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.014023 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.023215 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest"] Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.203073 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mppjz\" (UniqueName: \"kubernetes.io/projected/1f57757d-6483-4e1a-9a09-e63026f73e70-kube-api-access-mppjz\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"1f57757d-6483-4e1a-9a09-e63026f73e70\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.203126 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"1f57757d-6483-4e1a-9a09-e63026f73e70\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.304722 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mppjz\" (UniqueName: \"kubernetes.io/projected/1f57757d-6483-4e1a-9a09-e63026f73e70-kube-api-access-mppjz\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"1f57757d-6483-4e1a-9a09-e63026f73e70\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.304784 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: 
\"1f57757d-6483-4e1a-9a09-e63026f73e70\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.305367 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"1f57757d-6483-4e1a-9a09-e63026f73e70\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.327994 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mppjz\" (UniqueName: \"kubernetes.io/projected/1f57757d-6483-4e1a-9a09-e63026f73e70-kube-api-access-mppjz\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"1f57757d-6483-4e1a-9a09-e63026f73e70\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.330515 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"1f57757d-6483-4e1a-9a09-e63026f73e70\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.361148 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Mar 18 10:56:30 crc kubenswrapper[4778]: I0318 10:56:30.854772 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest"] Mar 18 10:56:31 crc kubenswrapper[4778]: I0318 10:56:31.559507 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" event={"ID":"1f57757d-6483-4e1a-9a09-e63026f73e70","Type":"ContainerStarted","Data":"ec2d906ea27da463130b202e3b2a5fb6e590d46d02cf356b02ad075d9bf32c7c"} Mar 18 10:56:32 crc kubenswrapper[4778]: I0318 10:56:32.571667 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" event={"ID":"1f57757d-6483-4e1a-9a09-e63026f73e70","Type":"ContainerStarted","Data":"10186be56feafc053b6915c5ba5f3e6d045f382c5c6e998ae7af8cea176468ea"} Mar 18 10:56:32 crc kubenswrapper[4778]: I0318 10:56:32.595251 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" podStartSLOduration=3.059411259 podStartE2EDuration="3.595228787s" podCreationTimestamp="2026-03-18 10:56:29 +0000 UTC" firstStartedPulling="2026-03-18 10:56:30.848164955 +0000 UTC m=+6857.422909835" lastFinishedPulling="2026-03-18 10:56:31.383982503 +0000 UTC m=+6857.958727363" observedRunningTime="2026-03-18 10:56:32.592722449 +0000 UTC m=+6859.167467309" watchObservedRunningTime="2026-03-18 10:56:32.595228787 +0000 UTC m=+6859.169973637" Mar 18 10:56:37 crc kubenswrapper[4778]: I0318 10:56:37.415391 4778 scope.go:117] "RemoveContainer" containerID="f14aa13ab7520a46eb0a2d95277ca80a41dbb7bc18bd9145e42c1b01a855cabf" Mar 18 10:56:43 crc kubenswrapper[4778]: I0318 10:56:43.844350 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizontest-tests-horizontest"] Mar 18 10:56:43 
crc kubenswrapper[4778]: I0318 10:56:43.846692 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:43 crc kubenswrapper[4778]: I0318 10:56:43.848976 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizontest-tests-horizontesthorizontest-config" Mar 18 10:56:43 crc kubenswrapper[4778]: I0318 10:56:43.849601 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"test-operator-clouds-config" Mar 18 10:56:43 crc kubenswrapper[4778]: I0318 10:56:43.861744 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizontest-tests-horizontest"] Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.023694 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.023756 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/49ff1200-d42e-4022-990d-619169f357f4-test-operator-ephemeral-temporary\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.023816 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-ca-certs\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.023880 
4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-openstack-config-secret\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.023915 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-ceph\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.023944 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8grn\" (UniqueName: \"kubernetes.io/projected/49ff1200-d42e-4022-990d-619169f357f4-kube-api-access-f8grn\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.024012 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/49ff1200-d42e-4022-990d-619169f357f4-test-operator-clouds-config\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.024035 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/49ff1200-d42e-4022-990d-619169f357f4-test-operator-ephemeral-workdir\") pod \"horizontest-tests-horizontest\" (UID: 
\"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.125837 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/49ff1200-d42e-4022-990d-619169f357f4-test-operator-clouds-config\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.125889 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/49ff1200-d42e-4022-990d-619169f357f4-test-operator-ephemeral-workdir\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.125923 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.125957 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/49ff1200-d42e-4022-990d-619169f357f4-test-operator-ephemeral-temporary\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.125986 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-ca-certs\") pod 
\"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.126045 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-openstack-config-secret\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.126092 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-ceph\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.126122 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8grn\" (UniqueName: \"kubernetes.io/projected/49ff1200-d42e-4022-990d-619169f357f4-kube-api-access-f8grn\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.126767 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/49ff1200-d42e-4022-990d-619169f357f4-test-operator-ephemeral-workdir\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.127476 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/49ff1200-d42e-4022-990d-619169f357f4-test-operator-ephemeral-temporary\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.127735 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.128466 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/49ff1200-d42e-4022-990d-619169f357f4-test-operator-clouds-config\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.135155 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-ceph\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.135608 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-ca-certs\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.147499 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-openstack-config-secret\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.157706 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8grn\" (UniqueName: \"kubernetes.io/projected/49ff1200-d42e-4022-990d-619169f357f4-kube-api-access-f8grn\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.165985 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"horizontest-tests-horizontest\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.174841 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizontest-tests-horizontest" Mar 18 10:56:44 crc kubenswrapper[4778]: W0318 10:56:44.708828 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49ff1200_d42e_4022_990d_619169f357f4.slice/crio-4f8bd5ec162b5a7ad044f9435fa7f84993d2997926262260a373bb92e1fa8a81 WatchSource:0}: Error finding container 4f8bd5ec162b5a7ad044f9435fa7f84993d2997926262260a373bb92e1fa8a81: Status 404 returned error can't find the container with id 4f8bd5ec162b5a7ad044f9435fa7f84993d2997926262260a373bb92e1fa8a81 Mar 18 10:56:44 crc kubenswrapper[4778]: I0318 10:56:44.708894 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizontest-tests-horizontest"] Mar 18 10:56:45 crc kubenswrapper[4778]: I0318 10:56:45.702439 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizontest-tests-horizontest" event={"ID":"49ff1200-d42e-4022-990d-619169f357f4","Type":"ContainerStarted","Data":"4f8bd5ec162b5a7ad044f9435fa7f84993d2997926262260a373bb92e1fa8a81"} Mar 18 10:57:02 crc kubenswrapper[4778]: E0318 10:57:02.708027 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizontest:current-podified" Mar 18 10:57:02 crc kubenswrapper[4778]: E0318 10:57:02.708835 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:horizontest-tests-horizontest,Image:quay.io/podified-antelope-centos9/openstack-horizontest:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADMIN_PASSWORD,Value:12345678,ValueFrom:nil,},EnvVar{Name:ADMIN_USERNAME,Value:admin,ValueFrom:nil,},EnvVar{Name:AUTH_URL,Value:https://keystone-public-openstack.apps-crc.testing,ValueFrom:nil,},EnvVar{Name:DASHBOARD_URL,Value:https://horizon-openstack.apps-crc.testing/,ValueFrom:nil,},EnvVar{Name:EXTRA_FLAG,Value:not pagination and test_users.py,ValueFrom:nil,},EnvVar{Name:FLAVOR_NAME,Value:m1.tiny,ValueFrom:nil,},EnvVar{Name:HORIZONTEST_DEBUG_MODE,Value:false,ValueFrom:nil,},EnvVar{Name:HORIZON_KEYS_FOLDER,Value:/etc/test_operator,ValueFrom:nil,},EnvVar{Name:HORIZON_LOGS_DIR_NAME,Value:horizon,ValueFrom:nil,},EnvVar{Name:HORIZON_REPO_BRANCH,Value:master,ValueFrom:nil,},EnvVar{Name:IMAGE_FILE,Value:/var/lib/horizontest/cirros-0.6.2-x86_64-disk.img,ValueFrom:nil,},EnvVar{Name:IMAGE_FILE_NAME,Value:cirros-0.6.2-x86_64-disk,ValueFrom:nil,},EnvVar{Name:IMAGE_URL,Value:http://download.cirros-cloud.net/0.6.2/cirros-0.6.2-x86_64-disk.img,ValueFrom:nil,},EnvVar{Name:PASSWORD,Value:horizontest,ValueFrom:nil,},EnvVar{Name:PROJECT_NAME,Value:horizontest,ValueFrom:nil,},EnvVar{Name:PROJECT_NAME_XPATH,Value://*[@class=\"context-project\"]//ancestor::ul,ValueFrom:nil,},EnvVar{Name:REPO_URL,Value:https://review.opendev.org/openstack/horizon,ValueFrom:nil,},EnvVar{Name:USER_NAME,Value:horizontest,ValueFrom:nil,},EnvVar{Name:USE_EXTERNAL_FILES,Value:True,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{2 0} {} 2 DecimalSI},memory: {{4294967296 0} {} 4Gi BinarySI},},Requests:ResourceList{cpu: {{1 0} {} 1 DecimalSI},memory: {{2147483648 0} {} 2Gi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/horizontest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/horizontest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-clouds-config,ReadOnly:true,MountPath:/var/lib/horizontest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-clouds-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ca-bundle.trust.crt,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ceph,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f8grn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN 
NET_RAW],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42455,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42455,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizontest-tests-horizontest_openstack(49ff1200-d42e-4022-990d-619169f357f4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 10:57:02 crc kubenswrapper[4778]: E0318 10:57:02.710140 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"horizontest-tests-horizontest\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/horizontest-tests-horizontest" podUID="49ff1200-d42e-4022-990d-619169f357f4" Mar 18 10:57:02 crc kubenswrapper[4778]: E0318 10:57:02.932993 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"horizontest-tests-horizontest\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizontest:current-podified\\\"\"" pod="openstack/horizontest-tests-horizontest" podUID="49ff1200-d42e-4022-990d-619169f357f4" Mar 18 10:57:18 crc kubenswrapper[4778]: I0318 10:57:18.117796 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizontest-tests-horizontest" event={"ID":"49ff1200-d42e-4022-990d-619169f357f4","Type":"ContainerStarted","Data":"9bb7e83c5b0c33f61c11e78cebbab0ce419ef90ac66a563b0301647b017512a0"} Mar 18 10:57:18 crc kubenswrapper[4778]: I0318 10:57:18.153303 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizontest-tests-horizontest" 
podStartSLOduration=3.822479441 podStartE2EDuration="36.153279737s" podCreationTimestamp="2026-03-18 10:56:42 +0000 UTC" firstStartedPulling="2026-03-18 10:56:44.711167198 +0000 UTC m=+6871.285912038" lastFinishedPulling="2026-03-18 10:57:17.041967494 +0000 UTC m=+6903.616712334" observedRunningTime="2026-03-18 10:57:18.150745768 +0000 UTC m=+6904.725490638" watchObservedRunningTime="2026-03-18 10:57:18.153279737 +0000 UTC m=+6904.728024627" Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.141029 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563858-mb4zj"] Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.143060 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563858-mb4zj" Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.145711 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.145789 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.147105 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.147167 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.158779 4778 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.162684 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563858-mb4zj"] Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.266147 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqq6j\" (UniqueName: \"kubernetes.io/projected/9e4f7f22-f4dd-4291-b26b-1a54380c3851-kube-api-access-rqq6j\") pod \"auto-csr-approver-29563858-mb4zj\" (UID: \"9e4f7f22-f4dd-4291-b26b-1a54380c3851\") " pod="openshift-infra/auto-csr-approver-29563858-mb4zj" Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.368664 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqq6j\" (UniqueName: \"kubernetes.io/projected/9e4f7f22-f4dd-4291-b26b-1a54380c3851-kube-api-access-rqq6j\") pod \"auto-csr-approver-29563858-mb4zj\" (UID: \"9e4f7f22-f4dd-4291-b26b-1a54380c3851\") " pod="openshift-infra/auto-csr-approver-29563858-mb4zj" Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.387986 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqq6j\" (UniqueName: \"kubernetes.io/projected/9e4f7f22-f4dd-4291-b26b-1a54380c3851-kube-api-access-rqq6j\") pod \"auto-csr-approver-29563858-mb4zj\" (UID: \"9e4f7f22-f4dd-4291-b26b-1a54380c3851\") " pod="openshift-infra/auto-csr-approver-29563858-mb4zj" Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.477270 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563858-mb4zj" Mar 18 10:58:00 crc kubenswrapper[4778]: I0318 10:58:00.945829 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563858-mb4zj"] Mar 18 10:58:01 crc kubenswrapper[4778]: I0318 10:58:01.585754 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563858-mb4zj" event={"ID":"9e4f7f22-f4dd-4291-b26b-1a54380c3851","Type":"ContainerStarted","Data":"b0a8368bf55253e44d9be8d25a50cd562e2598363e672850c49a53f03d5d483c"} Mar 18 10:58:02 crc kubenswrapper[4778]: I0318 10:58:02.594321 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563858-mb4zj" event={"ID":"9e4f7f22-f4dd-4291-b26b-1a54380c3851","Type":"ContainerStarted","Data":"301c31c55dd167d2c6c06a6c3d13b7a706f6ed65cd7e2a490dde753952b7fad3"} Mar 18 10:58:03 crc kubenswrapper[4778]: I0318 10:58:03.605305 4778 generic.go:334] "Generic (PLEG): container finished" podID="9e4f7f22-f4dd-4291-b26b-1a54380c3851" containerID="301c31c55dd167d2c6c06a6c3d13b7a706f6ed65cd7e2a490dde753952b7fad3" exitCode=0 Mar 18 10:58:03 crc kubenswrapper[4778]: I0318 10:58:03.605364 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563858-mb4zj" event={"ID":"9e4f7f22-f4dd-4291-b26b-1a54380c3851","Type":"ContainerDied","Data":"301c31c55dd167d2c6c06a6c3d13b7a706f6ed65cd7e2a490dde753952b7fad3"} Mar 18 10:58:04 crc kubenswrapper[4778]: I0318 10:58:04.962456 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563858-mb4zj" Mar 18 10:58:05 crc kubenswrapper[4778]: I0318 10:58:05.068014 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqq6j\" (UniqueName: \"kubernetes.io/projected/9e4f7f22-f4dd-4291-b26b-1a54380c3851-kube-api-access-rqq6j\") pod \"9e4f7f22-f4dd-4291-b26b-1a54380c3851\" (UID: \"9e4f7f22-f4dd-4291-b26b-1a54380c3851\") " Mar 18 10:58:05 crc kubenswrapper[4778]: I0318 10:58:05.074499 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e4f7f22-f4dd-4291-b26b-1a54380c3851-kube-api-access-rqq6j" (OuterVolumeSpecName: "kube-api-access-rqq6j") pod "9e4f7f22-f4dd-4291-b26b-1a54380c3851" (UID: "9e4f7f22-f4dd-4291-b26b-1a54380c3851"). InnerVolumeSpecName "kube-api-access-rqq6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:58:05 crc kubenswrapper[4778]: I0318 10:58:05.171096 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqq6j\" (UniqueName: \"kubernetes.io/projected/9e4f7f22-f4dd-4291-b26b-1a54380c3851-kube-api-access-rqq6j\") on node \"crc\" DevicePath \"\"" Mar 18 10:58:05 crc kubenswrapper[4778]: I0318 10:58:05.627043 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563858-mb4zj" event={"ID":"9e4f7f22-f4dd-4291-b26b-1a54380c3851","Type":"ContainerDied","Data":"b0a8368bf55253e44d9be8d25a50cd562e2598363e672850c49a53f03d5d483c"} Mar 18 10:58:05 crc kubenswrapper[4778]: I0318 10:58:05.627104 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0a8368bf55253e44d9be8d25a50cd562e2598363e672850c49a53f03d5d483c" Mar 18 10:58:05 crc kubenswrapper[4778]: I0318 10:58:05.627159 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563858-mb4zj" Mar 18 10:58:05 crc kubenswrapper[4778]: I0318 10:58:05.700465 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563852-h8wf4"] Mar 18 10:58:05 crc kubenswrapper[4778]: I0318 10:58:05.712918 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563852-h8wf4"] Mar 18 10:58:06 crc kubenswrapper[4778]: I0318 10:58:06.198220 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="902826c1-406d-4d16-8655-4a85ff4a3205" path="/var/lib/kubelet/pods/902826c1-406d-4d16-8655-4a85ff4a3205/volumes" Mar 18 10:58:30 crc kubenswrapper[4778]: I0318 10:58:30.147946 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:58:30 crc kubenswrapper[4778]: I0318 10:58:30.149797 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:58:37 crc kubenswrapper[4778]: I0318 10:58:37.565825 4778 scope.go:117] "RemoveContainer" containerID="96707a6d26a4e59376b5ccb6d995399c8158c2cfc047ca02c91e8c3ceb00d6d6" Mar 18 10:59:00 crc kubenswrapper[4778]: I0318 10:59:00.147564 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:59:00 crc kubenswrapper[4778]: 
I0318 10:59:00.148518 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:59:00 crc kubenswrapper[4778]: I0318 10:59:00.148583 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 10:59:00 crc kubenswrapper[4778]: I0318 10:59:00.149724 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3a30fa73502544b7ae8a345c4023efae179b70a66379eb3e45aa01314f418265"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 10:59:00 crc kubenswrapper[4778]: I0318 10:59:00.149823 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://3a30fa73502544b7ae8a345c4023efae179b70a66379eb3e45aa01314f418265" gracePeriod=600 Mar 18 10:59:01 crc kubenswrapper[4778]: I0318 10:59:01.252477 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="3a30fa73502544b7ae8a345c4023efae179b70a66379eb3e45aa01314f418265" exitCode=0 Mar 18 10:59:01 crc kubenswrapper[4778]: I0318 10:59:01.252517 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"3a30fa73502544b7ae8a345c4023efae179b70a66379eb3e45aa01314f418265"} Mar 18 10:59:01 crc 
kubenswrapper[4778]: I0318 10:59:01.253297 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9"} Mar 18 10:59:01 crc kubenswrapper[4778]: I0318 10:59:01.253342 4778 scope.go:117] "RemoveContainer" containerID="328d3a1f035ddb05f7bd451ba95503b71140c197b3067af842272bf94f64b8aa" Mar 18 10:59:13 crc kubenswrapper[4778]: I0318 10:59:13.387117 4778 generic.go:334] "Generic (PLEG): container finished" podID="49ff1200-d42e-4022-990d-619169f357f4" containerID="9bb7e83c5b0c33f61c11e78cebbab0ce419ef90ac66a563b0301647b017512a0" exitCode=0 Mar 18 10:59:13 crc kubenswrapper[4778]: I0318 10:59:13.387247 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizontest-tests-horizontest" event={"ID":"49ff1200-d42e-4022-990d-619169f357f4","Type":"ContainerDied","Data":"9bb7e83c5b0c33f61c11e78cebbab0ce419ef90ac66a563b0301647b017512a0"} Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.802775 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizontest-tests-horizontest" Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.899939 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8grn\" (UniqueName: \"kubernetes.io/projected/49ff1200-d42e-4022-990d-619169f357f4-kube-api-access-f8grn\") pod \"49ff1200-d42e-4022-990d-619169f357f4\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.900073 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/49ff1200-d42e-4022-990d-619169f357f4-test-operator-clouds-config\") pod \"49ff1200-d42e-4022-990d-619169f357f4\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.900230 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-ceph\") pod \"49ff1200-d42e-4022-990d-619169f357f4\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.900281 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/49ff1200-d42e-4022-990d-619169f357f4-test-operator-ephemeral-temporary\") pod \"49ff1200-d42e-4022-990d-619169f357f4\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.900633 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"49ff1200-d42e-4022-990d-619169f357f4\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.900679 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-ca-certs\") pod \"49ff1200-d42e-4022-990d-619169f357f4\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.900783 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-openstack-config-secret\") pod \"49ff1200-d42e-4022-990d-619169f357f4\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.900877 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/49ff1200-d42e-4022-990d-619169f357f4-test-operator-ephemeral-workdir\") pod \"49ff1200-d42e-4022-990d-619169f357f4\" (UID: \"49ff1200-d42e-4022-990d-619169f357f4\") " Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.901598 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49ff1200-d42e-4022-990d-619169f357f4-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "49ff1200-d42e-4022-990d-619169f357f4" (UID: "49ff1200-d42e-4022-990d-619169f357f4"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.907610 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "test-operator-logs") pod "49ff1200-d42e-4022-990d-619169f357f4" (UID: "49ff1200-d42e-4022-990d-619169f357f4"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.908745 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-ceph" (OuterVolumeSpecName: "ceph") pod "49ff1200-d42e-4022-990d-619169f357f4" (UID: "49ff1200-d42e-4022-990d-619169f357f4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.910444 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ff1200-d42e-4022-990d-619169f357f4-kube-api-access-f8grn" (OuterVolumeSpecName: "kube-api-access-f8grn") pod "49ff1200-d42e-4022-990d-619169f357f4" (UID: "49ff1200-d42e-4022-990d-619169f357f4"). InnerVolumeSpecName "kube-api-access-f8grn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.945051 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "49ff1200-d42e-4022-990d-619169f357f4" (UID: "49ff1200-d42e-4022-990d-619169f357f4"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.961328 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "49ff1200-d42e-4022-990d-619169f357f4" (UID: "49ff1200-d42e-4022-990d-619169f357f4"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:59:14 crc kubenswrapper[4778]: I0318 10:59:14.995764 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49ff1200-d42e-4022-990d-619169f357f4-test-operator-clouds-config" (OuterVolumeSpecName: "test-operator-clouds-config") pod "49ff1200-d42e-4022-990d-619169f357f4" (UID: "49ff1200-d42e-4022-990d-619169f357f4"). InnerVolumeSpecName "test-operator-clouds-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.006475 4778 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.006512 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8grn\" (UniqueName: \"kubernetes.io/projected/49ff1200-d42e-4022-990d-619169f357f4-kube-api-access-f8grn\") on node \"crc\" DevicePath \"\"" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.006525 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/49ff1200-d42e-4022-990d-619169f357f4-test-operator-clouds-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.006535 4778 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.006545 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/49ff1200-d42e-4022-990d-619169f357f4-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 
10:59:15.006579 4778 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.006592 4778 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/49ff1200-d42e-4022-990d-619169f357f4-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.033396 4778 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.109260 4778 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.155469 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49ff1200-d42e-4022-990d-619169f357f4-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "49ff1200-d42e-4022-990d-619169f357f4" (UID: "49ff1200-d42e-4022-990d-619169f357f4"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.211350 4778 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/49ff1200-d42e-4022-990d-619169f357f4-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.415516 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizontest-tests-horizontest" event={"ID":"49ff1200-d42e-4022-990d-619169f357f4","Type":"ContainerDied","Data":"4f8bd5ec162b5a7ad044f9435fa7f84993d2997926262260a373bb92e1fa8a81"} Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.415747 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f8bd5ec162b5a7ad044f9435fa7f84993d2997926262260a373bb92e1fa8a81" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.415897 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizontest-tests-horizontest" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.789337 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h6pkp"] Mar 18 10:59:15 crc kubenswrapper[4778]: E0318 10:59:15.789932 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ff1200-d42e-4022-990d-619169f357f4" containerName="horizontest-tests-horizontest" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.789959 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ff1200-d42e-4022-990d-619169f357f4" containerName="horizontest-tests-horizontest" Mar 18 10:59:15 crc kubenswrapper[4778]: E0318 10:59:15.789992 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e4f7f22-f4dd-4291-b26b-1a54380c3851" containerName="oc" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.790002 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e4f7f22-f4dd-4291-b26b-1a54380c3851" containerName="oc" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.790351 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ff1200-d42e-4022-990d-619169f357f4" containerName="horizontest-tests-horizontest" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.790373 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e4f7f22-f4dd-4291-b26b-1a54380c3851" containerName="oc" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.792669 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.810986 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h6pkp"] Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.932305 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/752de958-6cfc-4ceb-84c4-006b0719f0a5-catalog-content\") pod \"redhat-operators-h6pkp\" (UID: \"752de958-6cfc-4ceb-84c4-006b0719f0a5\") " pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.932536 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m8r2\" (UniqueName: \"kubernetes.io/projected/752de958-6cfc-4ceb-84c4-006b0719f0a5-kube-api-access-7m8r2\") pod \"redhat-operators-h6pkp\" (UID: \"752de958-6cfc-4ceb-84c4-006b0719f0a5\") " pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:15 crc kubenswrapper[4778]: I0318 10:59:15.932651 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/752de958-6cfc-4ceb-84c4-006b0719f0a5-utilities\") pod \"redhat-operators-h6pkp\" (UID: \"752de958-6cfc-4ceb-84c4-006b0719f0a5\") " pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:16 crc kubenswrapper[4778]: I0318 10:59:16.035076 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m8r2\" (UniqueName: \"kubernetes.io/projected/752de958-6cfc-4ceb-84c4-006b0719f0a5-kube-api-access-7m8r2\") pod \"redhat-operators-h6pkp\" (UID: \"752de958-6cfc-4ceb-84c4-006b0719f0a5\") " pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:16 crc kubenswrapper[4778]: I0318 10:59:16.035142 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/752de958-6cfc-4ceb-84c4-006b0719f0a5-utilities\") pod \"redhat-operators-h6pkp\" (UID: \"752de958-6cfc-4ceb-84c4-006b0719f0a5\") " pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:16 crc kubenswrapper[4778]: I0318 10:59:16.035264 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/752de958-6cfc-4ceb-84c4-006b0719f0a5-catalog-content\") pod \"redhat-operators-h6pkp\" (UID: \"752de958-6cfc-4ceb-84c4-006b0719f0a5\") " pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:16 crc kubenswrapper[4778]: I0318 10:59:16.035762 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/752de958-6cfc-4ceb-84c4-006b0719f0a5-catalog-content\") pod \"redhat-operators-h6pkp\" (UID: \"752de958-6cfc-4ceb-84c4-006b0719f0a5\") " pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:16 crc kubenswrapper[4778]: I0318 10:59:16.035932 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/752de958-6cfc-4ceb-84c4-006b0719f0a5-utilities\") pod \"redhat-operators-h6pkp\" (UID: \"752de958-6cfc-4ceb-84c4-006b0719f0a5\") " pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:16 crc kubenswrapper[4778]: I0318 10:59:16.054063 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m8r2\" (UniqueName: \"kubernetes.io/projected/752de958-6cfc-4ceb-84c4-006b0719f0a5-kube-api-access-7m8r2\") pod \"redhat-operators-h6pkp\" (UID: \"752de958-6cfc-4ceb-84c4-006b0719f0a5\") " pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:16 crc kubenswrapper[4778]: I0318 10:59:16.117044 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:16 crc kubenswrapper[4778]: I0318 10:59:16.585604 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h6pkp"] Mar 18 10:59:17 crc kubenswrapper[4778]: I0318 10:59:17.434171 4778 generic.go:334] "Generic (PLEG): container finished" podID="752de958-6cfc-4ceb-84c4-006b0719f0a5" containerID="8e81384ea728363bf375bcc6f86caa963e16a86b3d06d40f2b481a212d4210a5" exitCode=0 Mar 18 10:59:17 crc kubenswrapper[4778]: I0318 10:59:17.434229 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6pkp" event={"ID":"752de958-6cfc-4ceb-84c4-006b0719f0a5","Type":"ContainerDied","Data":"8e81384ea728363bf375bcc6f86caa963e16a86b3d06d40f2b481a212d4210a5"} Mar 18 10:59:17 crc kubenswrapper[4778]: I0318 10:59:17.434538 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6pkp" event={"ID":"752de958-6cfc-4ceb-84c4-006b0719f0a5","Type":"ContainerStarted","Data":"1a5621a90cc4698a3bf1daf5ec403ed97712aae84421056cd21a797fac8ef3b4"} Mar 18 10:59:19 crc kubenswrapper[4778]: I0318 10:59:19.457929 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6pkp" event={"ID":"752de958-6cfc-4ceb-84c4-006b0719f0a5","Type":"ContainerStarted","Data":"f2b89ef170461a19eee69decd508c3f38789f2f7d76893da1a4d25bb6b52ed2e"} Mar 18 10:59:21 crc kubenswrapper[4778]: I0318 10:59:21.479113 4778 generic.go:334] "Generic (PLEG): container finished" podID="752de958-6cfc-4ceb-84c4-006b0719f0a5" containerID="f2b89ef170461a19eee69decd508c3f38789f2f7d76893da1a4d25bb6b52ed2e" exitCode=0 Mar 18 10:59:21 crc kubenswrapper[4778]: I0318 10:59:21.479175 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6pkp" 
event={"ID":"752de958-6cfc-4ceb-84c4-006b0719f0a5","Type":"ContainerDied","Data":"f2b89ef170461a19eee69decd508c3f38789f2f7d76893da1a4d25bb6b52ed2e"} Mar 18 10:59:22 crc kubenswrapper[4778]: I0318 10:59:22.493108 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6pkp" event={"ID":"752de958-6cfc-4ceb-84c4-006b0719f0a5","Type":"ContainerStarted","Data":"9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd"} Mar 18 10:59:22 crc kubenswrapper[4778]: I0318 10:59:22.520679 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h6pkp" podStartSLOduration=2.755921678 podStartE2EDuration="7.520658249s" podCreationTimestamp="2026-03-18 10:59:15 +0000 UTC" firstStartedPulling="2026-03-18 10:59:17.436051339 +0000 UTC m=+7024.010796179" lastFinishedPulling="2026-03-18 10:59:22.2007879 +0000 UTC m=+7028.775532750" observedRunningTime="2026-03-18 10:59:22.512519238 +0000 UTC m=+7029.087264088" watchObservedRunningTime="2026-03-18 10:59:22.520658249 +0000 UTC m=+7029.095403089" Mar 18 10:59:25 crc kubenswrapper[4778]: I0318 10:59:25.864857 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest"] Mar 18 10:59:25 crc kubenswrapper[4778]: I0318 10:59:25.866649 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Mar 18 10:59:25 crc kubenswrapper[4778]: I0318 10:59:25.888266 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest"] Mar 18 10:59:25 crc kubenswrapper[4778]: I0318 10:59:25.936743 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"3db5e33d-384f-4df3-bfb8-ba279b83f7e4\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Mar 18 10:59:25 crc kubenswrapper[4778]: I0318 10:59:25.936827 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9r69\" (UniqueName: \"kubernetes.io/projected/3db5e33d-384f-4df3-bfb8-ba279b83f7e4-kube-api-access-w9r69\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"3db5e33d-384f-4df3-bfb8-ba279b83f7e4\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Mar 18 10:59:26 crc kubenswrapper[4778]: I0318 10:59:26.038527 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9r69\" (UniqueName: \"kubernetes.io/projected/3db5e33d-384f-4df3-bfb8-ba279b83f7e4-kube-api-access-w9r69\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"3db5e33d-384f-4df3-bfb8-ba279b83f7e4\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Mar 18 10:59:26 crc kubenswrapper[4778]: I0318 10:59:26.038726 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod 
\"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"3db5e33d-384f-4df3-bfb8-ba279b83f7e4\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Mar 18 10:59:26 crc kubenswrapper[4778]: I0318 10:59:26.039164 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"3db5e33d-384f-4df3-bfb8-ba279b83f7e4\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Mar 18 10:59:26 crc kubenswrapper[4778]: I0318 10:59:26.057433 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9r69\" (UniqueName: \"kubernetes.io/projected/3db5e33d-384f-4df3-bfb8-ba279b83f7e4-kube-api-access-w9r69\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"3db5e33d-384f-4df3-bfb8-ba279b83f7e4\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Mar 18 10:59:26 crc kubenswrapper[4778]: I0318 10:59:26.069467 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"3db5e33d-384f-4df3-bfb8-ba279b83f7e4\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Mar 18 10:59:26 crc kubenswrapper[4778]: I0318 10:59:26.121448 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:26 crc kubenswrapper[4778]: I0318 10:59:26.121496 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:26 crc kubenswrapper[4778]: I0318 
10:59:26.211545 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Mar 18 10:59:26 crc kubenswrapper[4778]: E0318 10:59:26.212004 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 10:59:26 crc kubenswrapper[4778]: I0318 10:59:26.681260 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest"] Mar 18 10:59:26 crc kubenswrapper[4778]: W0318 10:59:26.688733 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3db5e33d_384f_4df3_bfb8_ba279b83f7e4.slice/crio-681981faadfade7153f4a024a87d8045784570ab4504a46dfdbd1ca36458b970 WatchSource:0}: Error finding container 681981faadfade7153f4a024a87d8045784570ab4504a46dfdbd1ca36458b970: Status 404 returned error can't find the container with id 681981faadfade7153f4a024a87d8045784570ab4504a46dfdbd1ca36458b970 Mar 18 10:59:26 crc kubenswrapper[4778]: E0318 10:59:26.690245 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 10:59:27 crc kubenswrapper[4778]: E0318 10:59:27.121501 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 10:59:27 crc kubenswrapper[4778]: I0318 10:59:27.171946 4778 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-h6pkp" podUID="752de958-6cfc-4ceb-84c4-006b0719f0a5" containerName="registry-server" probeResult="failure" output=< Mar 18 10:59:27 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 10:59:27 crc kubenswrapper[4778]: > Mar 18 10:59:27 crc kubenswrapper[4778]: I0318 10:59:27.561923 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" event={"ID":"3db5e33d-384f-4df3-bfb8-ba279b83f7e4","Type":"ContainerStarted","Data":"937f8a9872fea869494a13c13092a7d2831b1396fbc8d1a4968641e7cbe150fc"} Mar 18 10:59:27 crc kubenswrapper[4778]: I0318 10:59:27.562338 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" event={"ID":"3db5e33d-384f-4df3-bfb8-ba279b83f7e4","Type":"ContainerStarted","Data":"681981faadfade7153f4a024a87d8045784570ab4504a46dfdbd1ca36458b970"} Mar 18 10:59:27 crc kubenswrapper[4778]: E0318 10:59:27.562582 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 10:59:27 crc kubenswrapper[4778]: I0318 10:59:27.577967 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" podStartSLOduration=2.148187717 podStartE2EDuration="2.577947737s" podCreationTimestamp="2026-03-18 10:59:25 +0000 UTC" firstStartedPulling="2026-03-18 10:59:26.691572928 +0000 UTC m=+7033.266317808" lastFinishedPulling="2026-03-18 10:59:27.121332968 +0000 UTC m=+7033.696077828" observedRunningTime="2026-03-18 10:59:27.57436956 +0000 UTC m=+7034.149114400" watchObservedRunningTime="2026-03-18 10:59:27.577947737 +0000 UTC m=+7034.152692597" Mar 18 10:59:28 crc 
kubenswrapper[4778]: E0318 10:59:28.570111 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 10:59:36 crc kubenswrapper[4778]: I0318 10:59:36.165977 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:36 crc kubenswrapper[4778]: I0318 10:59:36.218922 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:36 crc kubenswrapper[4778]: I0318 10:59:36.412257 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h6pkp"] Mar 18 10:59:37 crc kubenswrapper[4778]: I0318 10:59:37.653486 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h6pkp" podUID="752de958-6cfc-4ceb-84c4-006b0719f0a5" containerName="registry-server" containerID="cri-o://9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd" gracePeriod=2 Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.186167 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.314728 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/752de958-6cfc-4ceb-84c4-006b0719f0a5-utilities\") pod \"752de958-6cfc-4ceb-84c4-006b0719f0a5\" (UID: \"752de958-6cfc-4ceb-84c4-006b0719f0a5\") " Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.314984 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/752de958-6cfc-4ceb-84c4-006b0719f0a5-catalog-content\") pod \"752de958-6cfc-4ceb-84c4-006b0719f0a5\" (UID: \"752de958-6cfc-4ceb-84c4-006b0719f0a5\") " Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.315032 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m8r2\" (UniqueName: \"kubernetes.io/projected/752de958-6cfc-4ceb-84c4-006b0719f0a5-kube-api-access-7m8r2\") pod \"752de958-6cfc-4ceb-84c4-006b0719f0a5\" (UID: \"752de958-6cfc-4ceb-84c4-006b0719f0a5\") " Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.316575 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/752de958-6cfc-4ceb-84c4-006b0719f0a5-utilities" (OuterVolumeSpecName: "utilities") pod "752de958-6cfc-4ceb-84c4-006b0719f0a5" (UID: "752de958-6cfc-4ceb-84c4-006b0719f0a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.320211 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/752de958-6cfc-4ceb-84c4-006b0719f0a5-kube-api-access-7m8r2" (OuterVolumeSpecName: "kube-api-access-7m8r2") pod "752de958-6cfc-4ceb-84c4-006b0719f0a5" (UID: "752de958-6cfc-4ceb-84c4-006b0719f0a5"). InnerVolumeSpecName "kube-api-access-7m8r2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.417382 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/752de958-6cfc-4ceb-84c4-006b0719f0a5-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.417744 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m8r2\" (UniqueName: \"kubernetes.io/projected/752de958-6cfc-4ceb-84c4-006b0719f0a5-kube-api-access-7m8r2\") on node \"crc\" DevicePath \"\"" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.456573 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/752de958-6cfc-4ceb-84c4-006b0719f0a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "752de958-6cfc-4ceb-84c4-006b0719f0a5" (UID: "752de958-6cfc-4ceb-84c4-006b0719f0a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.519360 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/752de958-6cfc-4ceb-84c4-006b0719f0a5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.664883 4778 generic.go:334] "Generic (PLEG): container finished" podID="752de958-6cfc-4ceb-84c4-006b0719f0a5" containerID="9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd" exitCode=0 Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.664931 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6pkp" event={"ID":"752de958-6cfc-4ceb-84c4-006b0719f0a5","Type":"ContainerDied","Data":"9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd"} Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.664962 4778 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-h6pkp" event={"ID":"752de958-6cfc-4ceb-84c4-006b0719f0a5","Type":"ContainerDied","Data":"1a5621a90cc4698a3bf1daf5ec403ed97712aae84421056cd21a797fac8ef3b4"} Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.664981 4778 scope.go:117] "RemoveContainer" containerID="9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.665125 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h6pkp" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.707180 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h6pkp"] Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.710030 4778 scope.go:117] "RemoveContainer" containerID="f2b89ef170461a19eee69decd508c3f38789f2f7d76893da1a4d25bb6b52ed2e" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.714800 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h6pkp"] Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.736390 4778 scope.go:117] "RemoveContainer" containerID="8e81384ea728363bf375bcc6f86caa963e16a86b3d06d40f2b481a212d4210a5" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.809259 4778 scope.go:117] "RemoveContainer" containerID="9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd" Mar 18 10:59:38 crc kubenswrapper[4778]: E0318 10:59:38.809903 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd\": container with ID starting with 9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd not found: ID does not exist" containerID="9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.810136 4778 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd"} err="failed to get container status \"9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd\": rpc error: code = NotFound desc = could not find container \"9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd\": container with ID starting with 9b3355da603270f254ae886f1483b64896d524ac863a042e88f1da17e068dfcd not found: ID does not exist" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.810178 4778 scope.go:117] "RemoveContainer" containerID="f2b89ef170461a19eee69decd508c3f38789f2f7d76893da1a4d25bb6b52ed2e" Mar 18 10:59:38 crc kubenswrapper[4778]: E0318 10:59:38.810790 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2b89ef170461a19eee69decd508c3f38789f2f7d76893da1a4d25bb6b52ed2e\": container with ID starting with f2b89ef170461a19eee69decd508c3f38789f2f7d76893da1a4d25bb6b52ed2e not found: ID does not exist" containerID="f2b89ef170461a19eee69decd508c3f38789f2f7d76893da1a4d25bb6b52ed2e" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.810816 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2b89ef170461a19eee69decd508c3f38789f2f7d76893da1a4d25bb6b52ed2e"} err="failed to get container status \"f2b89ef170461a19eee69decd508c3f38789f2f7d76893da1a4d25bb6b52ed2e\": rpc error: code = NotFound desc = could not find container \"f2b89ef170461a19eee69decd508c3f38789f2f7d76893da1a4d25bb6b52ed2e\": container with ID starting with f2b89ef170461a19eee69decd508c3f38789f2f7d76893da1a4d25bb6b52ed2e not found: ID does not exist" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.810831 4778 scope.go:117] "RemoveContainer" containerID="8e81384ea728363bf375bcc6f86caa963e16a86b3d06d40f2b481a212d4210a5" Mar 18 10:59:38 crc kubenswrapper[4778]: E0318 
10:59:38.811274 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e81384ea728363bf375bcc6f86caa963e16a86b3d06d40f2b481a212d4210a5\": container with ID starting with 8e81384ea728363bf375bcc6f86caa963e16a86b3d06d40f2b481a212d4210a5 not found: ID does not exist" containerID="8e81384ea728363bf375bcc6f86caa963e16a86b3d06d40f2b481a212d4210a5" Mar 18 10:59:38 crc kubenswrapper[4778]: I0318 10:59:38.811320 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e81384ea728363bf375bcc6f86caa963e16a86b3d06d40f2b481a212d4210a5"} err="failed to get container status \"8e81384ea728363bf375bcc6f86caa963e16a86b3d06d40f2b481a212d4210a5\": rpc error: code = NotFound desc = could not find container \"8e81384ea728363bf375bcc6f86caa963e16a86b3d06d40f2b481a212d4210a5\": container with ID starting with 8e81384ea728363bf375bcc6f86caa963e16a86b3d06d40f2b481a212d4210a5 not found: ID does not exist" Mar 18 10:59:40 crc kubenswrapper[4778]: I0318 10:59:40.209721 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="752de958-6cfc-4ceb-84c4-006b0719f0a5" path="/var/lib/kubelet/pods/752de958-6cfc-4ceb-84c4-006b0719f0a5/volumes" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.723972 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n6w9k/must-gather-5mjwn"] Mar 18 10:59:51 crc kubenswrapper[4778]: E0318 10:59:51.724997 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="752de958-6cfc-4ceb-84c4-006b0719f0a5" containerName="extract-content" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.725009 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="752de958-6cfc-4ceb-84c4-006b0719f0a5" containerName="extract-content" Mar 18 10:59:51 crc kubenswrapper[4778]: E0318 10:59:51.725028 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="752de958-6cfc-4ceb-84c4-006b0719f0a5" 
containerName="extract-utilities" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.725035 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="752de958-6cfc-4ceb-84c4-006b0719f0a5" containerName="extract-utilities" Mar 18 10:59:51 crc kubenswrapper[4778]: E0318 10:59:51.725048 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="752de958-6cfc-4ceb-84c4-006b0719f0a5" containerName="registry-server" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.725054 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="752de958-6cfc-4ceb-84c4-006b0719f0a5" containerName="registry-server" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.725317 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="752de958-6cfc-4ceb-84c4-006b0719f0a5" containerName="registry-server" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.726501 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n6w9k/must-gather-5mjwn" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.728726 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n6w9k"/"openshift-service-ca.crt" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.729047 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-n6w9k"/"default-dockercfg-88grs" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.732401 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n6w9k/must-gather-5mjwn"] Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.736466 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n6w9k"/"kube-root-ca.crt" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.840939 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zvb6\" (UniqueName: 
\"kubernetes.io/projected/2c2e2094-7c48-4653-8b53-95483d470344-kube-api-access-6zvb6\") pod \"must-gather-5mjwn\" (UID: \"2c2e2094-7c48-4653-8b53-95483d470344\") " pod="openshift-must-gather-n6w9k/must-gather-5mjwn" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.841020 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2c2e2094-7c48-4653-8b53-95483d470344-must-gather-output\") pod \"must-gather-5mjwn\" (UID: \"2c2e2094-7c48-4653-8b53-95483d470344\") " pod="openshift-must-gather-n6w9k/must-gather-5mjwn" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.943079 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zvb6\" (UniqueName: \"kubernetes.io/projected/2c2e2094-7c48-4653-8b53-95483d470344-kube-api-access-6zvb6\") pod \"must-gather-5mjwn\" (UID: \"2c2e2094-7c48-4653-8b53-95483d470344\") " pod="openshift-must-gather-n6w9k/must-gather-5mjwn" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.943513 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2c2e2094-7c48-4653-8b53-95483d470344-must-gather-output\") pod \"must-gather-5mjwn\" (UID: \"2c2e2094-7c48-4653-8b53-95483d470344\") " pod="openshift-must-gather-n6w9k/must-gather-5mjwn" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.944416 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2c2e2094-7c48-4653-8b53-95483d470344-must-gather-output\") pod \"must-gather-5mjwn\" (UID: \"2c2e2094-7c48-4653-8b53-95483d470344\") " pod="openshift-must-gather-n6w9k/must-gather-5mjwn" Mar 18 10:59:51 crc kubenswrapper[4778]: I0318 10:59:51.973489 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zvb6\" (UniqueName: 
\"kubernetes.io/projected/2c2e2094-7c48-4653-8b53-95483d470344-kube-api-access-6zvb6\") pod \"must-gather-5mjwn\" (UID: \"2c2e2094-7c48-4653-8b53-95483d470344\") " pod="openshift-must-gather-n6w9k/must-gather-5mjwn" Mar 18 10:59:52 crc kubenswrapper[4778]: I0318 10:59:52.046005 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n6w9k/must-gather-5mjwn" Mar 18 10:59:52 crc kubenswrapper[4778]: I0318 10:59:52.598835 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n6w9k/must-gather-5mjwn"] Mar 18 10:59:52 crc kubenswrapper[4778]: I0318 10:59:52.853531 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6w9k/must-gather-5mjwn" event={"ID":"2c2e2094-7c48-4653-8b53-95483d470344","Type":"ContainerStarted","Data":"11e1f6a8bb07bab9c0aacab6604201e86370f0dcd8ed040feff79260d58c73a3"} Mar 18 10:59:59 crc kubenswrapper[4778]: I0318 10:59:59.909350 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6w9k/must-gather-5mjwn" event={"ID":"2c2e2094-7c48-4653-8b53-95483d470344","Type":"ContainerStarted","Data":"e84a365d7048286a12c936f158d969a4ed07b93d0ced0643d0419cb198df5cec"} Mar 18 10:59:59 crc kubenswrapper[4778]: I0318 10:59:59.910008 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6w9k/must-gather-5mjwn" event={"ID":"2c2e2094-7c48-4653-8b53-95483d470344","Type":"ContainerStarted","Data":"8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa"} Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.146361 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n6w9k/must-gather-5mjwn" podStartSLOduration=2.85881927 podStartE2EDuration="9.14634151s" podCreationTimestamp="2026-03-18 10:59:51 +0000 UTC" firstStartedPulling="2026-03-18 10:59:52.594251759 +0000 UTC m=+7059.168996599" lastFinishedPulling="2026-03-18 10:59:58.881773999 +0000 UTC 
m=+7065.456518839" observedRunningTime="2026-03-18 10:59:59.926521926 +0000 UTC m=+7066.501266766" watchObservedRunningTime="2026-03-18 11:00:00.14634151 +0000 UTC m=+7066.721086350" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.150732 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563860-9p79f"] Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.152682 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563860-9p79f" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.155806 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.155837 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.156872 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.160978 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr"] Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.162683 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.164389 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.165440 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.181034 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr"] Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.216333 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a112dd3e-72a0-48ea-a69c-448090520236-config-volume\") pod \"collect-profiles-29563860-8l9tr\" (UID: \"a112dd3e-72a0-48ea-a69c-448090520236\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.216405 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a112dd3e-72a0-48ea-a69c-448090520236-secret-volume\") pod \"collect-profiles-29563860-8l9tr\" (UID: \"a112dd3e-72a0-48ea-a69c-448090520236\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.216506 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t76pg\" (UniqueName: \"kubernetes.io/projected/a112dd3e-72a0-48ea-a69c-448090520236-kube-api-access-t76pg\") pod \"collect-profiles-29563860-8l9tr\" (UID: \"a112dd3e-72a0-48ea-a69c-448090520236\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.216547 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c2v2\" (UniqueName: \"kubernetes.io/projected/9bbe37de-66b2-4c42-a72f-92155eb2edb9-kube-api-access-9c2v2\") pod \"auto-csr-approver-29563860-9p79f\" (UID: \"9bbe37de-66b2-4c42-a72f-92155eb2edb9\") " pod="openshift-infra/auto-csr-approver-29563860-9p79f" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.266761 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563860-9p79f"] Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.317978 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a112dd3e-72a0-48ea-a69c-448090520236-secret-volume\") pod \"collect-profiles-29563860-8l9tr\" (UID: \"a112dd3e-72a0-48ea-a69c-448090520236\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.318135 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t76pg\" (UniqueName: \"kubernetes.io/projected/a112dd3e-72a0-48ea-a69c-448090520236-kube-api-access-t76pg\") pod \"collect-profiles-29563860-8l9tr\" (UID: \"a112dd3e-72a0-48ea-a69c-448090520236\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.318209 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c2v2\" (UniqueName: \"kubernetes.io/projected/9bbe37de-66b2-4c42-a72f-92155eb2edb9-kube-api-access-9c2v2\") pod \"auto-csr-approver-29563860-9p79f\" (UID: \"9bbe37de-66b2-4c42-a72f-92155eb2edb9\") " pod="openshift-infra/auto-csr-approver-29563860-9p79f" Mar 18 11:00:00 crc kubenswrapper[4778]: 
I0318 11:00:00.318330 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a112dd3e-72a0-48ea-a69c-448090520236-config-volume\") pod \"collect-profiles-29563860-8l9tr\" (UID: \"a112dd3e-72a0-48ea-a69c-448090520236\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.319476 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a112dd3e-72a0-48ea-a69c-448090520236-config-volume\") pod \"collect-profiles-29563860-8l9tr\" (UID: \"a112dd3e-72a0-48ea-a69c-448090520236\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.330909 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a112dd3e-72a0-48ea-a69c-448090520236-secret-volume\") pod \"collect-profiles-29563860-8l9tr\" (UID: \"a112dd3e-72a0-48ea-a69c-448090520236\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.334038 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t76pg\" (UniqueName: \"kubernetes.io/projected/a112dd3e-72a0-48ea-a69c-448090520236-kube-api-access-t76pg\") pod \"collect-profiles-29563860-8l9tr\" (UID: \"a112dd3e-72a0-48ea-a69c-448090520236\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.334992 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c2v2\" (UniqueName: \"kubernetes.io/projected/9bbe37de-66b2-4c42-a72f-92155eb2edb9-kube-api-access-9c2v2\") pod \"auto-csr-approver-29563860-9p79f\" (UID: \"9bbe37de-66b2-4c42-a72f-92155eb2edb9\") " 
pod="openshift-infra/auto-csr-approver-29563860-9p79f" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.480253 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563860-9p79f" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.508593 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:00 crc kubenswrapper[4778]: I0318 11:00:00.965124 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563860-9p79f"] Mar 18 11:00:01 crc kubenswrapper[4778]: I0318 11:00:01.053384 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr"] Mar 18 11:00:01 crc kubenswrapper[4778]: W0318 11:00:01.053634 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda112dd3e_72a0_48ea_a69c_448090520236.slice/crio-a8cfd153fe3e22eb43dc06184206855dd94fe8c85cc237bd5028b0ea5da00d1b WatchSource:0}: Error finding container a8cfd153fe3e22eb43dc06184206855dd94fe8c85cc237bd5028b0ea5da00d1b: Status 404 returned error can't find the container with id a8cfd153fe3e22eb43dc06184206855dd94fe8c85cc237bd5028b0ea5da00d1b Mar 18 11:00:01 crc kubenswrapper[4778]: I0318 11:00:01.932367 4778 generic.go:334] "Generic (PLEG): container finished" podID="a112dd3e-72a0-48ea-a69c-448090520236" containerID="8bd79fe1dda5f124cb0c95449837cc849410b83946a1ead7147f36344ba1810d" exitCode=0 Mar 18 11:00:01 crc kubenswrapper[4778]: I0318 11:00:01.932487 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" event={"ID":"a112dd3e-72a0-48ea-a69c-448090520236","Type":"ContainerDied","Data":"8bd79fe1dda5f124cb0c95449837cc849410b83946a1ead7147f36344ba1810d"} Mar 18 11:00:01 crc 
kubenswrapper[4778]: I0318 11:00:01.932818 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" event={"ID":"a112dd3e-72a0-48ea-a69c-448090520236","Type":"ContainerStarted","Data":"a8cfd153fe3e22eb43dc06184206855dd94fe8c85cc237bd5028b0ea5da00d1b"} Mar 18 11:00:01 crc kubenswrapper[4778]: I0318 11:00:01.934018 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563860-9p79f" event={"ID":"9bbe37de-66b2-4c42-a72f-92155eb2edb9","Type":"ContainerStarted","Data":"8afcfa7ffbe12d4a317cb693519716b9c3977186844e3596a5ac00ca0f3c4061"} Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 11:00:03.364569 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 11:00:03.381039 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t76pg\" (UniqueName: \"kubernetes.io/projected/a112dd3e-72a0-48ea-a69c-448090520236-kube-api-access-t76pg\") pod \"a112dd3e-72a0-48ea-a69c-448090520236\" (UID: \"a112dd3e-72a0-48ea-a69c-448090520236\") " Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 11:00:03.381437 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a112dd3e-72a0-48ea-a69c-448090520236-config-volume\") pod \"a112dd3e-72a0-48ea-a69c-448090520236\" (UID: \"a112dd3e-72a0-48ea-a69c-448090520236\") " Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 11:00:03.381501 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a112dd3e-72a0-48ea-a69c-448090520236-secret-volume\") pod \"a112dd3e-72a0-48ea-a69c-448090520236\" (UID: \"a112dd3e-72a0-48ea-a69c-448090520236\") " Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 
11:00:03.382728 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a112dd3e-72a0-48ea-a69c-448090520236-config-volume" (OuterVolumeSpecName: "config-volume") pod "a112dd3e-72a0-48ea-a69c-448090520236" (UID: "a112dd3e-72a0-48ea-a69c-448090520236"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 11:00:03.390110 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a112dd3e-72a0-48ea-a69c-448090520236-kube-api-access-t76pg" (OuterVolumeSpecName: "kube-api-access-t76pg") pod "a112dd3e-72a0-48ea-a69c-448090520236" (UID: "a112dd3e-72a0-48ea-a69c-448090520236"). InnerVolumeSpecName "kube-api-access-t76pg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 11:00:03.391273 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a112dd3e-72a0-48ea-a69c-448090520236-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a112dd3e-72a0-48ea-a69c-448090520236" (UID: "a112dd3e-72a0-48ea-a69c-448090520236"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 11:00:03.487922 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t76pg\" (UniqueName: \"kubernetes.io/projected/a112dd3e-72a0-48ea-a69c-448090520236-kube-api-access-t76pg\") on node \"crc\" DevicePath \"\"" Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 11:00:03.487974 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a112dd3e-72a0-48ea-a69c-448090520236-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 11:00:03.487992 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a112dd3e-72a0-48ea-a69c-448090520236-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 11:00:03.954933 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" event={"ID":"a112dd3e-72a0-48ea-a69c-448090520236","Type":"ContainerDied","Data":"a8cfd153fe3e22eb43dc06184206855dd94fe8c85cc237bd5028b0ea5da00d1b"} Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 11:00:03.955239 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8cfd153fe3e22eb43dc06184206855dd94fe8c85cc237bd5028b0ea5da00d1b" Mar 18 11:00:03 crc kubenswrapper[4778]: I0318 11:00:03.955016 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-8l9tr" Mar 18 11:00:04 crc kubenswrapper[4778]: I0318 11:00:04.432237 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf"] Mar 18 11:00:04 crc kubenswrapper[4778]: I0318 11:00:04.441688 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563815-xdqjf"] Mar 18 11:00:04 crc kubenswrapper[4778]: I0318 11:00:04.871127 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n6w9k/crc-debug-cmwv6"] Mar 18 11:00:04 crc kubenswrapper[4778]: E0318 11:00:04.871615 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a112dd3e-72a0-48ea-a69c-448090520236" containerName="collect-profiles" Mar 18 11:00:04 crc kubenswrapper[4778]: I0318 11:00:04.871631 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a112dd3e-72a0-48ea-a69c-448090520236" containerName="collect-profiles" Mar 18 11:00:04 crc kubenswrapper[4778]: I0318 11:00:04.871800 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="a112dd3e-72a0-48ea-a69c-448090520236" containerName="collect-profiles" Mar 18 11:00:04 crc kubenswrapper[4778]: I0318 11:00:04.872411 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" Mar 18 11:00:05 crc kubenswrapper[4778]: I0318 11:00:05.021748 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c15df1c1-2c25-4e82-9933-ada0bd8d6d73-host\") pod \"crc-debug-cmwv6\" (UID: \"c15df1c1-2c25-4e82-9933-ada0bd8d6d73\") " pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" Mar 18 11:00:05 crc kubenswrapper[4778]: I0318 11:00:05.021810 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbgjp\" (UniqueName: \"kubernetes.io/projected/c15df1c1-2c25-4e82-9933-ada0bd8d6d73-kube-api-access-dbgjp\") pod \"crc-debug-cmwv6\" (UID: \"c15df1c1-2c25-4e82-9933-ada0bd8d6d73\") " pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" Mar 18 11:00:05 crc kubenswrapper[4778]: I0318 11:00:05.123740 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c15df1c1-2c25-4e82-9933-ada0bd8d6d73-host\") pod \"crc-debug-cmwv6\" (UID: \"c15df1c1-2c25-4e82-9933-ada0bd8d6d73\") " pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" Mar 18 11:00:05 crc kubenswrapper[4778]: I0318 11:00:05.124085 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbgjp\" (UniqueName: \"kubernetes.io/projected/c15df1c1-2c25-4e82-9933-ada0bd8d6d73-kube-api-access-dbgjp\") pod \"crc-debug-cmwv6\" (UID: \"c15df1c1-2c25-4e82-9933-ada0bd8d6d73\") " pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" Mar 18 11:00:05 crc kubenswrapper[4778]: I0318 11:00:05.124574 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c15df1c1-2c25-4e82-9933-ada0bd8d6d73-host\") pod \"crc-debug-cmwv6\" (UID: \"c15df1c1-2c25-4e82-9933-ada0bd8d6d73\") " pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" Mar 18 11:00:05 crc 
kubenswrapper[4778]: I0318 11:00:05.144852 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbgjp\" (UniqueName: \"kubernetes.io/projected/c15df1c1-2c25-4e82-9933-ada0bd8d6d73-kube-api-access-dbgjp\") pod \"crc-debug-cmwv6\" (UID: \"c15df1c1-2c25-4e82-9933-ada0bd8d6d73\") " pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" Mar 18 11:00:05 crc kubenswrapper[4778]: I0318 11:00:05.193403 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" Mar 18 11:00:05 crc kubenswrapper[4778]: I0318 11:00:05.972377 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" event={"ID":"c15df1c1-2c25-4e82-9933-ada0bd8d6d73","Type":"ContainerStarted","Data":"7b962272cb27c5f171848527e65fc41e0d889940197edbf9741702f400418883"} Mar 18 11:00:06 crc kubenswrapper[4778]: I0318 11:00:06.203328 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a12e64d-d433-4f42-8aa6-cd1de264b346" path="/var/lib/kubelet/pods/1a12e64d-d433-4f42-8aa6-cd1de264b346/volumes" Mar 18 11:00:18 crc kubenswrapper[4778]: I0318 11:00:18.112228 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" event={"ID":"c15df1c1-2c25-4e82-9933-ada0bd8d6d73","Type":"ContainerStarted","Data":"3fd6d1f0823c046b310eb8beb463813250215ced92cb6e72ad8093250484ff70"} Mar 18 11:00:18 crc kubenswrapper[4778]: I0318 11:00:18.142482 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" podStartSLOduration=2.478719784 podStartE2EDuration="14.142464285s" podCreationTimestamp="2026-03-18 11:00:04 +0000 UTC" firstStartedPulling="2026-03-18 11:00:05.267958393 +0000 UTC m=+7071.842703233" lastFinishedPulling="2026-03-18 11:00:16.931702894 +0000 UTC m=+7083.506447734" observedRunningTime="2026-03-18 11:00:18.134036016 +0000 UTC m=+7084.708780886" 
watchObservedRunningTime="2026-03-18 11:00:18.142464285 +0000 UTC m=+7084.717209125" Mar 18 11:00:19 crc kubenswrapper[4778]: I0318 11:00:19.122116 4778 generic.go:334] "Generic (PLEG): container finished" podID="9bbe37de-66b2-4c42-a72f-92155eb2edb9" containerID="1bb8821bd2c18d4bb5e7f9c4c0784d606dc27180e5e74bcaf381cd0d404e43fd" exitCode=0 Mar 18 11:00:19 crc kubenswrapper[4778]: I0318 11:00:19.122165 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563860-9p79f" event={"ID":"9bbe37de-66b2-4c42-a72f-92155eb2edb9","Type":"ContainerDied","Data":"1bb8821bd2c18d4bb5e7f9c4c0784d606dc27180e5e74bcaf381cd0d404e43fd"} Mar 18 11:00:20 crc kubenswrapper[4778]: I0318 11:00:20.455010 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563860-9p79f" Mar 18 11:00:20 crc kubenswrapper[4778]: I0318 11:00:20.643473 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c2v2\" (UniqueName: \"kubernetes.io/projected/9bbe37de-66b2-4c42-a72f-92155eb2edb9-kube-api-access-9c2v2\") pod \"9bbe37de-66b2-4c42-a72f-92155eb2edb9\" (UID: \"9bbe37de-66b2-4c42-a72f-92155eb2edb9\") " Mar 18 11:00:20 crc kubenswrapper[4778]: I0318 11:00:20.653112 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bbe37de-66b2-4c42-a72f-92155eb2edb9-kube-api-access-9c2v2" (OuterVolumeSpecName: "kube-api-access-9c2v2") pod "9bbe37de-66b2-4c42-a72f-92155eb2edb9" (UID: "9bbe37de-66b2-4c42-a72f-92155eb2edb9"). InnerVolumeSpecName "kube-api-access-9c2v2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:00:20 crc kubenswrapper[4778]: I0318 11:00:20.746704 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9c2v2\" (UniqueName: \"kubernetes.io/projected/9bbe37de-66b2-4c42-a72f-92155eb2edb9-kube-api-access-9c2v2\") on node \"crc\" DevicePath \"\"" Mar 18 11:00:21 crc kubenswrapper[4778]: I0318 11:00:21.140658 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563860-9p79f" event={"ID":"9bbe37de-66b2-4c42-a72f-92155eb2edb9","Type":"ContainerDied","Data":"8afcfa7ffbe12d4a317cb693519716b9c3977186844e3596a5ac00ca0f3c4061"} Mar 18 11:00:21 crc kubenswrapper[4778]: I0318 11:00:21.140934 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8afcfa7ffbe12d4a317cb693519716b9c3977186844e3596a5ac00ca0f3c4061" Mar 18 11:00:21 crc kubenswrapper[4778]: I0318 11:00:21.140716 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563860-9p79f" Mar 18 11:00:21 crc kubenswrapper[4778]: I0318 11:00:21.528136 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563854-qqn9z"] Mar 18 11:00:21 crc kubenswrapper[4778]: I0318 11:00:21.537106 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563854-qqn9z"] Mar 18 11:00:22 crc kubenswrapper[4778]: I0318 11:00:22.197390 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6103ea7-c41a-40d2-ae16-15f066c955b9" path="/var/lib/kubelet/pods/f6103ea7-c41a-40d2-ae16-15f066c955b9/volumes" Mar 18 11:00:37 crc kubenswrapper[4778]: I0318 11:00:37.733562 4778 scope.go:117] "RemoveContainer" containerID="20d0876a2852471421fd6830f32cec9b8955b6abdc480edeb7e2a46c81a72c97" Mar 18 11:00:37 crc kubenswrapper[4778]: I0318 11:00:37.782820 4778 scope.go:117] "RemoveContainer" 
containerID="4c308b5ed19066acb80d31a8263c3f25bb04c0935256cdfa497ae0b275b40ad3" Mar 18 11:00:42 crc kubenswrapper[4778]: E0318 11:00:42.187774 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.147237 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.148928 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.154217 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29563861-czdsg"] Mar 18 11:01:00 crc kubenswrapper[4778]: E0318 11:01:00.154654 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bbe37de-66b2-4c42-a72f-92155eb2edb9" containerName="oc" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.154676 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bbe37de-66b2-4c42-a72f-92155eb2edb9" containerName="oc" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.154927 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bbe37de-66b2-4c42-a72f-92155eb2edb9" containerName="oc" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.155687 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.163999 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29563861-czdsg"] Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.337479 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbgmk\" (UniqueName: \"kubernetes.io/projected/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-kube-api-access-sbgmk\") pod \"keystone-cron-29563861-czdsg\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.337581 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-combined-ca-bundle\") pod \"keystone-cron-29563861-czdsg\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.337707 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-fernet-keys\") pod \"keystone-cron-29563861-czdsg\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.337735 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-config-data\") pod \"keystone-cron-29563861-czdsg\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.439066 4778 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-fernet-keys\") pod \"keystone-cron-29563861-czdsg\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.439118 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-config-data\") pod \"keystone-cron-29563861-czdsg\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.439192 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbgmk\" (UniqueName: \"kubernetes.io/projected/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-kube-api-access-sbgmk\") pod \"keystone-cron-29563861-czdsg\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.439243 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-combined-ca-bundle\") pod \"keystone-cron-29563861-czdsg\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.444834 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-fernet-keys\") pod \"keystone-cron-29563861-czdsg\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.445190 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-combined-ca-bundle\") pod \"keystone-cron-29563861-czdsg\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.446649 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-config-data\") pod \"keystone-cron-29563861-czdsg\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.464531 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbgmk\" (UniqueName: \"kubernetes.io/projected/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-kube-api-access-sbgmk\") pod \"keystone-cron-29563861-czdsg\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.475214 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:00 crc kubenswrapper[4778]: I0318 11:01:00.982397 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29563861-czdsg"] Mar 18 11:01:01 crc kubenswrapper[4778]: I0318 11:01:01.546864 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563861-czdsg" event={"ID":"d34b9add-0199-4bf1-81f8-fa4c2a9138e7","Type":"ContainerStarted","Data":"f984a2631cb84423ec49d0a74a80858b922aaf45ecd4a6618c404c384cca758e"} Mar 18 11:01:01 crc kubenswrapper[4778]: I0318 11:01:01.547215 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563861-czdsg" event={"ID":"d34b9add-0199-4bf1-81f8-fa4c2a9138e7","Type":"ContainerStarted","Data":"2316c1f8492c44ecfcb6c8739c1c26728d3712b4d7998cdb3a69536131c4537f"} Mar 18 11:01:01 crc kubenswrapper[4778]: I0318 11:01:01.563288 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29563861-czdsg" podStartSLOduration=1.563268566 podStartE2EDuration="1.563268566s" podCreationTimestamp="2026-03-18 11:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:01:01.558917967 +0000 UTC m=+7128.133662837" watchObservedRunningTime="2026-03-18 11:01:01.563268566 +0000 UTC m=+7128.138013406" Mar 18 11:01:03 crc kubenswrapper[4778]: I0318 11:01:03.565230 4778 generic.go:334] "Generic (PLEG): container finished" podID="c15df1c1-2c25-4e82-9933-ada0bd8d6d73" containerID="3fd6d1f0823c046b310eb8beb463813250215ced92cb6e72ad8093250484ff70" exitCode=0 Mar 18 11:01:03 crc kubenswrapper[4778]: I0318 11:01:03.565335 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" 
event={"ID":"c15df1c1-2c25-4e82-9933-ada0bd8d6d73","Type":"ContainerDied","Data":"3fd6d1f0823c046b310eb8beb463813250215ced92cb6e72ad8093250484ff70"} Mar 18 11:01:04 crc kubenswrapper[4778]: I0318 11:01:04.575052 4778 generic.go:334] "Generic (PLEG): container finished" podID="d34b9add-0199-4bf1-81f8-fa4c2a9138e7" containerID="f984a2631cb84423ec49d0a74a80858b922aaf45ecd4a6618c404c384cca758e" exitCode=0 Mar 18 11:01:04 crc kubenswrapper[4778]: I0318 11:01:04.575235 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563861-czdsg" event={"ID":"d34b9add-0199-4bf1-81f8-fa4c2a9138e7","Type":"ContainerDied","Data":"f984a2631cb84423ec49d0a74a80858b922aaf45ecd4a6618c404c384cca758e"} Mar 18 11:01:04 crc kubenswrapper[4778]: I0318 11:01:04.706414 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" Mar 18 11:01:04 crc kubenswrapper[4778]: I0318 11:01:04.763132 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n6w9k/crc-debug-cmwv6"] Mar 18 11:01:04 crc kubenswrapper[4778]: I0318 11:01:04.775496 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n6w9k/crc-debug-cmwv6"] Mar 18 11:01:04 crc kubenswrapper[4778]: I0318 11:01:04.825490 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c15df1c1-2c25-4e82-9933-ada0bd8d6d73-host\") pod \"c15df1c1-2c25-4e82-9933-ada0bd8d6d73\" (UID: \"c15df1c1-2c25-4e82-9933-ada0bd8d6d73\") " Mar 18 11:01:04 crc kubenswrapper[4778]: I0318 11:01:04.825585 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c15df1c1-2c25-4e82-9933-ada0bd8d6d73-host" (OuterVolumeSpecName: "host") pod "c15df1c1-2c25-4e82-9933-ada0bd8d6d73" (UID: "c15df1c1-2c25-4e82-9933-ada0bd8d6d73"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 11:01:04 crc kubenswrapper[4778]: I0318 11:01:04.825730 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbgjp\" (UniqueName: \"kubernetes.io/projected/c15df1c1-2c25-4e82-9933-ada0bd8d6d73-kube-api-access-dbgjp\") pod \"c15df1c1-2c25-4e82-9933-ada0bd8d6d73\" (UID: \"c15df1c1-2c25-4e82-9933-ada0bd8d6d73\") " Mar 18 11:01:04 crc kubenswrapper[4778]: I0318 11:01:04.826272 4778 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c15df1c1-2c25-4e82-9933-ada0bd8d6d73-host\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:04 crc kubenswrapper[4778]: I0318 11:01:04.831500 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c15df1c1-2c25-4e82-9933-ada0bd8d6d73-kube-api-access-dbgjp" (OuterVolumeSpecName: "kube-api-access-dbgjp") pod "c15df1c1-2c25-4e82-9933-ada0bd8d6d73" (UID: "c15df1c1-2c25-4e82-9933-ada0bd8d6d73"). InnerVolumeSpecName "kube-api-access-dbgjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:01:04 crc kubenswrapper[4778]: I0318 11:01:04.928575 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbgjp\" (UniqueName: \"kubernetes.io/projected/c15df1c1-2c25-4e82-9933-ada0bd8d6d73-kube-api-access-dbgjp\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:05 crc kubenswrapper[4778]: I0318 11:01:05.584949 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b962272cb27c5f171848527e65fc41e0d889940197edbf9741702f400418883" Mar 18 11:01:05 crc kubenswrapper[4778]: I0318 11:01:05.585050 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n6w9k/crc-debug-cmwv6" Mar 18 11:01:05 crc kubenswrapper[4778]: I0318 11:01:05.933662 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:05 crc kubenswrapper[4778]: I0318 11:01:05.987682 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n6w9k/crc-debug-gvcbm"] Mar 18 11:01:05 crc kubenswrapper[4778]: E0318 11:01:05.988189 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c15df1c1-2c25-4e82-9933-ada0bd8d6d73" containerName="container-00" Mar 18 11:01:05 crc kubenswrapper[4778]: I0318 11:01:05.988232 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c15df1c1-2c25-4e82-9933-ada0bd8d6d73" containerName="container-00" Mar 18 11:01:05 crc kubenswrapper[4778]: E0318 11:01:05.988279 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34b9add-0199-4bf1-81f8-fa4c2a9138e7" containerName="keystone-cron" Mar 18 11:01:05 crc kubenswrapper[4778]: I0318 11:01:05.988290 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34b9add-0199-4bf1-81f8-fa4c2a9138e7" containerName="keystone-cron" Mar 18 11:01:05 crc kubenswrapper[4778]: I0318 11:01:05.988509 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c15df1c1-2c25-4e82-9933-ada0bd8d6d73" containerName="container-00" Mar 18 11:01:05 crc kubenswrapper[4778]: I0318 11:01:05.988539 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d34b9add-0199-4bf1-81f8-fa4c2a9138e7" containerName="keystone-cron" Mar 18 11:01:05 crc kubenswrapper[4778]: I0318 11:01:05.989405 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.061293 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-config-data\") pod \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.061349 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-fernet-keys\") pod \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.062369 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-combined-ca-bundle\") pod \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.062420 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbgmk\" (UniqueName: \"kubernetes.io/projected/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-kube-api-access-sbgmk\") pod \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\" (UID: \"d34b9add-0199-4bf1-81f8-fa4c2a9138e7\") " Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.066952 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d34b9add-0199-4bf1-81f8-fa4c2a9138e7" (UID: "d34b9add-0199-4bf1-81f8-fa4c2a9138e7"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.068792 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-kube-api-access-sbgmk" (OuterVolumeSpecName: "kube-api-access-sbgmk") pod "d34b9add-0199-4bf1-81f8-fa4c2a9138e7" (UID: "d34b9add-0199-4bf1-81f8-fa4c2a9138e7"). InnerVolumeSpecName "kube-api-access-sbgmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.098228 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d34b9add-0199-4bf1-81f8-fa4c2a9138e7" (UID: "d34b9add-0199-4bf1-81f8-fa4c2a9138e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.111090 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-config-data" (OuterVolumeSpecName: "config-data") pod "d34b9add-0199-4bf1-81f8-fa4c2a9138e7" (UID: "d34b9add-0199-4bf1-81f8-fa4c2a9138e7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.164711 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpg7b\" (UniqueName: \"kubernetes.io/projected/0f322526-d81e-4a2e-a084-151cb4304b64-kube-api-access-fpg7b\") pod \"crc-debug-gvcbm\" (UID: \"0f322526-d81e-4a2e-a084-151cb4304b64\") " pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.164996 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f322526-d81e-4a2e-a084-151cb4304b64-host\") pod \"crc-debug-gvcbm\" (UID: \"0f322526-d81e-4a2e-a084-151cb4304b64\") " pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.165104 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.165119 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbgmk\" (UniqueName: \"kubernetes.io/projected/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-kube-api-access-sbgmk\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.165145 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.165152 4778 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d34b9add-0199-4bf1-81f8-fa4c2a9138e7-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.206179 4778 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c15df1c1-2c25-4e82-9933-ada0bd8d6d73" path="/var/lib/kubelet/pods/c15df1c1-2c25-4e82-9933-ada0bd8d6d73/volumes" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.266990 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f322526-d81e-4a2e-a084-151cb4304b64-host\") pod \"crc-debug-gvcbm\" (UID: \"0f322526-d81e-4a2e-a084-151cb4304b64\") " pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.267044 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpg7b\" (UniqueName: \"kubernetes.io/projected/0f322526-d81e-4a2e-a084-151cb4304b64-kube-api-access-fpg7b\") pod \"crc-debug-gvcbm\" (UID: \"0f322526-d81e-4a2e-a084-151cb4304b64\") " pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.267186 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f322526-d81e-4a2e-a084-151cb4304b64-host\") pod \"crc-debug-gvcbm\" (UID: \"0f322526-d81e-4a2e-a084-151cb4304b64\") " pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.298286 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpg7b\" (UniqueName: \"kubernetes.io/projected/0f322526-d81e-4a2e-a084-151cb4304b64-kube-api-access-fpg7b\") pod \"crc-debug-gvcbm\" (UID: \"0f322526-d81e-4a2e-a084-151cb4304b64\") " pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.315608 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.596387 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" event={"ID":"0f322526-d81e-4a2e-a084-151cb4304b64","Type":"ContainerStarted","Data":"4885ccf444eddc8a6ce122274d5f1476536fe5db0e3a880ac80a8141660c536e"} Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.596724 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" event={"ID":"0f322526-d81e-4a2e-a084-151cb4304b64","Type":"ContainerStarted","Data":"eafbdfb496b252d70cfdde0e33b23047271e7bcbfeb28f1dc5fe3cb1ad3e962d"} Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.599053 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563861-czdsg" event={"ID":"d34b9add-0199-4bf1-81f8-fa4c2a9138e7","Type":"ContainerDied","Data":"2316c1f8492c44ecfcb6c8739c1c26728d3712b4d7998cdb3a69536131c4537f"} Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.599091 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2316c1f8492c44ecfcb6c8739c1c26728d3712b4d7998cdb3a69536131c4537f" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.599126 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29563861-czdsg" Mar 18 11:01:06 crc kubenswrapper[4778]: I0318 11:01:06.622570 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" podStartSLOduration=1.622550769 podStartE2EDuration="1.622550769s" podCreationTimestamp="2026-03-18 11:01:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:01:06.614347836 +0000 UTC m=+7133.189092706" watchObservedRunningTime="2026-03-18 11:01:06.622550769 +0000 UTC m=+7133.197295609" Mar 18 11:01:07 crc kubenswrapper[4778]: I0318 11:01:07.609104 4778 generic.go:334] "Generic (PLEG): container finished" podID="0f322526-d81e-4a2e-a084-151cb4304b64" containerID="4885ccf444eddc8a6ce122274d5f1476536fe5db0e3a880ac80a8141660c536e" exitCode=0 Mar 18 11:01:07 crc kubenswrapper[4778]: I0318 11:01:07.609135 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" event={"ID":"0f322526-d81e-4a2e-a084-151cb4304b64","Type":"ContainerDied","Data":"4885ccf444eddc8a6ce122274d5f1476536fe5db0e3a880ac80a8141660c536e"} Mar 18 11:01:08 crc kubenswrapper[4778]: I0318 11:01:08.716167 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" Mar 18 11:01:08 crc kubenswrapper[4778]: I0318 11:01:08.817833 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f322526-d81e-4a2e-a084-151cb4304b64-host\") pod \"0f322526-d81e-4a2e-a084-151cb4304b64\" (UID: \"0f322526-d81e-4a2e-a084-151cb4304b64\") " Mar 18 11:01:08 crc kubenswrapper[4778]: I0318 11:01:08.817936 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpg7b\" (UniqueName: \"kubernetes.io/projected/0f322526-d81e-4a2e-a084-151cb4304b64-kube-api-access-fpg7b\") pod \"0f322526-d81e-4a2e-a084-151cb4304b64\" (UID: \"0f322526-d81e-4a2e-a084-151cb4304b64\") " Mar 18 11:01:08 crc kubenswrapper[4778]: I0318 11:01:08.819264 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f322526-d81e-4a2e-a084-151cb4304b64-host" (OuterVolumeSpecName: "host") pod "0f322526-d81e-4a2e-a084-151cb4304b64" (UID: "0f322526-d81e-4a2e-a084-151cb4304b64"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 11:01:08 crc kubenswrapper[4778]: I0318 11:01:08.828841 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f322526-d81e-4a2e-a084-151cb4304b64-kube-api-access-fpg7b" (OuterVolumeSpecName: "kube-api-access-fpg7b") pod "0f322526-d81e-4a2e-a084-151cb4304b64" (UID: "0f322526-d81e-4a2e-a084-151cb4304b64"). InnerVolumeSpecName "kube-api-access-fpg7b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:01:08 crc kubenswrapper[4778]: I0318 11:01:08.920237 4778 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f322526-d81e-4a2e-a084-151cb4304b64-host\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:08 crc kubenswrapper[4778]: I0318 11:01:08.920273 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpg7b\" (UniqueName: \"kubernetes.io/projected/0f322526-d81e-4a2e-a084-151cb4304b64-kube-api-access-fpg7b\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:09 crc kubenswrapper[4778]: I0318 11:01:09.250008 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n6w9k/crc-debug-gvcbm"] Mar 18 11:01:09 crc kubenswrapper[4778]: I0318 11:01:09.265593 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n6w9k/crc-debug-gvcbm"] Mar 18 11:01:09 crc kubenswrapper[4778]: I0318 11:01:09.625365 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eafbdfb496b252d70cfdde0e33b23047271e7bcbfeb28f1dc5fe3cb1ad3e962d" Mar 18 11:01:09 crc kubenswrapper[4778]: I0318 11:01:09.625497 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n6w9k/crc-debug-gvcbm" Mar 18 11:01:10 crc kubenswrapper[4778]: I0318 11:01:10.207067 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f322526-d81e-4a2e-a084-151cb4304b64" path="/var/lib/kubelet/pods/0f322526-d81e-4a2e-a084-151cb4304b64/volumes" Mar 18 11:01:10 crc kubenswrapper[4778]: I0318 11:01:10.438584 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n6w9k/crc-debug-2nws9"] Mar 18 11:01:10 crc kubenswrapper[4778]: E0318 11:01:10.439965 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f322526-d81e-4a2e-a084-151cb4304b64" containerName="container-00" Mar 18 11:01:10 crc kubenswrapper[4778]: I0318 11:01:10.440120 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f322526-d81e-4a2e-a084-151cb4304b64" containerName="container-00" Mar 18 11:01:10 crc kubenswrapper[4778]: I0318 11:01:10.440548 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f322526-d81e-4a2e-a084-151cb4304b64" containerName="container-00" Mar 18 11:01:10 crc kubenswrapper[4778]: I0318 11:01:10.441879 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n6w9k/crc-debug-2nws9" Mar 18 11:01:10 crc kubenswrapper[4778]: I0318 11:01:10.454168 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcq5b\" (UniqueName: \"kubernetes.io/projected/7738bdaf-c632-4ce0-b83d-cb4d38c4760a-kube-api-access-bcq5b\") pod \"crc-debug-2nws9\" (UID: \"7738bdaf-c632-4ce0-b83d-cb4d38c4760a\") " pod="openshift-must-gather-n6w9k/crc-debug-2nws9" Mar 18 11:01:10 crc kubenswrapper[4778]: I0318 11:01:10.454278 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7738bdaf-c632-4ce0-b83d-cb4d38c4760a-host\") pod \"crc-debug-2nws9\" (UID: \"7738bdaf-c632-4ce0-b83d-cb4d38c4760a\") " pod="openshift-must-gather-n6w9k/crc-debug-2nws9" Mar 18 11:01:10 crc kubenswrapper[4778]: I0318 11:01:10.556043 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcq5b\" (UniqueName: \"kubernetes.io/projected/7738bdaf-c632-4ce0-b83d-cb4d38c4760a-kube-api-access-bcq5b\") pod \"crc-debug-2nws9\" (UID: \"7738bdaf-c632-4ce0-b83d-cb4d38c4760a\") " pod="openshift-must-gather-n6w9k/crc-debug-2nws9" Mar 18 11:01:10 crc kubenswrapper[4778]: I0318 11:01:10.556480 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7738bdaf-c632-4ce0-b83d-cb4d38c4760a-host\") pod \"crc-debug-2nws9\" (UID: \"7738bdaf-c632-4ce0-b83d-cb4d38c4760a\") " pod="openshift-must-gather-n6w9k/crc-debug-2nws9" Mar 18 11:01:10 crc kubenswrapper[4778]: I0318 11:01:10.556715 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7738bdaf-c632-4ce0-b83d-cb4d38c4760a-host\") pod \"crc-debug-2nws9\" (UID: \"7738bdaf-c632-4ce0-b83d-cb4d38c4760a\") " pod="openshift-must-gather-n6w9k/crc-debug-2nws9" Mar 18 11:01:10 crc 
kubenswrapper[4778]: I0318 11:01:10.592776 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcq5b\" (UniqueName: \"kubernetes.io/projected/7738bdaf-c632-4ce0-b83d-cb4d38c4760a-kube-api-access-bcq5b\") pod \"crc-debug-2nws9\" (UID: \"7738bdaf-c632-4ce0-b83d-cb4d38c4760a\") " pod="openshift-must-gather-n6w9k/crc-debug-2nws9" Mar 18 11:01:10 crc kubenswrapper[4778]: I0318 11:01:10.767580 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n6w9k/crc-debug-2nws9" Mar 18 11:01:10 crc kubenswrapper[4778]: W0318 11:01:10.799467 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7738bdaf_c632_4ce0_b83d_cb4d38c4760a.slice/crio-807c35c264101c5aee2828e057a54c122affe04218766c98c693137411008edc WatchSource:0}: Error finding container 807c35c264101c5aee2828e057a54c122affe04218766c98c693137411008edc: Status 404 returned error can't find the container with id 807c35c264101c5aee2828e057a54c122affe04218766c98c693137411008edc Mar 18 11:01:11 crc kubenswrapper[4778]: I0318 11:01:11.648298 4778 generic.go:334] "Generic (PLEG): container finished" podID="7738bdaf-c632-4ce0-b83d-cb4d38c4760a" containerID="f5477be7293a466e680a1b2c883902065f2398c01a4cd1962f22034d098fc2a3" exitCode=0 Mar 18 11:01:11 crc kubenswrapper[4778]: I0318 11:01:11.648373 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6w9k/crc-debug-2nws9" event={"ID":"7738bdaf-c632-4ce0-b83d-cb4d38c4760a","Type":"ContainerDied","Data":"f5477be7293a466e680a1b2c883902065f2398c01a4cd1962f22034d098fc2a3"} Mar 18 11:01:11 crc kubenswrapper[4778]: I0318 11:01:11.648593 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6w9k/crc-debug-2nws9" event={"ID":"7738bdaf-c632-4ce0-b83d-cb4d38c4760a","Type":"ContainerStarted","Data":"807c35c264101c5aee2828e057a54c122affe04218766c98c693137411008edc"} Mar 18 
11:01:11 crc kubenswrapper[4778]: I0318 11:01:11.694173 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n6w9k/crc-debug-2nws9"] Mar 18 11:01:11 crc kubenswrapper[4778]: I0318 11:01:11.702795 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n6w9k/crc-debug-2nws9"] Mar 18 11:01:12 crc kubenswrapper[4778]: I0318 11:01:12.762175 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n6w9k/crc-debug-2nws9" Mar 18 11:01:12 crc kubenswrapper[4778]: I0318 11:01:12.803680 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcq5b\" (UniqueName: \"kubernetes.io/projected/7738bdaf-c632-4ce0-b83d-cb4d38c4760a-kube-api-access-bcq5b\") pod \"7738bdaf-c632-4ce0-b83d-cb4d38c4760a\" (UID: \"7738bdaf-c632-4ce0-b83d-cb4d38c4760a\") " Mar 18 11:01:12 crc kubenswrapper[4778]: I0318 11:01:12.803860 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7738bdaf-c632-4ce0-b83d-cb4d38c4760a-host\") pod \"7738bdaf-c632-4ce0-b83d-cb4d38c4760a\" (UID: \"7738bdaf-c632-4ce0-b83d-cb4d38c4760a\") " Mar 18 11:01:12 crc kubenswrapper[4778]: I0318 11:01:12.804081 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7738bdaf-c632-4ce0-b83d-cb4d38c4760a-host" (OuterVolumeSpecName: "host") pod "7738bdaf-c632-4ce0-b83d-cb4d38c4760a" (UID: "7738bdaf-c632-4ce0-b83d-cb4d38c4760a"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 11:01:12 crc kubenswrapper[4778]: I0318 11:01:12.805313 4778 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7738bdaf-c632-4ce0-b83d-cb4d38c4760a-host\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:12 crc kubenswrapper[4778]: I0318 11:01:12.813641 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7738bdaf-c632-4ce0-b83d-cb4d38c4760a-kube-api-access-bcq5b" (OuterVolumeSpecName: "kube-api-access-bcq5b") pod "7738bdaf-c632-4ce0-b83d-cb4d38c4760a" (UID: "7738bdaf-c632-4ce0-b83d-cb4d38c4760a"). InnerVolumeSpecName "kube-api-access-bcq5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:01:12 crc kubenswrapper[4778]: I0318 11:01:12.907031 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcq5b\" (UniqueName: \"kubernetes.io/projected/7738bdaf-c632-4ce0-b83d-cb4d38c4760a-kube-api-access-bcq5b\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:13 crc kubenswrapper[4778]: I0318 11:01:13.672727 4778 scope.go:117] "RemoveContainer" containerID="f5477be7293a466e680a1b2c883902065f2398c01a4cd1962f22034d098fc2a3" Mar 18 11:01:13 crc kubenswrapper[4778]: I0318 11:01:13.672791 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n6w9k/crc-debug-2nws9" Mar 18 11:01:14 crc kubenswrapper[4778]: I0318 11:01:14.217792 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7738bdaf-c632-4ce0-b83d-cb4d38c4760a" path="/var/lib/kubelet/pods/7738bdaf-c632-4ce0-b83d-cb4d38c4760a/volumes" Mar 18 11:01:29 crc kubenswrapper[4778]: I0318 11:01:29.803649 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2hjj7"] Mar 18 11:01:29 crc kubenswrapper[4778]: E0318 11:01:29.804796 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7738bdaf-c632-4ce0-b83d-cb4d38c4760a" containerName="container-00" Mar 18 11:01:29 crc kubenswrapper[4778]: I0318 11:01:29.804810 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7738bdaf-c632-4ce0-b83d-cb4d38c4760a" containerName="container-00" Mar 18 11:01:29 crc kubenswrapper[4778]: I0318 11:01:29.805022 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7738bdaf-c632-4ce0-b83d-cb4d38c4760a" containerName="container-00" Mar 18 11:01:29 crc kubenswrapper[4778]: I0318 11:01:29.806511 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:29 crc kubenswrapper[4778]: I0318 11:01:29.826630 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2hjj7"] Mar 18 11:01:29 crc kubenswrapper[4778]: I0318 11:01:29.982946 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wxbd\" (UniqueName: \"kubernetes.io/projected/abcddae8-6cd1-4a48-b133-af298a8fc9bb-kube-api-access-5wxbd\") pod \"certified-operators-2hjj7\" (UID: \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\") " pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:29 crc kubenswrapper[4778]: I0318 11:01:29.983083 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abcddae8-6cd1-4a48-b133-af298a8fc9bb-utilities\") pod \"certified-operators-2hjj7\" (UID: \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\") " pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:29 crc kubenswrapper[4778]: I0318 11:01:29.983159 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abcddae8-6cd1-4a48-b133-af298a8fc9bb-catalog-content\") pod \"certified-operators-2hjj7\" (UID: \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\") " pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:30 crc kubenswrapper[4778]: I0318 11:01:30.084925 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wxbd\" (UniqueName: \"kubernetes.io/projected/abcddae8-6cd1-4a48-b133-af298a8fc9bb-kube-api-access-5wxbd\") pod \"certified-operators-2hjj7\" (UID: \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\") " pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:30 crc kubenswrapper[4778]: I0318 11:01:30.085262 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abcddae8-6cd1-4a48-b133-af298a8fc9bb-utilities\") pod \"certified-operators-2hjj7\" (UID: \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\") " pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:30 crc kubenswrapper[4778]: I0318 11:01:30.085333 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abcddae8-6cd1-4a48-b133-af298a8fc9bb-catalog-content\") pod \"certified-operators-2hjj7\" (UID: \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\") " pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:30 crc kubenswrapper[4778]: I0318 11:01:30.085844 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abcddae8-6cd1-4a48-b133-af298a8fc9bb-utilities\") pod \"certified-operators-2hjj7\" (UID: \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\") " pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:30 crc kubenswrapper[4778]: I0318 11:01:30.085940 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abcddae8-6cd1-4a48-b133-af298a8fc9bb-catalog-content\") pod \"certified-operators-2hjj7\" (UID: \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\") " pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:30 crc kubenswrapper[4778]: I0318 11:01:30.114082 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wxbd\" (UniqueName: \"kubernetes.io/projected/abcddae8-6cd1-4a48-b133-af298a8fc9bb-kube-api-access-5wxbd\") pod \"certified-operators-2hjj7\" (UID: \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\") " pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:30 crc kubenswrapper[4778]: I0318 11:01:30.139778 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:30 crc kubenswrapper[4778]: I0318 11:01:30.147773 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 11:01:30 crc kubenswrapper[4778]: I0318 11:01:30.147818 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 11:01:30 crc kubenswrapper[4778]: I0318 11:01:30.649572 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2hjj7"] Mar 18 11:01:30 crc kubenswrapper[4778]: I0318 11:01:30.858712 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hjj7" event={"ID":"abcddae8-6cd1-4a48-b133-af298a8fc9bb","Type":"ContainerStarted","Data":"6796f39f1936165c0d34446a4399a251eaff83374f66c18a58cb0c062de2237f"} Mar 18 11:01:31 crc kubenswrapper[4778]: I0318 11:01:31.870037 4778 generic.go:334] "Generic (PLEG): container finished" podID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" containerID="8144540623de1f6abcdd48dc4e7a8527ac5c7353174d55be341db0362f442984" exitCode=0 Mar 18 11:01:31 crc kubenswrapper[4778]: I0318 11:01:31.870090 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hjj7" event={"ID":"abcddae8-6cd1-4a48-b133-af298a8fc9bb","Type":"ContainerDied","Data":"8144540623de1f6abcdd48dc4e7a8527ac5c7353174d55be341db0362f442984"} Mar 18 11:01:31 crc kubenswrapper[4778]: I0318 11:01:31.874665 4778 provider.go:102] Refreshing 
cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 11:01:32 crc kubenswrapper[4778]: I0318 11:01:32.883215 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hjj7" event={"ID":"abcddae8-6cd1-4a48-b133-af298a8fc9bb","Type":"ContainerStarted","Data":"2c7371952c18bfd966db38c95cd7597e102d0918baa6681d7bfd15508b08e157"} Mar 18 11:01:34 crc kubenswrapper[4778]: I0318 11:01:34.904766 4778 generic.go:334] "Generic (PLEG): container finished" podID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" containerID="2c7371952c18bfd966db38c95cd7597e102d0918baa6681d7bfd15508b08e157" exitCode=0 Mar 18 11:01:34 crc kubenswrapper[4778]: I0318 11:01:34.904844 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hjj7" event={"ID":"abcddae8-6cd1-4a48-b133-af298a8fc9bb","Type":"ContainerDied","Data":"2c7371952c18bfd966db38c95cd7597e102d0918baa6681d7bfd15508b08e157"} Mar 18 11:01:35 crc kubenswrapper[4778]: I0318 11:01:35.918932 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hjj7" event={"ID":"abcddae8-6cd1-4a48-b133-af298a8fc9bb","Type":"ContainerStarted","Data":"70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0"} Mar 18 11:01:35 crc kubenswrapper[4778]: I0318 11:01:35.949637 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2hjj7" podStartSLOduration=3.294141401 podStartE2EDuration="6.949614355s" podCreationTimestamp="2026-03-18 11:01:29 +0000 UTC" firstStartedPulling="2026-03-18 11:01:31.874365411 +0000 UTC m=+7158.449110271" lastFinishedPulling="2026-03-18 11:01:35.529838385 +0000 UTC m=+7162.104583225" observedRunningTime="2026-03-18 11:01:35.938526874 +0000 UTC m=+7162.513271744" watchObservedRunningTime="2026-03-18 11:01:35.949614355 +0000 UTC m=+7162.524359205" Mar 18 11:01:40 crc kubenswrapper[4778]: I0318 11:01:40.140583 4778 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:40 crc kubenswrapper[4778]: I0318 11:01:40.141355 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:41 crc kubenswrapper[4778]: I0318 11:01:41.185320 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-2hjj7" podUID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" containerName="registry-server" probeResult="failure" output=< Mar 18 11:01:41 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 11:01:41 crc kubenswrapper[4778]: > Mar 18 11:01:46 crc kubenswrapper[4778]: E0318 11:01:46.190323 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 11:01:50 crc kubenswrapper[4778]: I0318 11:01:50.206529 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:50 crc kubenswrapper[4778]: I0318 11:01:50.269470 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:50 crc kubenswrapper[4778]: I0318 11:01:50.875340 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2hjj7"] Mar 18 11:01:52 crc kubenswrapper[4778]: I0318 11:01:52.109613 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2hjj7" podUID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" containerName="registry-server" containerID="cri-o://70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0" gracePeriod=2 Mar 18 11:01:52 crc 
kubenswrapper[4778]: I0318 11:01:52.621744 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:52 crc kubenswrapper[4778]: I0318 11:01:52.684940 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wxbd\" (UniqueName: \"kubernetes.io/projected/abcddae8-6cd1-4a48-b133-af298a8fc9bb-kube-api-access-5wxbd\") pod \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\" (UID: \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\") " Mar 18 11:01:52 crc kubenswrapper[4778]: I0318 11:01:52.685047 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abcddae8-6cd1-4a48-b133-af298a8fc9bb-catalog-content\") pod \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\" (UID: \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\") " Mar 18 11:01:52 crc kubenswrapper[4778]: I0318 11:01:52.685168 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abcddae8-6cd1-4a48-b133-af298a8fc9bb-utilities\") pod \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\" (UID: \"abcddae8-6cd1-4a48-b133-af298a8fc9bb\") " Mar 18 11:01:52 crc kubenswrapper[4778]: I0318 11:01:52.685802 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abcddae8-6cd1-4a48-b133-af298a8fc9bb-utilities" (OuterVolumeSpecName: "utilities") pod "abcddae8-6cd1-4a48-b133-af298a8fc9bb" (UID: "abcddae8-6cd1-4a48-b133-af298a8fc9bb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:01:52 crc kubenswrapper[4778]: I0318 11:01:52.697388 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abcddae8-6cd1-4a48-b133-af298a8fc9bb-kube-api-access-5wxbd" (OuterVolumeSpecName: "kube-api-access-5wxbd") pod "abcddae8-6cd1-4a48-b133-af298a8fc9bb" (UID: "abcddae8-6cd1-4a48-b133-af298a8fc9bb"). InnerVolumeSpecName "kube-api-access-5wxbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:01:52 crc kubenswrapper[4778]: I0318 11:01:52.749130 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abcddae8-6cd1-4a48-b133-af298a8fc9bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "abcddae8-6cd1-4a48-b133-af298a8fc9bb" (UID: "abcddae8-6cd1-4a48-b133-af298a8fc9bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:01:52 crc kubenswrapper[4778]: I0318 11:01:52.787122 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abcddae8-6cd1-4a48-b133-af298a8fc9bb-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:52 crc kubenswrapper[4778]: I0318 11:01:52.787169 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wxbd\" (UniqueName: \"kubernetes.io/projected/abcddae8-6cd1-4a48-b133-af298a8fc9bb-kube-api-access-5wxbd\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:52 crc kubenswrapper[4778]: I0318 11:01:52.787184 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abcddae8-6cd1-4a48-b133-af298a8fc9bb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.125847 4778 generic.go:334] "Generic (PLEG): container finished" podID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" 
containerID="70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0" exitCode=0 Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.125885 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hjj7" event={"ID":"abcddae8-6cd1-4a48-b133-af298a8fc9bb","Type":"ContainerDied","Data":"70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0"} Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.125913 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hjj7" event={"ID":"abcddae8-6cd1-4a48-b133-af298a8fc9bb","Type":"ContainerDied","Data":"6796f39f1936165c0d34446a4399a251eaff83374f66c18a58cb0c062de2237f"} Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.125932 4778 scope.go:117] "RemoveContainer" containerID="70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0" Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.125955 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hjj7" Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.160363 4778 scope.go:117] "RemoveContainer" containerID="2c7371952c18bfd966db38c95cd7597e102d0918baa6681d7bfd15508b08e157" Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.195144 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2hjj7"] Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.203715 4778 scope.go:117] "RemoveContainer" containerID="8144540623de1f6abcdd48dc4e7a8527ac5c7353174d55be341db0362f442984" Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.207967 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2hjj7"] Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.234071 4778 scope.go:117] "RemoveContainer" containerID="70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0" Mar 18 11:01:53 crc kubenswrapper[4778]: E0318 11:01:53.234839 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0\": container with ID starting with 70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0 not found: ID does not exist" containerID="70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0" Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.234866 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0"} err="failed to get container status \"70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0\": rpc error: code = NotFound desc = could not find container \"70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0\": container with ID starting with 70a1afb98abcef8170b52405a81bb16b9b8bbaa13d74578d0a9d61ac5d96c4b0 not 
found: ID does not exist" Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.234891 4778 scope.go:117] "RemoveContainer" containerID="2c7371952c18bfd966db38c95cd7597e102d0918baa6681d7bfd15508b08e157" Mar 18 11:01:53 crc kubenswrapper[4778]: E0318 11:01:53.235231 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c7371952c18bfd966db38c95cd7597e102d0918baa6681d7bfd15508b08e157\": container with ID starting with 2c7371952c18bfd966db38c95cd7597e102d0918baa6681d7bfd15508b08e157 not found: ID does not exist" containerID="2c7371952c18bfd966db38c95cd7597e102d0918baa6681d7bfd15508b08e157" Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.235275 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c7371952c18bfd966db38c95cd7597e102d0918baa6681d7bfd15508b08e157"} err="failed to get container status \"2c7371952c18bfd966db38c95cd7597e102d0918baa6681d7bfd15508b08e157\": rpc error: code = NotFound desc = could not find container \"2c7371952c18bfd966db38c95cd7597e102d0918baa6681d7bfd15508b08e157\": container with ID starting with 2c7371952c18bfd966db38c95cd7597e102d0918baa6681d7bfd15508b08e157 not found: ID does not exist" Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.235300 4778 scope.go:117] "RemoveContainer" containerID="8144540623de1f6abcdd48dc4e7a8527ac5c7353174d55be341db0362f442984" Mar 18 11:01:53 crc kubenswrapper[4778]: E0318 11:01:53.235559 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8144540623de1f6abcdd48dc4e7a8527ac5c7353174d55be341db0362f442984\": container with ID starting with 8144540623de1f6abcdd48dc4e7a8527ac5c7353174d55be341db0362f442984 not found: ID does not exist" containerID="8144540623de1f6abcdd48dc4e7a8527ac5c7353174d55be341db0362f442984" Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.235585 4778 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8144540623de1f6abcdd48dc4e7a8527ac5c7353174d55be341db0362f442984"} err="failed to get container status \"8144540623de1f6abcdd48dc4e7a8527ac5c7353174d55be341db0362f442984\": rpc error: code = NotFound desc = could not find container \"8144540623de1f6abcdd48dc4e7a8527ac5c7353174d55be341db0362f442984\": container with ID starting with 8144540623de1f6abcdd48dc4e7a8527ac5c7353174d55be341db0362f442984 not found: ID does not exist" Mar 18 11:01:53 crc kubenswrapper[4778]: I0318 11:01:53.974712 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ansibletest-ansibletest_1fb58f5e-1c8b-45e2-bf86-b81af58b66a9/ansibletest-ansibletest/0.log" Mar 18 11:01:54 crc kubenswrapper[4778]: I0318 11:01:54.115115 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84d7458cd-cb86l_e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55/barbican-api/0.log" Mar 18 11:01:54 crc kubenswrapper[4778]: I0318 11:01:54.142033 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84d7458cd-cb86l_e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55/barbican-api-log/0.log" Mar 18 11:01:54 crc kubenswrapper[4778]: I0318 11:01:54.197644 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" path="/var/lib/kubelet/pods/abcddae8-6cd1-4a48-b133-af298a8fc9bb/volumes" Mar 18 11:01:54 crc kubenswrapper[4778]: I0318 11:01:54.273727 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8bc77f476-tw7vd_3a006670-1a48-4421-8471-dd961c0e1d4c/barbican-keystone-listener/0.log" Mar 18 11:01:54 crc kubenswrapper[4778]: I0318 11:01:54.468884 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-769d964c9f-nxhk2_eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b/barbican-worker/0.log" Mar 18 11:01:54 crc kubenswrapper[4778]: I0318 11:01:54.496646 
4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-769d964c9f-nxhk2_eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b/barbican-worker-log/0.log" Mar 18 11:01:54 crc kubenswrapper[4778]: I0318 11:01:54.696672 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk_f4bddd5e-314b-49c0-963c-107e6798c40e/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:01:54 crc kubenswrapper[4778]: I0318 11:01:54.861472 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8bc77f476-tw7vd_3a006670-1a48-4421-8471-dd961c0e1d4c/barbican-keystone-listener-log/0.log" Mar 18 11:01:54 crc kubenswrapper[4778]: I0318 11:01:54.905299 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_52bc493f-72e9-4387-9b91-13343fc7d550/ceilometer-central-agent/0.log" Mar 18 11:01:54 crc kubenswrapper[4778]: I0318 11:01:54.955386 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_52bc493f-72e9-4387-9b91-13343fc7d550/ceilometer-notification-agent/0.log" Mar 18 11:01:54 crc kubenswrapper[4778]: I0318 11:01:54.978741 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_52bc493f-72e9-4387-9b91-13343fc7d550/proxy-httpd/0.log" Mar 18 11:01:55 crc kubenswrapper[4778]: I0318 11:01:55.037922 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_52bc493f-72e9-4387-9b91-13343fc7d550/sg-core/0.log" Mar 18 11:01:55 crc kubenswrapper[4778]: I0318 11:01:55.131850 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv_fed5a515-ed14-40f1-9282-4e87fe319bf6/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:01:55 crc kubenswrapper[4778]: I0318 11:01:55.251912 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl_34acd7f6-6263-4871-892c-02835ebbab27/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:01:55 crc kubenswrapper[4778]: I0318 11:01:55.451593 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51/cinder-api/0.log" Mar 18 11:01:55 crc kubenswrapper[4778]: I0318 11:01:55.508251 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51/cinder-api-log/0.log" Mar 18 11:01:55 crc kubenswrapper[4778]: I0318 11:01:55.696990 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_a419ad60-27c7-4a74-a7a0-f6b04b3bcb13/cinder-backup/0.log" Mar 18 11:01:55 crc kubenswrapper[4778]: I0318 11:01:55.754270 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_a419ad60-27c7-4a74-a7a0-f6b04b3bcb13/probe/0.log" Mar 18 11:01:55 crc kubenswrapper[4778]: I0318 11:01:55.764502 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_bbde13ad-dacc-4f17-8da3-109ede6972c0/cinder-scheduler/0.log" Mar 18 11:01:55 crc kubenswrapper[4778]: I0318 11:01:55.913514 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_bbde13ad-dacc-4f17-8da3-109ede6972c0/probe/0.log" Mar 18 11:01:55 crc kubenswrapper[4778]: I0318 11:01:55.997669 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_81d18509-d2fc-47e2-b814-94c4807a4dd6/probe/0.log" Mar 18 11:01:56 crc kubenswrapper[4778]: I0318 11:01:56.003009 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_81d18509-d2fc-47e2-b814-94c4807a4dd6/cinder-volume/0.log" Mar 18 11:01:56 crc kubenswrapper[4778]: I0318 11:01:56.245262 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5_d44d6afe-0030-4d9d-9fa7-f75274eff578/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:01:56 crc kubenswrapper[4778]: I0318 11:01:56.247425 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-r64nk_4f5bf2d2-78b2-4358-a582-482ab3020da3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:01:56 crc kubenswrapper[4778]: I0318 11:01:56.409138 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-69655fd4bf-pr7j8_78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3/init/0.log" Mar 18 11:01:56 crc kubenswrapper[4778]: I0318 11:01:56.643793 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-69655fd4bf-pr7j8_78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3/init/0.log" Mar 18 11:01:56 crc kubenswrapper[4778]: I0318 11:01:56.725188 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd/glance-httpd/0.log" Mar 18 11:01:56 crc kubenswrapper[4778]: I0318 11:01:56.828441 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-69655fd4bf-pr7j8_78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3/dnsmasq-dns/0.log" Mar 18 11:01:56 crc kubenswrapper[4778]: I0318 11:01:56.877180 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd/glance-log/0.log" Mar 18 11:01:57 crc kubenswrapper[4778]: I0318 11:01:57.053340 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a18a46b5-39a7-4da9-8994-5c4716bc0fc3/glance-log/0.log" Mar 18 11:01:57 crc kubenswrapper[4778]: I0318 11:01:57.059401 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_a18a46b5-39a7-4da9-8994-5c4716bc0fc3/glance-httpd/0.log" Mar 18 11:01:57 crc kubenswrapper[4778]: I0318 11:01:57.199359 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-644f48df4-b7jhq_e0a0a638-c445-4931-861e-d35704487c97/horizon/0.log" Mar 18 11:01:57 crc kubenswrapper[4778]: I0318 11:01:57.391714 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizontest-tests-horizontest_49ff1200-d42e-4022-990d-619169f357f4/horizontest-tests-horizontest/0.log" Mar 18 11:01:57 crc kubenswrapper[4778]: I0318 11:01:57.603539 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-gfphd_7c70009e-cfb3-4598-9ae4-f1d90a2a63d5/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:01:57 crc kubenswrapper[4778]: I0318 11:01:57.748173 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-9zw82_5e5ffed6-fceb-4d38-aa29-e9836a8d9f50/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:01:57 crc kubenswrapper[4778]: I0318 11:01:57.986683 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29563801-nctwn_8ace9f11-f4d8-4801-afa2-5b723d52d41e/keystone-cron/0.log" Mar 18 11:01:58 crc kubenswrapper[4778]: I0318 11:01:58.176231 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29563861-czdsg_d34b9add-0199-4bf1-81f8-fa4c2a9138e7/keystone-cron/0.log" Mar 18 11:01:58 crc kubenswrapper[4778]: I0318 11:01:58.325744 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_1663e1b0-f9b0-4168-9386-abf2c1b56b43/kube-state-metrics/0.log" Mar 18 11:01:58 crc kubenswrapper[4778]: I0318 11:01:58.378874 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_horizon-644f48df4-b7jhq_e0a0a638-c445-4931-861e-d35704487c97/horizon-log/0.log" Mar 18 11:01:58 crc kubenswrapper[4778]: I0318 11:01:58.584952 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr_d50b5540-c2ca-4889-bbb0-3b5d04bc602f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:01:58 crc kubenswrapper[4778]: I0318 11:01:58.604603 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_35adb68e-2fb0-437c-bea7-e46f05e4918c/manila-api-log/0.log" Mar 18 11:01:58 crc kubenswrapper[4778]: I0318 11:01:58.800060 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_35adb68e-2fb0-437c-bea7-e46f05e4918c/manila-api/0.log" Mar 18 11:01:58 crc kubenswrapper[4778]: I0318 11:01:58.830424 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_e9af702d-3a1a-490e-82f5-e99c1718ef83/probe/0.log" Mar 18 11:01:58 crc kubenswrapper[4778]: I0318 11:01:58.981072 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_e9af702d-3a1a-490e-82f5-e99c1718ef83/manila-scheduler/0.log" Mar 18 11:01:59 crc kubenswrapper[4778]: I0318 11:01:59.102688 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_821dda0e-cde2-45a4-b23a-3d13565be515/probe/0.log" Mar 18 11:01:59 crc kubenswrapper[4778]: I0318 11:01:59.166517 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_821dda0e-cde2-45a4-b23a-3d13565be515/manila-share/0.log" Mar 18 11:01:59 crc kubenswrapper[4778]: I0318 11:01:59.909143 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v_52250b90-fbc6-418e-9a5f-4873d5fa5cd0/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 
11:02:00.141368 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563862-flgc8"] Mar 18 11:02:00 crc kubenswrapper[4778]: E0318 11:02:00.141821 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" containerName="registry-server" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.141842 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" containerName="registry-server" Mar 18 11:02:00 crc kubenswrapper[4778]: E0318 11:02:00.141856 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" containerName="extract-utilities" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.141862 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" containerName="extract-utilities" Mar 18 11:02:00 crc kubenswrapper[4778]: E0318 11:02:00.141872 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" containerName="extract-content" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.141878 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" containerName="extract-content" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.142159 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="abcddae8-6cd1-4a48-b133-af298a8fc9bb" containerName="registry-server" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.143447 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563862-flgc8" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.146380 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.146665 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.146790 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.147521 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.147552 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.147583 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.148239 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" 
Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.148306 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" gracePeriod=600 Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.161938 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563862-flgc8"] Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.251362 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x749h\" (UniqueName: \"kubernetes.io/projected/0e9b6093-6849-4ee1-829c-3893c8efc355-kube-api-access-x749h\") pod \"auto-csr-approver-29563862-flgc8\" (UID: \"0e9b6093-6849-4ee1-829c-3893c8efc355\") " pod="openshift-infra/auto-csr-approver-29563862-flgc8" Mar 18 11:02:00 crc kubenswrapper[4778]: E0318 11:02:00.301061 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.352801 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x749h\" (UniqueName: \"kubernetes.io/projected/0e9b6093-6849-4ee1-829c-3893c8efc355-kube-api-access-x749h\") pod \"auto-csr-approver-29563862-flgc8\" (UID: \"0e9b6093-6849-4ee1-829c-3893c8efc355\") " pod="openshift-infra/auto-csr-approver-29563862-flgc8" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.384273 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x749h\" (UniqueName: \"kubernetes.io/projected/0e9b6093-6849-4ee1-829c-3893c8efc355-kube-api-access-x749h\") pod \"auto-csr-approver-29563862-flgc8\" (UID: \"0e9b6093-6849-4ee1-829c-3893c8efc355\") " pod="openshift-infra/auto-csr-approver-29563862-flgc8" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.434839 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d979499f7-4flxt_da263057-3652-4ae8-8435-4f80e4b13804/neutron-httpd/0.log" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.462919 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563862-flgc8" Mar 18 11:02:00 crc kubenswrapper[4778]: I0318 11:02:00.979350 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563862-flgc8"] Mar 18 11:02:01 crc kubenswrapper[4778]: I0318 11:02:01.158134 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-75996d8fd4-jhtd2_4c045639-00d0-4ba6-9d75-c67934521e29/keystone-api/0.log" Mar 18 11:02:01 crc kubenswrapper[4778]: I0318 11:02:01.214182 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563862-flgc8" event={"ID":"0e9b6093-6849-4ee1-829c-3893c8efc355","Type":"ContainerStarted","Data":"147cc97a1f98cfc31f25273d021eb620b9bf14cdb68020dd46c294882b45318b"} Mar 18 11:02:01 crc kubenswrapper[4778]: I0318 11:02:01.217426 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" exitCode=0 Mar 18 11:02:01 crc kubenswrapper[4778]: I0318 11:02:01.217480 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" 
event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9"} Mar 18 11:02:01 crc kubenswrapper[4778]: I0318 11:02:01.217804 4778 scope.go:117] "RemoveContainer" containerID="3a30fa73502544b7ae8a345c4023efae179b70a66379eb3e45aa01314f418265" Mar 18 11:02:01 crc kubenswrapper[4778]: I0318 11:02:01.218623 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:02:01 crc kubenswrapper[4778]: E0318 11:02:01.218994 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:02:01 crc kubenswrapper[4778]: I0318 11:02:01.451060 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d979499f7-4flxt_da263057-3652-4ae8-8435-4f80e4b13804/neutron-api/0.log" Mar 18 11:02:01 crc kubenswrapper[4778]: I0318 11:02:01.991533 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_9ba2a389-4009-4dab-bc75-45a574e50bbc/nova-cell1-conductor-conductor/0.log" Mar 18 11:02:02 crc kubenswrapper[4778]: I0318 11:02:02.184982 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_3fc908a0-dc90-4df9-869c-5c0820cac423/nova-cell0-conductor-conductor/0.log" Mar 18 11:02:02 crc kubenswrapper[4778]: I0318 11:02:02.587241 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_9549b39b-0fc5-4e89-b64a-de83c80735ed/nova-cell1-novncproxy-novncproxy/0.log" Mar 18 11:02:02 crc kubenswrapper[4778]: I0318 11:02:02.765238 4778 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9_b2db5491-57b4-427a-b306-5e525a1e7c27/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 11:02:03 crc kubenswrapper[4778]: I0318 11:02:03.071274 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_28f01ca6-f7d2-4de3-9aa9-256803533b80/nova-metadata-log/0.log" Mar 18 11:02:03 crc kubenswrapper[4778]: I0318 11:02:03.254325 4778 generic.go:334] "Generic (PLEG): container finished" podID="0e9b6093-6849-4ee1-829c-3893c8efc355" containerID="3545c5320c999ca132cdbecfec3fe0adacffa5fbc99319fc9409f6ba39ed60ac" exitCode=0 Mar 18 11:02:03 crc kubenswrapper[4778]: I0318 11:02:03.254372 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563862-flgc8" event={"ID":"0e9b6093-6849-4ee1-829c-3893c8efc355","Type":"ContainerDied","Data":"3545c5320c999ca132cdbecfec3fe0adacffa5fbc99319fc9409f6ba39ed60ac"} Mar 18 11:02:04 crc kubenswrapper[4778]: I0318 11:02:04.224190 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_9b1623d1-2084-419e-b36a-80930113a280/nova-scheduler-scheduler/0.log" Mar 18 11:02:04 crc kubenswrapper[4778]: I0318 11:02:04.558172 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_28f01ca6-f7d2-4de3-9aa9-256803533b80/nova-metadata-metadata/0.log" Mar 18 11:02:04 crc kubenswrapper[4778]: I0318 11:02:04.694533 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563862-flgc8" Mar 18 11:02:04 crc kubenswrapper[4778]: I0318 11:02:04.714145 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_49ce9560-3ee2-48d2-b016-a9feefb3a798/mysql-bootstrap/0.log" Mar 18 11:02:04 crc kubenswrapper[4778]: I0318 11:02:04.722965 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8a702c51-b7a6-4094-9d34-519102e1cf91/nova-api-log/0.log" Mar 18 11:02:04 crc kubenswrapper[4778]: I0318 11:02:04.828388 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x749h\" (UniqueName: \"kubernetes.io/projected/0e9b6093-6849-4ee1-829c-3893c8efc355-kube-api-access-x749h\") pod \"0e9b6093-6849-4ee1-829c-3893c8efc355\" (UID: \"0e9b6093-6849-4ee1-829c-3893c8efc355\") " Mar 18 11:02:04 crc kubenswrapper[4778]: I0318 11:02:04.838429 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e9b6093-6849-4ee1-829c-3893c8efc355-kube-api-access-x749h" (OuterVolumeSpecName: "kube-api-access-x749h") pod "0e9b6093-6849-4ee1-829c-3893c8efc355" (UID: "0e9b6093-6849-4ee1-829c-3893c8efc355"). InnerVolumeSpecName "kube-api-access-x749h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:02:04 crc kubenswrapper[4778]: I0318 11:02:04.932441 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x749h\" (UniqueName: \"kubernetes.io/projected/0e9b6093-6849-4ee1-829c-3893c8efc355-kube-api-access-x749h\") on node \"crc\" DevicePath \"\"" Mar 18 11:02:04 crc kubenswrapper[4778]: I0318 11:02:04.983402 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_49ce9560-3ee2-48d2-b016-a9feefb3a798/mysql-bootstrap/0.log" Mar 18 11:02:04 crc kubenswrapper[4778]: I0318 11:02:04.984038 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_49ce9560-3ee2-48d2-b016-a9feefb3a798/galera/0.log" Mar 18 11:02:05 crc kubenswrapper[4778]: I0318 11:02:05.171524 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_cfadc08e-9e77-4b6f-be89-fc7c726e85b7/mysql-bootstrap/0.log" Mar 18 11:02:05 crc kubenswrapper[4778]: I0318 11:02:05.282922 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563862-flgc8" event={"ID":"0e9b6093-6849-4ee1-829c-3893c8efc355","Type":"ContainerDied","Data":"147cc97a1f98cfc31f25273d021eb620b9bf14cdb68020dd46c294882b45318b"} Mar 18 11:02:05 crc kubenswrapper[4778]: I0318 11:02:05.282960 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="147cc97a1f98cfc31f25273d021eb620b9bf14cdb68020dd46c294882b45318b" Mar 18 11:02:05 crc kubenswrapper[4778]: I0318 11:02:05.282963 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563862-flgc8" Mar 18 11:02:05 crc kubenswrapper[4778]: I0318 11:02:05.400240 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_cfadc08e-9e77-4b6f-be89-fc7c726e85b7/mysql-bootstrap/0.log" Mar 18 11:02:05 crc kubenswrapper[4778]: I0318 11:02:05.451552 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_cfadc08e-9e77-4b6f-be89-fc7c726e85b7/galera/0.log" Mar 18 11:02:05 crc kubenswrapper[4778]: I0318 11:02:05.556997 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_fec302c3-e5fc-4019-b4f5-50de6bdde59f/openstackclient/0.log" Mar 18 11:02:05 crc kubenswrapper[4778]: I0318 11:02:05.696494 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-djmq6_f58533cf-4c57-4c3a-b772-e2a488298d7e/ovn-controller/0.log" Mar 18 11:02:05 crc kubenswrapper[4778]: I0318 11:02:05.764052 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563856-kvmq4"] Mar 18 11:02:05 crc kubenswrapper[4778]: I0318 11:02:05.770801 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8a702c51-b7a6-4094-9d34-519102e1cf91/nova-api-api/0.log" Mar 18 11:02:05 crc kubenswrapper[4778]: I0318 11:02:05.773905 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563856-kvmq4"] Mar 18 11:02:05 crc kubenswrapper[4778]: I0318 11:02:05.801918 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-2ldk7_2c6e8f7b-9b48-4814-9e73-fc9833c26cc9/openstack-network-exporter/0.log" Mar 18 11:02:06 crc kubenswrapper[4778]: I0318 11:02:06.026905 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zrlnv_89cb0c73-439f-4178-bd96-f50b123bcd8a/ovsdb-server-init/0.log" Mar 18 11:02:06 crc kubenswrapper[4778]: I0318 
11:02:06.196240 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b69a324-153a-4262-92ea-62c8b9d5928e" path="/var/lib/kubelet/pods/0b69a324-153a-4262-92ea-62c8b9d5928e/volumes"
Mar 18 11:02:06 crc kubenswrapper[4778]: I0318 11:02:06.205406 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zrlnv_89cb0c73-439f-4178-bd96-f50b123bcd8a/ovsdb-server/0.log"
Mar 18 11:02:06 crc kubenswrapper[4778]: I0318 11:02:06.206356 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zrlnv_89cb0c73-439f-4178-bd96-f50b123bcd8a/ovsdb-server-init/0.log"
Mar 18 11:02:06 crc kubenswrapper[4778]: I0318 11:02:06.213607 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zrlnv_89cb0c73-439f-4178-bd96-f50b123bcd8a/ovs-vswitchd/0.log"
Mar 18 11:02:06 crc kubenswrapper[4778]: I0318 11:02:06.403474 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-7jqhd_1f0f4177-ad12-4848-bbd7-39b004344cb3/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 18 11:02:06 crc kubenswrapper[4778]: I0318 11:02:06.409358 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ac3419bd-88ba-4b83-bd93-ad5638bc7fd0/openstack-network-exporter/0.log"
Mar 18 11:02:06 crc kubenswrapper[4778]: I0318 11:02:06.472256 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ac3419bd-88ba-4b83-bd93-ad5638bc7fd0/ovn-northd/0.log"
Mar 18 11:02:06 crc kubenswrapper[4778]: I0318 11:02:06.612417 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_113a3fc7-40a1-46f9-b93f-01a34fcaf4aa/openstack-network-exporter/0.log"
Mar 18 11:02:06 crc kubenswrapper[4778]: I0318 11:02:06.663913 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_113a3fc7-40a1-46f9-b93f-01a34fcaf4aa/ovsdbserver-nb/0.log"
Mar 18 11:02:06 crc kubenswrapper[4778]: I0318 11:02:06.783209 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_495e34ad-2f4d-46de-95e9-37b34a35f2d2/openstack-network-exporter/0.log"
Mar 18 11:02:06 crc kubenswrapper[4778]: I0318 11:02:06.853267 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_495e34ad-2f4d-46de-95e9-37b34a35f2d2/ovsdbserver-sb/0.log"
Mar 18 11:02:07 crc kubenswrapper[4778]: I0318 11:02:07.106124 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f9428cb3-4fdf-4b01-9368-28b413ecf82f/setup-container/0.log"
Mar 18 11:02:07 crc kubenswrapper[4778]: I0318 11:02:07.326431 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f9428cb3-4fdf-4b01-9368-28b413ecf82f/setup-container/0.log"
Mar 18 11:02:07 crc kubenswrapper[4778]: I0318 11:02:07.336233 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f9428cb3-4fdf-4b01-9368-28b413ecf82f/rabbitmq/0.log"
Mar 18 11:02:07 crc kubenswrapper[4778]: I0318 11:02:07.424967 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7588d8786-t6x7l_fe0de426-6927-42ea-8b29-8bc01c27fe69/placement-api/0.log"
Mar 18 11:02:07 crc kubenswrapper[4778]: I0318 11:02:07.577784 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0191a745-1fe2-4a1c-b007-96525ad39787/setup-container/0.log"
Mar 18 11:02:07 crc kubenswrapper[4778]: I0318 11:02:07.647240 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7588d8786-t6x7l_fe0de426-6927-42ea-8b29-8bc01c27fe69/placement-log/0.log"
Mar 18 11:02:07 crc kubenswrapper[4778]: I0318 11:02:07.755441 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0191a745-1fe2-4a1c-b007-96525ad39787/setup-container/0.log"
Mar 18 11:02:07 crc kubenswrapper[4778]: I0318 11:02:07.819024 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0191a745-1fe2-4a1c-b007-96525ad39787/rabbitmq/0.log"
Mar 18 11:02:07 crc kubenswrapper[4778]: I0318 11:02:07.873648 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7_613d0a31-a371-4c66-8254-85a7cc864fd0/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 18 11:02:08 crc kubenswrapper[4778]: I0318 11:02:08.069535 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx_136dbfab-32f1-40ee-b685-74411fbc06ba/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 18 11:02:08 crc kubenswrapper[4778]: I0318 11:02:08.085341 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-9bss8_80a8d263-9bba-4db0-928e-f633b4ad5314/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 18 11:02:08 crc kubenswrapper[4778]: I0318 11:02:08.257711 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-j74ts_53b18647-af19-457c-9543-2156c1ace738/ssh-known-hosts-edpm-deployment/0.log"
Mar 18 11:02:08 crc kubenswrapper[4778]: I0318 11:02:08.403511 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest-s00-full_757e3758-d646-4267-8c4c-b5efb0dcf709/tempest-tests-tempest-tests-runner/0.log"
Mar 18 11:02:08 crc kubenswrapper[4778]: I0318 11:02:08.527792 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest-s01-single-test_c5a7a532-f8c2-4741-9892-65047a4cb225/tempest-tests-tempest-tests-runner/0.log"
Mar 18 11:02:08 crc kubenswrapper[4778]: I0318 11:02:08.578154 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-ansibletest-ansibletest-ansibletest_1f57757d-6483-4e1a-9a09-e63026f73e70/test-operator-logs-container/0.log"
Mar 18 11:02:08 crc kubenswrapper[4778]: I0318 11:02:08.718963 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-horizontest-horizontest-tests-horizontest_3db5e33d-384f-4df3-bfb8-ba279b83f7e4/test-operator-logs-container/0.log"
Mar 18 11:02:09 crc kubenswrapper[4778]: I0318 11:02:09.014048 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_fb176b71-d782-4b0d-963f-94acef50cf11/test-operator-logs-container/0.log"
Mar 18 11:02:09 crc kubenswrapper[4778]: I0318 11:02:09.128409 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tobiko-tobiko-tests-tobiko_4e028d5e-666c-497c-949e-97860410ad74/test-operator-logs-container/0.log"
Mar 18 11:02:09 crc kubenswrapper[4778]: I0318 11:02:09.171969 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tobiko-tests-tobiko-s00-podified-functional_5c0d8cb1-d7bc-4694-ac54-e0a9f8312557/tobiko-tests-tobiko/0.log"
Mar 18 11:02:09 crc kubenswrapper[4778]: I0318 11:02:09.335883 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tobiko-tests-tobiko-s01-sanity_bd565818-8912-47ba-881f-f88011fa9b46/tobiko-tests-tobiko/0.log"
Mar 18 11:02:09 crc kubenswrapper[4778]: I0318 11:02:09.438426 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-44vc9_5e5ecb95-ba90-4f70-ae42-63e71026ffef/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 18 11:02:10 crc kubenswrapper[4778]: I0318 11:02:10.755160 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_fc50d224-cd65-4a46-b3d0-b40acdbda53d/memcached/0.log"
Mar 18 11:02:14 crc kubenswrapper[4778]: I0318 11:02:14.197054 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9"
Mar 18 11:02:14 crc kubenswrapper[4778]: E0318 11:02:14.197808 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:02:28 crc kubenswrapper[4778]: I0318 11:02:28.186795 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9"
Mar 18 11:02:28 crc kubenswrapper[4778]: E0318 11:02:28.187693 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:02:33 crc kubenswrapper[4778]: I0318 11:02:33.418133 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-fsxlt_3390909b-6271-40dd-9662-0710f6866143/manager/0.log"
Mar 18 11:02:33 crc kubenswrapper[4778]: I0318 11:02:33.656061 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-7mbx2_710ababb-0bee-441d-8dd0-e6a72ea2b2e3/manager/0.log"
Mar 18 11:02:33 crc kubenswrapper[4778]: I0318 11:02:33.854176 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/util/0.log"
Mar 18 11:02:34 crc kubenswrapper[4778]: I0318 11:02:34.100642 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/pull/0.log"
Mar 18 11:02:34 crc kubenswrapper[4778]: I0318 11:02:34.107341 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/util/0.log"
Mar 18 11:02:34 crc kubenswrapper[4778]: I0318 11:02:34.176258 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/pull/0.log"
Mar 18 11:02:34 crc kubenswrapper[4778]: I0318 11:02:34.319539 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/pull/0.log"
Mar 18 11:02:34 crc kubenswrapper[4778]: I0318 11:02:34.325155 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/util/0.log"
Mar 18 11:02:34 crc kubenswrapper[4778]: I0318 11:02:34.397698 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/extract/0.log"
Mar 18 11:02:34 crc kubenswrapper[4778]: I0318 11:02:34.637806 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-wb4pc_b41dbd4a-33dd-4dca-9356-34c740e8063f/manager/0.log"
Mar 18 11:02:35 crc kubenswrapper[4778]: I0318 11:02:35.025407 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-t5c4w_aceb2f7b-585f-451a-83b8-e673965ada87/manager/0.log"
Mar 18 11:02:35 crc kubenswrapper[4778]: I0318 11:02:35.077330 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-wxftc_0526f654-9ddc-4495-bb04-be13e53b6a1b/manager/0.log"
Mar 18 11:02:35 crc kubenswrapper[4778]: I0318 11:02:35.126925 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-x7rnp_124dc549-cb2a-4b1c-a610-093cf9b8c05d/manager/0.log"
Mar 18 11:02:35 crc kubenswrapper[4778]: I0318 11:02:35.297155 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-fjjvl_3c86f76c-1617-45e9-9573-f6fd51803b45/manager/0.log"
Mar 18 11:02:35 crc kubenswrapper[4778]: I0318 11:02:35.567232 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-5xvtc_e1ec7bae-8e15-4844-84d2-ff5951d0be31/manager/0.log"
Mar 18 11:02:35 crc kubenswrapper[4778]: I0318 11:02:35.578963 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-64c4x_66d3bf3a-086c-4340-ba73-209f526fc33c/manager/0.log"
Mar 18 11:02:35 crc kubenswrapper[4778]: I0318 11:02:35.648097 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-zpc92_211c991a-9406-4360-aa7f-830be3aa55db/manager/0.log"
Mar 18 11:02:35 crc kubenswrapper[4778]: I0318 11:02:35.782789 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-47sbc_37675366-70a8-4e0b-b92b-f7055547d918/manager/0.log"
Mar 18 11:02:35 crc kubenswrapper[4778]: I0318 11:02:35.875011 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-k4r2p_ae690990-eeb1-4871-8c51-dd3b547e1193/manager/0.log"
Mar 18 11:02:36 crc kubenswrapper[4778]: I0318 11:02:36.071621 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-pzjdt_c776af1e-ad54-40fe-9bed-a0a09ce0eea7/manager/0.log"
Mar 18 11:02:36 crc kubenswrapper[4778]: I0318 11:02:36.084739 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-h6whs_e245908e-e35e-403c-93f6-48371904ae42/manager/0.log"
Mar 18 11:02:36 crc kubenswrapper[4778]: I0318 11:02:36.248153 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-xdgmv_80822932-2943-4f81-9436-1553ed031359/manager/0.log"
Mar 18 11:02:36 crc kubenswrapper[4778]: I0318 11:02:36.378314 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-654f4fc7f7-9d4pb_b8267dff-2541-481e-bc64-13eb8d19300b/operator/0.log"
Mar 18 11:02:36 crc kubenswrapper[4778]: I0318 11:02:36.582919 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-v7qxm_c508c810-232f-48c1-8d15-bbbb118d2948/registry-server/0.log"
Mar 18 11:02:36 crc kubenswrapper[4778]: I0318 11:02:36.848716 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-fgfk9_208b26f2-3c91-4966-9d01-8fe73e4a7d87/manager/0.log"
Mar 18 11:02:37 crc kubenswrapper[4778]: I0318 11:02:37.007840 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-d5w9q_2f8e8860-00a1-43fc-9776-c617f270cc50/manager/0.log"
Mar 18 11:02:37 crc kubenswrapper[4778]: I0318 11:02:37.107679 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-5jrv8_b837636e-8c09-42b7-9a81-e7875df68344/operator/0.log"
Mar 18 11:02:37 crc kubenswrapper[4778]: I0318 11:02:37.326299 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-c6l5k_8ccabb3b-da59-4ab0-89c8-99094a939f0d/manager/0.log"
Mar 18 11:02:37 crc kubenswrapper[4778]: I0318 11:02:37.542379 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-tx9zq_9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77/manager/0.log"
Mar 18 11:02:37 crc kubenswrapper[4778]: I0318 11:02:37.715733 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-54c5f5bc8-jsm76_99adb6be-2a3e-4148-8074-9258222ebd60/manager/0.log"
Mar 18 11:02:37 crc kubenswrapper[4778]: I0318 11:02:37.790499 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-f5c7df4d7-m4kvr_3c7e3158-5139-467d-b33c-808747f0d9be/manager/0.log"
Mar 18 11:02:37 crc kubenswrapper[4778]: I0318 11:02:37.862336 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-sgs49_57277339-c9be-4de1-8e35-72ae98d33905/manager/0.log"
Mar 18 11:02:37 crc kubenswrapper[4778]: I0318 11:02:37.911687 4778 scope.go:117] "RemoveContainer" containerID="82c47033c6d17fb0d1f1f077c5ae48584be4ec251f8c624e7bed8591ae05dffd"
Mar 18 11:02:40 crc kubenswrapper[4778]: I0318 11:02:40.188061 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9"
Mar 18 11:02:40 crc kubenswrapper[4778]: E0318 11:02:40.188999 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:02:51 crc kubenswrapper[4778]: I0318 11:02:51.186855 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9"
Mar 18 11:02:51 crc kubenswrapper[4778]: E0318 11:02:51.187741 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:02:56 crc kubenswrapper[4778]: I0318 11:02:56.210986 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-qtggn_ba84f396-0169-4d5e-a126-60ac9d6d49f8/control-plane-machine-set-operator/0.log"
Mar 18 11:02:56 crc kubenswrapper[4778]: I0318 11:02:56.408401 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-smtz9_f06790e0-cf8c-48f0-8d48-893663fdbd1c/kube-rbac-proxy/0.log"
Mar 18 11:02:56 crc kubenswrapper[4778]: I0318 11:02:56.419960 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-smtz9_f06790e0-cf8c-48f0-8d48-893663fdbd1c/machine-api-operator/0.log"
Mar 18 11:03:03 crc kubenswrapper[4778]: I0318 11:03:03.188034 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9"
Mar 18 11:03:03 crc kubenswrapper[4778]: E0318 11:03:03.191586 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:03:03 crc kubenswrapper[4778]: I0318 11:03:03.452604 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sdl4q"]
Mar 18 11:03:03 crc kubenswrapper[4778]: E0318 11:03:03.453217 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e9b6093-6849-4ee1-829c-3893c8efc355" containerName="oc"
Mar 18 11:03:03 crc kubenswrapper[4778]: I0318 11:03:03.453244 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9b6093-6849-4ee1-829c-3893c8efc355" containerName="oc"
Mar 18 11:03:03 crc kubenswrapper[4778]: I0318 11:03:03.453599 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e9b6093-6849-4ee1-829c-3893c8efc355" containerName="oc"
Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:03.455855 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sdl4q"
Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:03.466924 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sdl4q"]
Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:03.608373 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-catalog-content\") pod \"community-operators-sdl4q\" (UID: \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\") " pod="openshift-marketplace/community-operators-sdl4q"
Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:03.608646 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-utilities\") pod \"community-operators-sdl4q\" (UID: \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\") " pod="openshift-marketplace/community-operators-sdl4q"
Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:03.609598 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tsld\" (UniqueName: \"kubernetes.io/projected/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-kube-api-access-8tsld\") pod \"community-operators-sdl4q\" (UID: \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\") " pod="openshift-marketplace/community-operators-sdl4q"
Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:03.711333 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-utilities\") pod \"community-operators-sdl4q\" (UID: \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\") " pod="openshift-marketplace/community-operators-sdl4q"
Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:03.711489 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tsld\" (UniqueName: \"kubernetes.io/projected/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-kube-api-access-8tsld\") pod \"community-operators-sdl4q\" (UID: \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\") " pod="openshift-marketplace/community-operators-sdl4q"
Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:03.711545 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-catalog-content\") pod \"community-operators-sdl4q\" (UID: \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\") " pod="openshift-marketplace/community-operators-sdl4q"
Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:03.711882 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-utilities\") pod \"community-operators-sdl4q\" (UID: \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\") " pod="openshift-marketplace/community-operators-sdl4q"
Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:03.711965 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-catalog-content\") pod \"community-operators-sdl4q\" (UID: \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\") " pod="openshift-marketplace/community-operators-sdl4q"
Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:03.733997 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tsld\" (UniqueName: \"kubernetes.io/projected/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-kube-api-access-8tsld\") pod \"community-operators-sdl4q\" (UID: \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\") " pod="openshift-marketplace/community-operators-sdl4q"
Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:03.788020 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sdl4q"
Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:04.344494 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sdl4q"]
Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:04.840709 4778 generic.go:334] "Generic (PLEG): container finished" podID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" containerID="8788ccd3f209681f25926ae25879d3515f3eeceb29aa0ebf20026745bae1e917" exitCode=0
Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:04.840821 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdl4q" event={"ID":"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898","Type":"ContainerDied","Data":"8788ccd3f209681f25926ae25879d3515f3eeceb29aa0ebf20026745bae1e917"}
Mar 18 11:03:04 crc kubenswrapper[4778]: I0318 11:03:04.841230 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdl4q" event={"ID":"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898","Type":"ContainerStarted","Data":"b36d40a379b90ece19ca99233599298bef161851bf4285fc5a21ee921ddbd7a9"}
Mar 18 11:03:05 crc kubenswrapper[4778]: I0318 11:03:05.850316 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdl4q" event={"ID":"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898","Type":"ContainerStarted","Data":"97ccf33ab02fe57f3395b49a1256327bc87fc922812df8dc59fe699275b7e205"}
Mar 18 11:03:06 crc kubenswrapper[4778]: I0318 11:03:06.860330 4778 generic.go:334] "Generic (PLEG): container finished" podID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" containerID="97ccf33ab02fe57f3395b49a1256327bc87fc922812df8dc59fe699275b7e205" exitCode=0
Mar 18 11:03:06 crc kubenswrapper[4778]: I0318 11:03:06.860436 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdl4q" event={"ID":"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898","Type":"ContainerDied","Data":"97ccf33ab02fe57f3395b49a1256327bc87fc922812df8dc59fe699275b7e205"}
Mar 18 11:03:09 crc kubenswrapper[4778]: I0318 11:03:09.910426 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-qrqw4_e39be52c-c244-44cc-a707-0ec9994991fa/cert-manager-controller/0.log"
Mar 18 11:03:10 crc kubenswrapper[4778]: I0318 11:03:10.031115 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-khqrg_24a88e8d-e986-4b3d-a77e-1a3e5162ac9c/cert-manager-cainjector/0.log"
Mar 18 11:03:10 crc kubenswrapper[4778]: I0318 11:03:10.116487 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-hjskg_f09bc4b7-d305-4674-8540-283bd0b4901c/cert-manager-webhook/0.log"
Mar 18 11:03:12 crc kubenswrapper[4778]: I0318 11:03:12.919747 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdl4q" event={"ID":"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898","Type":"ContainerStarted","Data":"0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c"}
Mar 18 11:03:12 crc kubenswrapper[4778]: I0318 11:03:12.942224 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sdl4q" podStartSLOduration=2.366019728 podStartE2EDuration="9.942188711s" podCreationTimestamp="2026-03-18 11:03:03 +0000 UTC" firstStartedPulling="2026-03-18 11:03:04.842712569 +0000 UTC m=+7251.417457399" lastFinishedPulling="2026-03-18 11:03:12.418881532 +0000 UTC m=+7258.993626382" observedRunningTime="2026-03-18 11:03:12.93440427 +0000 UTC m=+7259.509149180" watchObservedRunningTime="2026-03-18 11:03:12.942188711 +0000 UTC m=+7259.516933551"
Mar 18 11:03:13 crc kubenswrapper[4778]: I0318 11:03:13.788933 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sdl4q"
Mar 18 11:03:13 crc kubenswrapper[4778]: I0318 11:03:13.789382 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sdl4q"
Mar 18 11:03:14 crc kubenswrapper[4778]: I0318 11:03:14.837305 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-sdl4q" podUID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" containerName="registry-server" probeResult="failure" output=<
Mar 18 11:03:14 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s
Mar 18 11:03:14 crc kubenswrapper[4778]: >
Mar 18 11:03:15 crc kubenswrapper[4778]: E0318 11:03:15.188016 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes"
Mar 18 11:03:16 crc kubenswrapper[4778]: I0318 11:03:16.187798 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9"
Mar 18 11:03:16 crc kubenswrapper[4778]: E0318 11:03:16.188124 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:03:23 crc kubenswrapper[4778]: I0318 11:03:23.137736 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-22c9p_8b636ef7-4b85-4506-bb2a-f89bee9b028d/nmstate-console-plugin/0.log"
Mar 18 11:03:23 crc kubenswrapper[4778]: I0318 11:03:23.293535 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-5thsf_5b97fa25-4d3d-4664-a5fc-41c98bbd272f/nmstate-handler/0.log"
Mar 18 11:03:23 crc kubenswrapper[4778]: I0318 11:03:23.316211 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-wq8gr_71b50b27-6084-4693-acbc-d14f36759618/kube-rbac-proxy/0.log"
Mar 18 11:03:23 crc kubenswrapper[4778]: I0318 11:03:23.325812 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-wq8gr_71b50b27-6084-4693-acbc-d14f36759618/nmstate-metrics/0.log"
Mar 18 11:03:23 crc kubenswrapper[4778]: I0318 11:03:23.480175 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-sr9ls_1dd7ccb2-0dca-4a6d-87f7-195b0ae0f9fe/nmstate-operator/0.log"
Mar 18 11:03:23 crc kubenswrapper[4778]: I0318 11:03:23.524647 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-thw7f_5961b98d-a41a-4ceb-bb71-4bf3a0fc854d/nmstate-webhook/0.log"
Mar 18 11:03:23 crc kubenswrapper[4778]: I0318 11:03:23.837338 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sdl4q"
Mar 18 11:03:23 crc kubenswrapper[4778]: I0318 11:03:23.887356 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sdl4q"
Mar 18 11:03:24 crc kubenswrapper[4778]: I0318 11:03:24.074830 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sdl4q"]
Mar 18 11:03:25 crc kubenswrapper[4778]: I0318 11:03:25.027633 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sdl4q" podUID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" containerName="registry-server" containerID="cri-o://0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c" gracePeriod=2
Mar 18 11:03:25 crc kubenswrapper[4778]: I0318 11:03:25.544783 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sdl4q"
Mar 18 11:03:25 crc kubenswrapper[4778]: I0318 11:03:25.666959 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tsld\" (UniqueName: \"kubernetes.io/projected/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-kube-api-access-8tsld\") pod \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\" (UID: \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\") "
Mar 18 11:03:25 crc kubenswrapper[4778]: I0318 11:03:25.667054 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-catalog-content\") pod \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\" (UID: \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\") "
Mar 18 11:03:25 crc kubenswrapper[4778]: I0318 11:03:25.667317 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-utilities\") pod \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\" (UID: \"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898\") "
Mar 18 11:03:25 crc kubenswrapper[4778]: I0318 11:03:25.667925 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-utilities" (OuterVolumeSpecName: "utilities") pod "6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" (UID: "6a9d0b8e-5cb6-43f2-87ce-09acac1ff898"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 11:03:25 crc kubenswrapper[4778]: I0318 11:03:25.678373 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-kube-api-access-8tsld" (OuterVolumeSpecName: "kube-api-access-8tsld") pod "6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" (UID: "6a9d0b8e-5cb6-43f2-87ce-09acac1ff898"). InnerVolumeSpecName "kube-api-access-8tsld". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 11:03:25 crc kubenswrapper[4778]: I0318 11:03:25.726662 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" (UID: "6a9d0b8e-5cb6-43f2-87ce-09acac1ff898"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 11:03:25 crc kubenswrapper[4778]: I0318 11:03:25.769631 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 11:03:25 crc kubenswrapper[4778]: I0318 11:03:25.769665 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tsld\" (UniqueName: \"kubernetes.io/projected/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-kube-api-access-8tsld\") on node \"crc\" DevicePath \"\""
Mar 18 11:03:25 crc kubenswrapper[4778]: I0318 11:03:25.769692 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.048616 4778 generic.go:334] "Generic (PLEG): container finished" podID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" containerID="0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c" exitCode=0
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.048666 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdl4q" event={"ID":"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898","Type":"ContainerDied","Data":"0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c"}
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.048698 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdl4q" event={"ID":"6a9d0b8e-5cb6-43f2-87ce-09acac1ff898","Type":"ContainerDied","Data":"b36d40a379b90ece19ca99233599298bef161851bf4285fc5a21ee921ddbd7a9"}
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.048722 4778 scope.go:117] "RemoveContainer" containerID="0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c"
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.048731 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sdl4q"
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.070999 4778 scope.go:117] "RemoveContainer" containerID="97ccf33ab02fe57f3395b49a1256327bc87fc922812df8dc59fe699275b7e205"
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.102242 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sdl4q"]
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.124024 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sdl4q"]
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.132495 4778 scope.go:117] "RemoveContainer" containerID="8788ccd3f209681f25926ae25879d3515f3eeceb29aa0ebf20026745bae1e917"
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.175645 4778 scope.go:117] "RemoveContainer" containerID="0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c"
Mar 18 11:03:26 crc kubenswrapper[4778]: E0318 11:03:26.176155 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c\": container with ID starting with 0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c not found: ID does not exist" containerID="0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c"
Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.176255 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c"} err="failed to get container status \"0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c\": rpc error: code = NotFound desc = could not find container \"0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c\": container with ID starting with 0f75e38733cf9508311024d8b8f5f44983a2a31d72e844e8f39db1cf03007a3c not
found: ID does not exist" Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.176291 4778 scope.go:117] "RemoveContainer" containerID="97ccf33ab02fe57f3395b49a1256327bc87fc922812df8dc59fe699275b7e205" Mar 18 11:03:26 crc kubenswrapper[4778]: E0318 11:03:26.176798 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97ccf33ab02fe57f3395b49a1256327bc87fc922812df8dc59fe699275b7e205\": container with ID starting with 97ccf33ab02fe57f3395b49a1256327bc87fc922812df8dc59fe699275b7e205 not found: ID does not exist" containerID="97ccf33ab02fe57f3395b49a1256327bc87fc922812df8dc59fe699275b7e205" Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.176834 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97ccf33ab02fe57f3395b49a1256327bc87fc922812df8dc59fe699275b7e205"} err="failed to get container status \"97ccf33ab02fe57f3395b49a1256327bc87fc922812df8dc59fe699275b7e205\": rpc error: code = NotFound desc = could not find container \"97ccf33ab02fe57f3395b49a1256327bc87fc922812df8dc59fe699275b7e205\": container with ID starting with 97ccf33ab02fe57f3395b49a1256327bc87fc922812df8dc59fe699275b7e205 not found: ID does not exist" Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.176860 4778 scope.go:117] "RemoveContainer" containerID="8788ccd3f209681f25926ae25879d3515f3eeceb29aa0ebf20026745bae1e917" Mar 18 11:03:26 crc kubenswrapper[4778]: E0318 11:03:26.177273 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8788ccd3f209681f25926ae25879d3515f3eeceb29aa0ebf20026745bae1e917\": container with ID starting with 8788ccd3f209681f25926ae25879d3515f3eeceb29aa0ebf20026745bae1e917 not found: ID does not exist" containerID="8788ccd3f209681f25926ae25879d3515f3eeceb29aa0ebf20026745bae1e917" Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.177294 4778 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8788ccd3f209681f25926ae25879d3515f3eeceb29aa0ebf20026745bae1e917"} err="failed to get container status \"8788ccd3f209681f25926ae25879d3515f3eeceb29aa0ebf20026745bae1e917\": rpc error: code = NotFound desc = could not find container \"8788ccd3f209681f25926ae25879d3515f3eeceb29aa0ebf20026745bae1e917\": container with ID starting with 8788ccd3f209681f25926ae25879d3515f3eeceb29aa0ebf20026745bae1e917 not found: ID does not exist" Mar 18 11:03:26 crc kubenswrapper[4778]: I0318 11:03:26.198529 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" path="/var/lib/kubelet/pods/6a9d0b8e-5cb6-43f2-87ce-09acac1ff898/volumes" Mar 18 11:03:28 crc kubenswrapper[4778]: I0318 11:03:28.186630 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:03:28 crc kubenswrapper[4778]: E0318 11:03:28.187170 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:03:41 crc kubenswrapper[4778]: I0318 11:03:41.186622 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:03:41 crc kubenswrapper[4778]: E0318 11:03:41.187381 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:03:52 crc kubenswrapper[4778]: I0318 11:03:52.602235 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-sv9kd_1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c/kube-rbac-proxy/0.log" Mar 18 11:03:52 crc kubenswrapper[4778]: I0318 11:03:52.778709 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-sv9kd_1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c/controller/0.log" Mar 18 11:03:52 crc kubenswrapper[4778]: I0318 11:03:52.841166 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-frr-files/0.log" Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.004325 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-frr-files/0.log" Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.006689 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-reloader/0.log" Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.058763 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-metrics/0.log" Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.125734 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-reloader/0.log" Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.187627 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:03:53 crc kubenswrapper[4778]: E0318 11:03:53.188060 4778 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.272341 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-metrics/0.log" Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.281569 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-frr-files/0.log" Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.291753 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-reloader/0.log" Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.317364 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-metrics/0.log" Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.485359 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-frr-files/0.log" Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.514681 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-metrics/0.log" Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.537076 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/controller/0.log" Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.554139 4778 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-reloader/0.log" Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.714542 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/frr-metrics/0.log" Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.754875 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/kube-rbac-proxy-frr/0.log" Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.759007 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/kube-rbac-proxy/0.log" Mar 18 11:03:53 crc kubenswrapper[4778]: I0318 11:03:53.989184 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/reloader/0.log" Mar 18 11:03:54 crc kubenswrapper[4778]: I0318 11:03:54.013648 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-jrtjv_0f18e9f0-b3eb-440a-b035-ed8256df5ed9/frr-k8s-webhook-server/0.log" Mar 18 11:03:54 crc kubenswrapper[4778]: I0318 11:03:54.402301 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-78856dcdc4-9cltx_721ee07f-fded-43ab-9bb7-2e4e56c98515/manager/0.log" Mar 18 11:03:54 crc kubenswrapper[4778]: I0318 11:03:54.586917 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5b499db45c-c5tcr_75885bb8-adce-4801-8941-75042ab330ea/webhook-server/0.log" Mar 18 11:03:54 crc kubenswrapper[4778]: I0318 11:03:54.661181 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wd69x_1c97662e-d673-42c1-a6ad-75865ba2b8b6/kube-rbac-proxy/0.log" Mar 18 11:03:55 crc kubenswrapper[4778]: I0318 
11:03:55.256645 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wd69x_1c97662e-d673-42c1-a6ad-75865ba2b8b6/speaker/0.log" Mar 18 11:03:55 crc kubenswrapper[4778]: I0318 11:03:55.955989 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/frr/0.log" Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.169613 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563864-np9td"] Mar 18 11:04:00 crc kubenswrapper[4778]: E0318 11:04:00.170770 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" containerName="extract-content" Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.170792 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" containerName="extract-content" Mar 18 11:04:00 crc kubenswrapper[4778]: E0318 11:04:00.170842 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" containerName="registry-server" Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.170854 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" containerName="registry-server" Mar 18 11:04:00 crc kubenswrapper[4778]: E0318 11:04:00.170884 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" containerName="extract-utilities" Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.170896 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" containerName="extract-utilities" Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.171247 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a9d0b8e-5cb6-43f2-87ce-09acac1ff898" containerName="registry-server" Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.172016 
4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563864-np9td" Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.177710 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.178544 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.178549 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.184879 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563864-np9td"] Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.266697 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfp29\" (UniqueName: \"kubernetes.io/projected/93ef8df9-98a4-4897-918d-b573fc50f7bb-kube-api-access-tfp29\") pod \"auto-csr-approver-29563864-np9td\" (UID: \"93ef8df9-98a4-4897-918d-b573fc50f7bb\") " pod="openshift-infra/auto-csr-approver-29563864-np9td" Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.368221 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfp29\" (UniqueName: \"kubernetes.io/projected/93ef8df9-98a4-4897-918d-b573fc50f7bb-kube-api-access-tfp29\") pod \"auto-csr-approver-29563864-np9td\" (UID: \"93ef8df9-98a4-4897-918d-b573fc50f7bb\") " pod="openshift-infra/auto-csr-approver-29563864-np9td" Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.392814 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfp29\" (UniqueName: \"kubernetes.io/projected/93ef8df9-98a4-4897-918d-b573fc50f7bb-kube-api-access-tfp29\") pod \"auto-csr-approver-29563864-np9td\" (UID: 
\"93ef8df9-98a4-4897-918d-b573fc50f7bb\") " pod="openshift-infra/auto-csr-approver-29563864-np9td" Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.510734 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563864-np9td" Mar 18 11:04:00 crc kubenswrapper[4778]: I0318 11:04:00.983328 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563864-np9td"] Mar 18 11:04:01 crc kubenswrapper[4778]: I0318 11:04:01.420525 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563864-np9td" event={"ID":"93ef8df9-98a4-4897-918d-b573fc50f7bb","Type":"ContainerStarted","Data":"0d92a478b867c5082f49cd2e833809c8ef4e6522acb91242c972c6f922ac9da2"} Mar 18 11:04:02 crc kubenswrapper[4778]: I0318 11:04:02.429114 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563864-np9td" event={"ID":"93ef8df9-98a4-4897-918d-b573fc50f7bb","Type":"ContainerStarted","Data":"806be4b53bf081091f4914e3382dec40c771f43c71215c8124342f2c0296cb37"} Mar 18 11:04:03 crc kubenswrapper[4778]: I0318 11:04:03.439428 4778 generic.go:334] "Generic (PLEG): container finished" podID="93ef8df9-98a4-4897-918d-b573fc50f7bb" containerID="806be4b53bf081091f4914e3382dec40c771f43c71215c8124342f2c0296cb37" exitCode=0 Mar 18 11:04:03 crc kubenswrapper[4778]: I0318 11:04:03.439534 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563864-np9td" event={"ID":"93ef8df9-98a4-4897-918d-b573fc50f7bb","Type":"ContainerDied","Data":"806be4b53bf081091f4914e3382dec40c771f43c71215c8124342f2c0296cb37"} Mar 18 11:04:04 crc kubenswrapper[4778]: I0318 11:04:04.783826 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563864-np9td" Mar 18 11:04:04 crc kubenswrapper[4778]: I0318 11:04:04.873873 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfp29\" (UniqueName: \"kubernetes.io/projected/93ef8df9-98a4-4897-918d-b573fc50f7bb-kube-api-access-tfp29\") pod \"93ef8df9-98a4-4897-918d-b573fc50f7bb\" (UID: \"93ef8df9-98a4-4897-918d-b573fc50f7bb\") " Mar 18 11:04:04 crc kubenswrapper[4778]: I0318 11:04:04.885489 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93ef8df9-98a4-4897-918d-b573fc50f7bb-kube-api-access-tfp29" (OuterVolumeSpecName: "kube-api-access-tfp29") pod "93ef8df9-98a4-4897-918d-b573fc50f7bb" (UID: "93ef8df9-98a4-4897-918d-b573fc50f7bb"). InnerVolumeSpecName "kube-api-access-tfp29". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:04:04 crc kubenswrapper[4778]: I0318 11:04:04.976933 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfp29\" (UniqueName: \"kubernetes.io/projected/93ef8df9-98a4-4897-918d-b573fc50f7bb-kube-api-access-tfp29\") on node \"crc\" DevicePath \"\"" Mar 18 11:04:05 crc kubenswrapper[4778]: I0318 11:04:05.463489 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563864-np9td" event={"ID":"93ef8df9-98a4-4897-918d-b573fc50f7bb","Type":"ContainerDied","Data":"0d92a478b867c5082f49cd2e833809c8ef4e6522acb91242c972c6f922ac9da2"} Mar 18 11:04:05 crc kubenswrapper[4778]: I0318 11:04:05.463530 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d92a478b867c5082f49cd2e833809c8ef4e6522acb91242c972c6f922ac9da2" Mar 18 11:04:05 crc kubenswrapper[4778]: I0318 11:04:05.463565 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563864-np9td" Mar 18 11:04:05 crc kubenswrapper[4778]: I0318 11:04:05.880309 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563858-mb4zj"] Mar 18 11:04:05 crc kubenswrapper[4778]: I0318 11:04:05.891485 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563858-mb4zj"] Mar 18 11:04:06 crc kubenswrapper[4778]: I0318 11:04:06.225872 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e4f7f22-f4dd-4291-b26b-1a54380c3851" path="/var/lib/kubelet/pods/9e4f7f22-f4dd-4291-b26b-1a54380c3851/volumes" Mar 18 11:04:07 crc kubenswrapper[4778]: I0318 11:04:07.188295 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:04:07 crc kubenswrapper[4778]: E0318 11:04:07.188701 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:04:09 crc kubenswrapper[4778]: I0318 11:04:09.090145 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/util/0.log" Mar 18 11:04:09 crc kubenswrapper[4778]: I0318 11:04:09.323396 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/util/0.log" Mar 18 11:04:09 crc kubenswrapper[4778]: I0318 11:04:09.366498 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/pull/0.log" Mar 18 11:04:09 crc kubenswrapper[4778]: I0318 11:04:09.387314 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/pull/0.log" Mar 18 11:04:09 crc kubenswrapper[4778]: I0318 11:04:09.535219 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/extract/0.log" Mar 18 11:04:09 crc kubenswrapper[4778]: I0318 11:04:09.560789 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/util/0.log" Mar 18 11:04:09 crc kubenswrapper[4778]: I0318 11:04:09.599604 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/pull/0.log" Mar 18 11:04:09 crc kubenswrapper[4778]: I0318 11:04:09.701448 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/util/0.log" Mar 18 11:04:09 crc kubenswrapper[4778]: I0318 11:04:09.884458 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/pull/0.log" Mar 18 11:04:09 crc kubenswrapper[4778]: I0318 11:04:09.893135 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/pull/0.log" Mar 18 
11:04:09 crc kubenswrapper[4778]: I0318 11:04:09.915039 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/util/0.log" Mar 18 11:04:10 crc kubenswrapper[4778]: I0318 11:04:10.076785 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/extract/0.log" Mar 18 11:04:10 crc kubenswrapper[4778]: I0318 11:04:10.079049 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/pull/0.log" Mar 18 11:04:10 crc kubenswrapper[4778]: I0318 11:04:10.093725 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/util/0.log" Mar 18 11:04:10 crc kubenswrapper[4778]: I0318 11:04:10.248574 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/extract-utilities/0.log" Mar 18 11:04:10 crc kubenswrapper[4778]: I0318 11:04:10.409133 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/extract-utilities/0.log" Mar 18 11:04:10 crc kubenswrapper[4778]: I0318 11:04:10.425892 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/extract-content/0.log" Mar 18 11:04:10 crc kubenswrapper[4778]: I0318 11:04:10.425902 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/extract-content/0.log" Mar 18 
11:04:10 crc kubenswrapper[4778]: I0318 11:04:10.664019 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/extract-content/0.log" Mar 18 11:04:10 crc kubenswrapper[4778]: I0318 11:04:10.741544 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/extract-utilities/0.log" Mar 18 11:04:10 crc kubenswrapper[4778]: I0318 11:04:10.939492 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/extract-utilities/0.log" Mar 18 11:04:11 crc kubenswrapper[4778]: I0318 11:04:11.233191 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/extract-content/0.log" Mar 18 11:04:11 crc kubenswrapper[4778]: I0318 11:04:11.273374 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/extract-utilities/0.log" Mar 18 11:04:11 crc kubenswrapper[4778]: I0318 11:04:11.326967 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/extract-content/0.log" Mar 18 11:04:11 crc kubenswrapper[4778]: I0318 11:04:11.518027 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/extract-utilities/0.log" Mar 18 11:04:11 crc kubenswrapper[4778]: I0318 11:04:11.550403 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/extract-content/0.log" Mar 18 11:04:11 crc kubenswrapper[4778]: I0318 11:04:11.734604 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/registry-server/0.log" Mar 18 11:04:11 crc kubenswrapper[4778]: I0318 11:04:11.776889 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-jj774_e037e8cd-1543-49a8-9389-4cc6f440c4b3/marketplace-operator/0.log" Mar 18 11:04:11 crc kubenswrapper[4778]: I0318 11:04:11.940823 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/extract-utilities/0.log" Mar 18 11:04:12 crc kubenswrapper[4778]: I0318 11:04:12.207787 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/extract-content/0.log" Mar 18 11:04:12 crc kubenswrapper[4778]: I0318 11:04:12.292414 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/extract-content/0.log" Mar 18 11:04:12 crc kubenswrapper[4778]: I0318 11:04:12.298627 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/extract-utilities/0.log" Mar 18 11:04:12 crc kubenswrapper[4778]: I0318 11:04:12.368101 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/registry-server/0.log" Mar 18 11:04:12 crc kubenswrapper[4778]: I0318 11:04:12.504603 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/extract-content/0.log" Mar 18 11:04:12 crc kubenswrapper[4778]: I0318 11:04:12.533299 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/extract-utilities/0.log" Mar 18 11:04:12 crc kubenswrapper[4778]: I0318 11:04:12.740594 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/extract-utilities/0.log" Mar 18 11:04:12 crc kubenswrapper[4778]: I0318 11:04:12.821840 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/registry-server/0.log" Mar 18 11:04:12 crc kubenswrapper[4778]: I0318 11:04:12.949158 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/extract-utilities/0.log" Mar 18 11:04:12 crc kubenswrapper[4778]: I0318 11:04:12.949239 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/extract-content/0.log" Mar 18 11:04:12 crc kubenswrapper[4778]: I0318 11:04:12.949162 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/extract-content/0.log" Mar 18 11:04:13 crc kubenswrapper[4778]: I0318 11:04:13.160947 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/extract-utilities/0.log" Mar 18 11:04:13 crc kubenswrapper[4778]: I0318 11:04:13.198966 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/extract-content/0.log" Mar 18 11:04:13 crc kubenswrapper[4778]: I0318 11:04:13.996901 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/registry-server/0.log" Mar 18 
11:04:22 crc kubenswrapper[4778]: I0318 11:04:22.187428 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:04:22 crc kubenswrapper[4778]: E0318 11:04:22.188323 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:04:35 crc kubenswrapper[4778]: I0318 11:04:35.187381 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:04:35 crc kubenswrapper[4778]: E0318 11:04:35.187406 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 11:04:35 crc kubenswrapper[4778]: E0318 11:04:35.188302 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:04:38 crc kubenswrapper[4778]: I0318 11:04:38.053738 4778 scope.go:117] "RemoveContainer" containerID="301c31c55dd167d2c6c06a6c3d13b7a706f6ed65cd7e2a490dde753952b7fad3" Mar 18 11:04:47 crc kubenswrapper[4778]: I0318 11:04:47.187902 4778 scope.go:117] "RemoveContainer" 
containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:04:47 crc kubenswrapper[4778]: E0318 11:04:47.188799 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:04:58 crc kubenswrapper[4778]: I0318 11:04:58.191750 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:04:58 crc kubenswrapper[4778]: E0318 11:04:58.192936 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:05:09 crc kubenswrapper[4778]: I0318 11:05:09.187806 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:05:09 crc kubenswrapper[4778]: E0318 11:05:09.188561 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:05:24 crc kubenswrapper[4778]: I0318 11:05:24.207535 4778 scope.go:117] 
"RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:05:24 crc kubenswrapper[4778]: E0318 11:05:24.208880 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:05:36 crc kubenswrapper[4778]: E0318 11:05:36.186924 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 11:05:39 crc kubenswrapper[4778]: I0318 11:05:39.188821 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:05:39 crc kubenswrapper[4778]: E0318 11:05:39.189601 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:05:52 crc kubenswrapper[4778]: I0318 11:05:52.188245 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:05:52 crc kubenswrapper[4778]: E0318 11:05:52.189214 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.170857 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563866-8zx72"] Mar 18 11:06:00 crc kubenswrapper[4778]: E0318 11:06:00.171764 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ef8df9-98a4-4897-918d-b573fc50f7bb" containerName="oc" Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.171776 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ef8df9-98a4-4897-918d-b573fc50f7bb" containerName="oc" Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.171931 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="93ef8df9-98a4-4897-918d-b573fc50f7bb" containerName="oc" Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.172595 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563866-8zx72" Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.178483 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.178826 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.179139 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.209618 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563866-8zx72"] Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.262342 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qvm2\" (UniqueName: \"kubernetes.io/projected/6b734773-4a1f-4acd-80e9-e3cd0cf14c2c-kube-api-access-9qvm2\") pod \"auto-csr-approver-29563866-8zx72\" (UID: \"6b734773-4a1f-4acd-80e9-e3cd0cf14c2c\") " pod="openshift-infra/auto-csr-approver-29563866-8zx72" Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.364528 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qvm2\" (UniqueName: \"kubernetes.io/projected/6b734773-4a1f-4acd-80e9-e3cd0cf14c2c-kube-api-access-9qvm2\") pod \"auto-csr-approver-29563866-8zx72\" (UID: \"6b734773-4a1f-4acd-80e9-e3cd0cf14c2c\") " pod="openshift-infra/auto-csr-approver-29563866-8zx72" Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.395983 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qvm2\" (UniqueName: \"kubernetes.io/projected/6b734773-4a1f-4acd-80e9-e3cd0cf14c2c-kube-api-access-9qvm2\") pod \"auto-csr-approver-29563866-8zx72\" (UID: \"6b734773-4a1f-4acd-80e9-e3cd0cf14c2c\") " 
pod="openshift-infra/auto-csr-approver-29563866-8zx72" Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.506621 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563866-8zx72" Mar 18 11:06:00 crc kubenswrapper[4778]: I0318 11:06:00.994456 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563866-8zx72"] Mar 18 11:06:00 crc kubenswrapper[4778]: W0318 11:06:00.996455 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b734773_4a1f_4acd_80e9_e3cd0cf14c2c.slice/crio-aed187c552143de0bae81a1d26ef671cf9348f4b261a733568bda00e48b58d34 WatchSource:0}: Error finding container aed187c552143de0bae81a1d26ef671cf9348f4b261a733568bda00e48b58d34: Status 404 returned error can't find the container with id aed187c552143de0bae81a1d26ef671cf9348f4b261a733568bda00e48b58d34 Mar 18 11:06:01 crc kubenswrapper[4778]: I0318 11:06:01.638223 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563866-8zx72" event={"ID":"6b734773-4a1f-4acd-80e9-e3cd0cf14c2c","Type":"ContainerStarted","Data":"aed187c552143de0bae81a1d26ef671cf9348f4b261a733568bda00e48b58d34"} Mar 18 11:06:02 crc kubenswrapper[4778]: I0318 11:06:02.656953 4778 generic.go:334] "Generic (PLEG): container finished" podID="6b734773-4a1f-4acd-80e9-e3cd0cf14c2c" containerID="9ce4ad858c60f25a18c86f0360777510f04c706cb5eafb4da8787fc9df1829e5" exitCode=0 Mar 18 11:06:02 crc kubenswrapper[4778]: I0318 11:06:02.657031 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563866-8zx72" event={"ID":"6b734773-4a1f-4acd-80e9-e3cd0cf14c2c","Type":"ContainerDied","Data":"9ce4ad858c60f25a18c86f0360777510f04c706cb5eafb4da8787fc9df1829e5"} Mar 18 11:06:03 crc kubenswrapper[4778]: I0318 11:06:03.189368 4778 scope.go:117] "RemoveContainer" 
containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:06:03 crc kubenswrapper[4778]: E0318 11:06:03.189802 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:06:04 crc kubenswrapper[4778]: I0318 11:06:04.011019 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563866-8zx72" Mar 18 11:06:04 crc kubenswrapper[4778]: I0318 11:06:04.177735 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qvm2\" (UniqueName: \"kubernetes.io/projected/6b734773-4a1f-4acd-80e9-e3cd0cf14c2c-kube-api-access-9qvm2\") pod \"6b734773-4a1f-4acd-80e9-e3cd0cf14c2c\" (UID: \"6b734773-4a1f-4acd-80e9-e3cd0cf14c2c\") " Mar 18 11:06:04 crc kubenswrapper[4778]: I0318 11:06:04.183564 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b734773-4a1f-4acd-80e9-e3cd0cf14c2c-kube-api-access-9qvm2" (OuterVolumeSpecName: "kube-api-access-9qvm2") pod "6b734773-4a1f-4acd-80e9-e3cd0cf14c2c" (UID: "6b734773-4a1f-4acd-80e9-e3cd0cf14c2c"). InnerVolumeSpecName "kube-api-access-9qvm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:06:04 crc kubenswrapper[4778]: I0318 11:06:04.283429 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qvm2\" (UniqueName: \"kubernetes.io/projected/6b734773-4a1f-4acd-80e9-e3cd0cf14c2c-kube-api-access-9qvm2\") on node \"crc\" DevicePath \"\"" Mar 18 11:06:04 crc kubenswrapper[4778]: I0318 11:06:04.676559 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563866-8zx72" event={"ID":"6b734773-4a1f-4acd-80e9-e3cd0cf14c2c","Type":"ContainerDied","Data":"aed187c552143de0bae81a1d26ef671cf9348f4b261a733568bda00e48b58d34"} Mar 18 11:06:04 crc kubenswrapper[4778]: I0318 11:06:04.676635 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aed187c552143de0bae81a1d26ef671cf9348f4b261a733568bda00e48b58d34" Mar 18 11:06:04 crc kubenswrapper[4778]: I0318 11:06:04.676752 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563866-8zx72" Mar 18 11:06:05 crc kubenswrapper[4778]: I0318 11:06:05.106981 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563860-9p79f"] Mar 18 11:06:05 crc kubenswrapper[4778]: I0318 11:06:05.116482 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563860-9p79f"] Mar 18 11:06:06 crc kubenswrapper[4778]: I0318 11:06:06.200000 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bbe37de-66b2-4c42-a72f-92155eb2edb9" path="/var/lib/kubelet/pods/9bbe37de-66b2-4c42-a72f-92155eb2edb9/volumes" Mar 18 11:06:18 crc kubenswrapper[4778]: I0318 11:06:18.188014 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:06:18 crc kubenswrapper[4778]: E0318 11:06:18.188931 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:06:29 crc kubenswrapper[4778]: I0318 11:06:29.187245 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:06:29 crc kubenswrapper[4778]: E0318 11:06:29.188006 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.475090 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k4rg6"] Mar 18 11:06:30 crc kubenswrapper[4778]: E0318 11:06:30.476091 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b734773-4a1f-4acd-80e9-e3cd0cf14c2c" containerName="oc" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.476112 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b734773-4a1f-4acd-80e9-e3cd0cf14c2c" containerName="oc" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.476501 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b734773-4a1f-4acd-80e9-e3cd0cf14c2c" containerName="oc" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.481612 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.506047 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k4rg6"] Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.589269 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-catalog-content\") pod \"redhat-marketplace-k4rg6\" (UID: \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\") " pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.589503 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9j59\" (UniqueName: \"kubernetes.io/projected/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-kube-api-access-d9j59\") pod \"redhat-marketplace-k4rg6\" (UID: \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\") " pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.589748 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-utilities\") pod \"redhat-marketplace-k4rg6\" (UID: \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\") " pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.691456 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-utilities\") pod \"redhat-marketplace-k4rg6\" (UID: \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\") " pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.691650 4778 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-catalog-content\") pod \"redhat-marketplace-k4rg6\" (UID: \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\") " pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.691723 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9j59\" (UniqueName: \"kubernetes.io/projected/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-kube-api-access-d9j59\") pod \"redhat-marketplace-k4rg6\" (UID: \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\") " pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.692099 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-utilities\") pod \"redhat-marketplace-k4rg6\" (UID: \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\") " pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.692227 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-catalog-content\") pod \"redhat-marketplace-k4rg6\" (UID: \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\") " pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.715210 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9j59\" (UniqueName: \"kubernetes.io/projected/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-kube-api-access-d9j59\") pod \"redhat-marketplace-k4rg6\" (UID: \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\") " pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:30 crc kubenswrapper[4778]: I0318 11:06:30.822526 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:31 crc kubenswrapper[4778]: W0318 11:06:31.327562 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc748da1f_65e2_4349_86f3_bfcc90cc7d1c.slice/crio-49bfb1e3c7a826c25e45fd281397eec07f7f2e8c7d44afd520b898a0aed12985 WatchSource:0}: Error finding container 49bfb1e3c7a826c25e45fd281397eec07f7f2e8c7d44afd520b898a0aed12985: Status 404 returned error can't find the container with id 49bfb1e3c7a826c25e45fd281397eec07f7f2e8c7d44afd520b898a0aed12985 Mar 18 11:06:31 crc kubenswrapper[4778]: I0318 11:06:31.349256 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k4rg6"] Mar 18 11:06:31 crc kubenswrapper[4778]: I0318 11:06:31.973551 4778 generic.go:334] "Generic (PLEG): container finished" podID="c748da1f-65e2-4349-86f3-bfcc90cc7d1c" containerID="e54ff6f208d0d33d6784f69f2447bff85e08c2453d19d2936aa41959955180e3" exitCode=0 Mar 18 11:06:31 crc kubenswrapper[4778]: I0318 11:06:31.973883 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4rg6" event={"ID":"c748da1f-65e2-4349-86f3-bfcc90cc7d1c","Type":"ContainerDied","Data":"e54ff6f208d0d33d6784f69f2447bff85e08c2453d19d2936aa41959955180e3"} Mar 18 11:06:31 crc kubenswrapper[4778]: I0318 11:06:31.974057 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4rg6" event={"ID":"c748da1f-65e2-4349-86f3-bfcc90cc7d1c","Type":"ContainerStarted","Data":"49bfb1e3c7a826c25e45fd281397eec07f7f2e8c7d44afd520b898a0aed12985"} Mar 18 11:06:31 crc kubenswrapper[4778]: I0318 11:06:31.976870 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 11:06:31 crc kubenswrapper[4778]: I0318 11:06:31.978349 4778 generic.go:334] "Generic (PLEG): container finished" 
podID="2c2e2094-7c48-4653-8b53-95483d470344" containerID="8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa" exitCode=0 Mar 18 11:06:31 crc kubenswrapper[4778]: I0318 11:06:31.978416 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6w9k/must-gather-5mjwn" event={"ID":"2c2e2094-7c48-4653-8b53-95483d470344","Type":"ContainerDied","Data":"8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa"} Mar 18 11:06:31 crc kubenswrapper[4778]: I0318 11:06:31.979306 4778 scope.go:117] "RemoveContainer" containerID="8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa" Mar 18 11:06:32 crc kubenswrapper[4778]: I0318 11:06:32.609445 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n6w9k_must-gather-5mjwn_2c2e2094-7c48-4653-8b53-95483d470344/gather/0.log" Mar 18 11:06:34 crc kubenswrapper[4778]: I0318 11:06:34.011215 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4rg6" event={"ID":"c748da1f-65e2-4349-86f3-bfcc90cc7d1c","Type":"ContainerStarted","Data":"7162033ee1419e13255237febc7c37c15b21d0ea8202e633b3d4cda10f13e860"} Mar 18 11:06:35 crc kubenswrapper[4778]: I0318 11:06:35.031880 4778 generic.go:334] "Generic (PLEG): container finished" podID="c748da1f-65e2-4349-86f3-bfcc90cc7d1c" containerID="7162033ee1419e13255237febc7c37c15b21d0ea8202e633b3d4cda10f13e860" exitCode=0 Mar 18 11:06:35 crc kubenswrapper[4778]: I0318 11:06:35.031970 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4rg6" event={"ID":"c748da1f-65e2-4349-86f3-bfcc90cc7d1c","Type":"ContainerDied","Data":"7162033ee1419e13255237febc7c37c15b21d0ea8202e633b3d4cda10f13e860"} Mar 18 11:06:36 crc kubenswrapper[4778]: I0318 11:06:36.048002 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4rg6" 
event={"ID":"c748da1f-65e2-4349-86f3-bfcc90cc7d1c","Type":"ContainerStarted","Data":"dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328"} Mar 18 11:06:36 crc kubenswrapper[4778]: I0318 11:06:36.086706 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k4rg6" podStartSLOduration=2.505150333 podStartE2EDuration="6.086685151s" podCreationTimestamp="2026-03-18 11:06:30 +0000 UTC" firstStartedPulling="2026-03-18 11:06:31.976429128 +0000 UTC m=+7458.551173998" lastFinishedPulling="2026-03-18 11:06:35.557963966 +0000 UTC m=+7462.132708816" observedRunningTime="2026-03-18 11:06:36.072044474 +0000 UTC m=+7462.646789334" watchObservedRunningTime="2026-03-18 11:06:36.086685151 +0000 UTC m=+7462.661430001" Mar 18 11:06:38 crc kubenswrapper[4778]: I0318 11:06:38.170360 4778 scope.go:117] "RemoveContainer" containerID="1bb8821bd2c18d4bb5e7f9c4c0784d606dc27180e5e74bcaf381cd0d404e43fd" Mar 18 11:06:38 crc kubenswrapper[4778]: I0318 11:06:38.224535 4778 scope.go:117] "RemoveContainer" containerID="3fd6d1f0823c046b310eb8beb463813250215ced92cb6e72ad8093250484ff70" Mar 18 11:06:40 crc kubenswrapper[4778]: I0318 11:06:40.822678 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:40 crc kubenswrapper[4778]: I0318 11:06:40.823073 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:40 crc kubenswrapper[4778]: I0318 11:06:40.877011 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.013019 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n6w9k/must-gather-5mjwn"] Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.013333 4778 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-must-gather-n6w9k/must-gather-5mjwn" podUID="2c2e2094-7c48-4653-8b53-95483d470344" containerName="copy" containerID="cri-o://e84a365d7048286a12c936f158d969a4ed07b93d0ced0643d0419cb198df5cec" gracePeriod=2 Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.026732 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n6w9k/must-gather-5mjwn"] Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.168370 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.218356 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k4rg6"] Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.506146 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n6w9k_must-gather-5mjwn_2c2e2094-7c48-4653-8b53-95483d470344/copy/0.log" Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.506706 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n6w9k/must-gather-5mjwn" Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.680306 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zvb6\" (UniqueName: \"kubernetes.io/projected/2c2e2094-7c48-4653-8b53-95483d470344-kube-api-access-6zvb6\") pod \"2c2e2094-7c48-4653-8b53-95483d470344\" (UID: \"2c2e2094-7c48-4653-8b53-95483d470344\") " Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.680362 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2c2e2094-7c48-4653-8b53-95483d470344-must-gather-output\") pod \"2c2e2094-7c48-4653-8b53-95483d470344\" (UID: \"2c2e2094-7c48-4653-8b53-95483d470344\") " Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.690970 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c2e2094-7c48-4653-8b53-95483d470344-kube-api-access-6zvb6" (OuterVolumeSpecName: "kube-api-access-6zvb6") pod "2c2e2094-7c48-4653-8b53-95483d470344" (UID: "2c2e2094-7c48-4653-8b53-95483d470344"). InnerVolumeSpecName "kube-api-access-6zvb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.782702 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zvb6\" (UniqueName: \"kubernetes.io/projected/2c2e2094-7c48-4653-8b53-95483d470344-kube-api-access-6zvb6\") on node \"crc\" DevicePath \"\"" Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.880743 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c2e2094-7c48-4653-8b53-95483d470344-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2c2e2094-7c48-4653-8b53-95483d470344" (UID: "2c2e2094-7c48-4653-8b53-95483d470344"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:06:41 crc kubenswrapper[4778]: I0318 11:06:41.885180 4778 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2c2e2094-7c48-4653-8b53-95483d470344-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 18 11:06:42 crc kubenswrapper[4778]: I0318 11:06:42.113121 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n6w9k_must-gather-5mjwn_2c2e2094-7c48-4653-8b53-95483d470344/copy/0.log" Mar 18 11:06:42 crc kubenswrapper[4778]: I0318 11:06:42.115807 4778 generic.go:334] "Generic (PLEG): container finished" podID="2c2e2094-7c48-4653-8b53-95483d470344" containerID="e84a365d7048286a12c936f158d969a4ed07b93d0ced0643d0419cb198df5cec" exitCode=143 Mar 18 11:06:42 crc kubenswrapper[4778]: I0318 11:06:42.115906 4778 scope.go:117] "RemoveContainer" containerID="e84a365d7048286a12c936f158d969a4ed07b93d0ced0643d0419cb198df5cec" Mar 18 11:06:42 crc kubenswrapper[4778]: I0318 11:06:42.115867 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n6w9k/must-gather-5mjwn" Mar 18 11:06:42 crc kubenswrapper[4778]: I0318 11:06:42.148437 4778 scope.go:117] "RemoveContainer" containerID="8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa" Mar 18 11:06:42 crc kubenswrapper[4778]: I0318 11:06:42.200611 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c2e2094-7c48-4653-8b53-95483d470344" path="/var/lib/kubelet/pods/2c2e2094-7c48-4653-8b53-95483d470344/volumes" Mar 18 11:06:42 crc kubenswrapper[4778]: I0318 11:06:42.257272 4778 scope.go:117] "RemoveContainer" containerID="e84a365d7048286a12c936f158d969a4ed07b93d0ced0643d0419cb198df5cec" Mar 18 11:06:42 crc kubenswrapper[4778]: E0318 11:06:42.258330 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e84a365d7048286a12c936f158d969a4ed07b93d0ced0643d0419cb198df5cec\": container with ID starting with e84a365d7048286a12c936f158d969a4ed07b93d0ced0643d0419cb198df5cec not found: ID does not exist" containerID="e84a365d7048286a12c936f158d969a4ed07b93d0ced0643d0419cb198df5cec" Mar 18 11:06:42 crc kubenswrapper[4778]: I0318 11:06:42.258366 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e84a365d7048286a12c936f158d969a4ed07b93d0ced0643d0419cb198df5cec"} err="failed to get container status \"e84a365d7048286a12c936f158d969a4ed07b93d0ced0643d0419cb198df5cec\": rpc error: code = NotFound desc = could not find container \"e84a365d7048286a12c936f158d969a4ed07b93d0ced0643d0419cb198df5cec\": container with ID starting with e84a365d7048286a12c936f158d969a4ed07b93d0ced0643d0419cb198df5cec not found: ID does not exist" Mar 18 11:06:42 crc kubenswrapper[4778]: I0318 11:06:42.258388 4778 scope.go:117] "RemoveContainer" containerID="8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa" Mar 18 11:06:42 crc kubenswrapper[4778]: E0318 11:06:42.258707 4778 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa\": container with ID starting with 8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa not found: ID does not exist" containerID="8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa" Mar 18 11:06:42 crc kubenswrapper[4778]: I0318 11:06:42.258726 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa"} err="failed to get container status \"8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa\": rpc error: code = NotFound desc = could not find container \"8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa\": container with ID starting with 8686611a27bab224c0258ff8e66091ce7b2ff3521e72e7159bd626e39a154eaa not found: ID does not exist" Mar 18 11:06:43 crc kubenswrapper[4778]: I0318 11:06:43.129371 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k4rg6" podUID="c748da1f-65e2-4349-86f3-bfcc90cc7d1c" containerName="registry-server" containerID="cri-o://dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328" gracePeriod=2 Mar 18 11:06:43 crc kubenswrapper[4778]: I0318 11:06:43.745181 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:43 crc kubenswrapper[4778]: I0318 11:06:43.926975 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-catalog-content\") pod \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\" (UID: \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\") " Mar 18 11:06:43 crc kubenswrapper[4778]: I0318 11:06:43.927094 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-utilities\") pod \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\" (UID: \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\") " Mar 18 11:06:43 crc kubenswrapper[4778]: I0318 11:06:43.927265 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9j59\" (UniqueName: \"kubernetes.io/projected/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-kube-api-access-d9j59\") pod \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\" (UID: \"c748da1f-65e2-4349-86f3-bfcc90cc7d1c\") " Mar 18 11:06:43 crc kubenswrapper[4778]: I0318 11:06:43.928127 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-utilities" (OuterVolumeSpecName: "utilities") pod "c748da1f-65e2-4349-86f3-bfcc90cc7d1c" (UID: "c748da1f-65e2-4349-86f3-bfcc90cc7d1c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:06:43 crc kubenswrapper[4778]: I0318 11:06:43.942340 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-kube-api-access-d9j59" (OuterVolumeSpecName: "kube-api-access-d9j59") pod "c748da1f-65e2-4349-86f3-bfcc90cc7d1c" (UID: "c748da1f-65e2-4349-86f3-bfcc90cc7d1c"). InnerVolumeSpecName "kube-api-access-d9j59". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:06:43 crc kubenswrapper[4778]: I0318 11:06:43.962389 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c748da1f-65e2-4349-86f3-bfcc90cc7d1c" (UID: "c748da1f-65e2-4349-86f3-bfcc90cc7d1c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.029938 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9j59\" (UniqueName: \"kubernetes.io/projected/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-kube-api-access-d9j59\") on node \"crc\" DevicePath \"\"" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.029978 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.029988 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c748da1f-65e2-4349-86f3-bfcc90cc7d1c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.146554 4778 generic.go:334] "Generic (PLEG): container finished" podID="c748da1f-65e2-4349-86f3-bfcc90cc7d1c" containerID="dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328" exitCode=0 Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.146602 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4rg6" event={"ID":"c748da1f-65e2-4349-86f3-bfcc90cc7d1c","Type":"ContainerDied","Data":"dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328"} Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.146628 4778 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k4rg6" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.146642 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4rg6" event={"ID":"c748da1f-65e2-4349-86f3-bfcc90cc7d1c","Type":"ContainerDied","Data":"49bfb1e3c7a826c25e45fd281397eec07f7f2e8c7d44afd520b898a0aed12985"} Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.146718 4778 scope.go:117] "RemoveContainer" containerID="dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.173423 4778 scope.go:117] "RemoveContainer" containerID="7162033ee1419e13255237febc7c37c15b21d0ea8202e633b3d4cda10f13e860" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.198111 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:06:44 crc kubenswrapper[4778]: E0318 11:06:44.198467 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.203595 4778 scope.go:117] "RemoveContainer" containerID="e54ff6f208d0d33d6784f69f2447bff85e08c2453d19d2936aa41959955180e3" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.218092 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k4rg6"] Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.218143 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k4rg6"] Mar 18 11:06:44 crc kubenswrapper[4778]: 
I0318 11:06:44.260859 4778 scope.go:117] "RemoveContainer" containerID="dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328" Mar 18 11:06:44 crc kubenswrapper[4778]: E0318 11:06:44.261372 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328\": container with ID starting with dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328 not found: ID does not exist" containerID="dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.261427 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328"} err="failed to get container status \"dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328\": rpc error: code = NotFound desc = could not find container \"dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328\": container with ID starting with dda12fa1ba2f30ff51455b457a44d6116d5bd481ece66d7eb371f0e91bf0d328 not found: ID does not exist" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.261458 4778 scope.go:117] "RemoveContainer" containerID="7162033ee1419e13255237febc7c37c15b21d0ea8202e633b3d4cda10f13e860" Mar 18 11:06:44 crc kubenswrapper[4778]: E0318 11:06:44.261762 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7162033ee1419e13255237febc7c37c15b21d0ea8202e633b3d4cda10f13e860\": container with ID starting with 7162033ee1419e13255237febc7c37c15b21d0ea8202e633b3d4cda10f13e860 not found: ID does not exist" containerID="7162033ee1419e13255237febc7c37c15b21d0ea8202e633b3d4cda10f13e860" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.261794 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7162033ee1419e13255237febc7c37c15b21d0ea8202e633b3d4cda10f13e860"} err="failed to get container status \"7162033ee1419e13255237febc7c37c15b21d0ea8202e633b3d4cda10f13e860\": rpc error: code = NotFound desc = could not find container \"7162033ee1419e13255237febc7c37c15b21d0ea8202e633b3d4cda10f13e860\": container with ID starting with 7162033ee1419e13255237febc7c37c15b21d0ea8202e633b3d4cda10f13e860 not found: ID does not exist" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.261809 4778 scope.go:117] "RemoveContainer" containerID="e54ff6f208d0d33d6784f69f2447bff85e08c2453d19d2936aa41959955180e3" Mar 18 11:06:44 crc kubenswrapper[4778]: E0318 11:06:44.262018 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e54ff6f208d0d33d6784f69f2447bff85e08c2453d19d2936aa41959955180e3\": container with ID starting with e54ff6f208d0d33d6784f69f2447bff85e08c2453d19d2936aa41959955180e3 not found: ID does not exist" containerID="e54ff6f208d0d33d6784f69f2447bff85e08c2453d19d2936aa41959955180e3" Mar 18 11:06:44 crc kubenswrapper[4778]: I0318 11:06:44.262046 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e54ff6f208d0d33d6784f69f2447bff85e08c2453d19d2936aa41959955180e3"} err="failed to get container status \"e54ff6f208d0d33d6784f69f2447bff85e08c2453d19d2936aa41959955180e3\": rpc error: code = NotFound desc = could not find container \"e54ff6f208d0d33d6784f69f2447bff85e08c2453d19d2936aa41959955180e3\": container with ID starting with e54ff6f208d0d33d6784f69f2447bff85e08c2453d19d2936aa41959955180e3 not found: ID does not exist" Mar 18 11:06:46 crc kubenswrapper[4778]: I0318 11:06:46.199672 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c748da1f-65e2-4349-86f3-bfcc90cc7d1c" path="/var/lib/kubelet/pods/c748da1f-65e2-4349-86f3-bfcc90cc7d1c/volumes" Mar 18 11:06:54 crc kubenswrapper[4778]: E0318 
11:06:54.193767 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 11:06:56 crc kubenswrapper[4778]: I0318 11:06:56.187288 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:06:56 crc kubenswrapper[4778]: E0318 11:06:56.187938 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:07:08 crc kubenswrapper[4778]: I0318 11:07:08.187617 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:07:08 crc kubenswrapper[4778]: I0318 11:07:08.479980 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"cbe46d0fa65f0ead7ad1789d13ec33e6d93d408d865f08e3bbc666ffb661db35"} Mar 18 11:07:38 crc kubenswrapper[4778]: I0318 11:07:38.304199 4778 scope.go:117] "RemoveContainer" containerID="4885ccf444eddc8a6ce122274d5f1476536fe5db0e3a880ac80a8141660c536e" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.158564 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563868-txd2q"] Mar 18 11:08:00 crc kubenswrapper[4778]: E0318 11:08:00.159982 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c748da1f-65e2-4349-86f3-bfcc90cc7d1c" 
containerName="extract-content" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.159997 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c748da1f-65e2-4349-86f3-bfcc90cc7d1c" containerName="extract-content" Mar 18 11:08:00 crc kubenswrapper[4778]: E0318 11:08:00.160014 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c748da1f-65e2-4349-86f3-bfcc90cc7d1c" containerName="registry-server" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.160021 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c748da1f-65e2-4349-86f3-bfcc90cc7d1c" containerName="registry-server" Mar 18 11:08:00 crc kubenswrapper[4778]: E0318 11:08:00.160031 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c748da1f-65e2-4349-86f3-bfcc90cc7d1c" containerName="extract-utilities" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.160039 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c748da1f-65e2-4349-86f3-bfcc90cc7d1c" containerName="extract-utilities" Mar 18 11:08:00 crc kubenswrapper[4778]: E0318 11:08:00.160060 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2e2094-7c48-4653-8b53-95483d470344" containerName="gather" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.160066 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2e2094-7c48-4653-8b53-95483d470344" containerName="gather" Mar 18 11:08:00 crc kubenswrapper[4778]: E0318 11:08:00.160099 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2e2094-7c48-4653-8b53-95483d470344" containerName="copy" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.160105 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2e2094-7c48-4653-8b53-95483d470344" containerName="copy" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.160334 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c748da1f-65e2-4349-86f3-bfcc90cc7d1c" containerName="registry-server" Mar 18 11:08:00 crc 
kubenswrapper[4778]: I0318 11:08:00.160347 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2e2094-7c48-4653-8b53-95483d470344" containerName="gather" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.160360 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2e2094-7c48-4653-8b53-95483d470344" containerName="copy" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.161215 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563868-txd2q" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.163876 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.163970 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.164799 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.170857 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563868-txd2q"] Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.268544 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w4fz\" (UniqueName: \"kubernetes.io/projected/48c91868-ef15-4d6d-8547-1b2849d7aa95-kube-api-access-4w4fz\") pod \"auto-csr-approver-29563868-txd2q\" (UID: \"48c91868-ef15-4d6d-8547-1b2849d7aa95\") " pod="openshift-infra/auto-csr-approver-29563868-txd2q" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.372171 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w4fz\" (UniqueName: \"kubernetes.io/projected/48c91868-ef15-4d6d-8547-1b2849d7aa95-kube-api-access-4w4fz\") pod 
\"auto-csr-approver-29563868-txd2q\" (UID: \"48c91868-ef15-4d6d-8547-1b2849d7aa95\") " pod="openshift-infra/auto-csr-approver-29563868-txd2q" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.397148 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w4fz\" (UniqueName: \"kubernetes.io/projected/48c91868-ef15-4d6d-8547-1b2849d7aa95-kube-api-access-4w4fz\") pod \"auto-csr-approver-29563868-txd2q\" (UID: \"48c91868-ef15-4d6d-8547-1b2849d7aa95\") " pod="openshift-infra/auto-csr-approver-29563868-txd2q" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.490534 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563868-txd2q" Mar 18 11:08:00 crc kubenswrapper[4778]: I0318 11:08:00.982470 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563868-txd2q"] Mar 18 11:08:01 crc kubenswrapper[4778]: I0318 11:08:01.062343 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563868-txd2q" event={"ID":"48c91868-ef15-4d6d-8547-1b2849d7aa95","Type":"ContainerStarted","Data":"741f413682316d7f270de9e47aff7afe5389dd5b05acec7112518188196cc08f"} Mar 18 11:08:03 crc kubenswrapper[4778]: I0318 11:08:03.083906 4778 generic.go:334] "Generic (PLEG): container finished" podID="48c91868-ef15-4d6d-8547-1b2849d7aa95" containerID="ecc2f8a6686d5391d07b662f53f7a3bdd9927adf67509a229601871555c0b456" exitCode=0 Mar 18 11:08:03 crc kubenswrapper[4778]: I0318 11:08:03.084131 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563868-txd2q" event={"ID":"48c91868-ef15-4d6d-8547-1b2849d7aa95","Type":"ContainerDied","Data":"ecc2f8a6686d5391d07b662f53f7a3bdd9927adf67509a229601871555c0b456"} Mar 18 11:08:03 crc kubenswrapper[4778]: E0318 11:08:03.188025 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" 
podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 11:08:04 crc kubenswrapper[4778]: I0318 11:08:04.573977 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563868-txd2q" Mar 18 11:08:04 crc kubenswrapper[4778]: I0318 11:08:04.680614 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w4fz\" (UniqueName: \"kubernetes.io/projected/48c91868-ef15-4d6d-8547-1b2849d7aa95-kube-api-access-4w4fz\") pod \"48c91868-ef15-4d6d-8547-1b2849d7aa95\" (UID: \"48c91868-ef15-4d6d-8547-1b2849d7aa95\") " Mar 18 11:08:04 crc kubenswrapper[4778]: I0318 11:08:04.690795 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48c91868-ef15-4d6d-8547-1b2849d7aa95-kube-api-access-4w4fz" (OuterVolumeSpecName: "kube-api-access-4w4fz") pod "48c91868-ef15-4d6d-8547-1b2849d7aa95" (UID: "48c91868-ef15-4d6d-8547-1b2849d7aa95"). InnerVolumeSpecName "kube-api-access-4w4fz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:08:04 crc kubenswrapper[4778]: I0318 11:08:04.783060 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w4fz\" (UniqueName: \"kubernetes.io/projected/48c91868-ef15-4d6d-8547-1b2849d7aa95-kube-api-access-4w4fz\") on node \"crc\" DevicePath \"\"" Mar 18 11:08:05 crc kubenswrapper[4778]: I0318 11:08:05.124440 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563868-txd2q" event={"ID":"48c91868-ef15-4d6d-8547-1b2849d7aa95","Type":"ContainerDied","Data":"741f413682316d7f270de9e47aff7afe5389dd5b05acec7112518188196cc08f"} Mar 18 11:08:05 crc kubenswrapper[4778]: I0318 11:08:05.124805 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="741f413682316d7f270de9e47aff7afe5389dd5b05acec7112518188196cc08f" Mar 18 11:08:05 crc kubenswrapper[4778]: I0318 11:08:05.124870 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563868-txd2q" Mar 18 11:08:05 crc kubenswrapper[4778]: E0318 11:08:05.366334 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48c91868_ef15_4d6d_8547_1b2849d7aa95.slice/crio-741f413682316d7f270de9e47aff7afe5389dd5b05acec7112518188196cc08f\": RecentStats: unable to find data in memory cache]" Mar 18 11:08:05 crc kubenswrapper[4778]: I0318 11:08:05.642552 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563862-flgc8"] Mar 18 11:08:05 crc kubenswrapper[4778]: I0318 11:08:05.653277 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563862-flgc8"] Mar 18 11:08:06 crc kubenswrapper[4778]: I0318 11:08:06.196541 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0e9b6093-6849-4ee1-829c-3893c8efc355" path="/var/lib/kubelet/pods/0e9b6093-6849-4ee1-829c-3893c8efc355/volumes" Mar 18 11:08:38 crc kubenswrapper[4778]: I0318 11:08:38.469533 4778 scope.go:117] "RemoveContainer" containerID="3545c5320c999ca132cdbecfec3fe0adacffa5fbc99319fc9409f6ba39ed60ac" Mar 18 11:09:30 crc kubenswrapper[4778]: I0318 11:09:30.147455 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 11:09:30 crc kubenswrapper[4778]: I0318 11:09:30.147983 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 11:09:33 crc kubenswrapper[4778]: E0318 11:09:33.187576 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.694166 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4v7jl/must-gather-8j576"] Mar 18 11:09:56 crc kubenswrapper[4778]: E0318 11:09:56.696296 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c91868-ef15-4d6d-8547-1b2849d7aa95" containerName="oc" Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.696421 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c91868-ef15-4d6d-8547-1b2849d7aa95" containerName="oc" Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.696743 4778 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="48c91868-ef15-4d6d-8547-1b2849d7aa95" containerName="oc" Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.698129 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4v7jl/must-gather-8j576" Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.703382 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4v7jl"/"kube-root-ca.crt" Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.703408 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4v7jl"/"openshift-service-ca.crt" Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.703398 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4v7jl"/"default-dockercfg-5dhft" Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.710481 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4v7jl/must-gather-8j576"] Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.876570 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/339d23a2-4cea-4331-b745-44219b471d41-must-gather-output\") pod \"must-gather-8j576\" (UID: \"339d23a2-4cea-4331-b745-44219b471d41\") " pod="openshift-must-gather-4v7jl/must-gather-8j576" Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.876644 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjcph\" (UniqueName: \"kubernetes.io/projected/339d23a2-4cea-4331-b745-44219b471d41-kube-api-access-cjcph\") pod \"must-gather-8j576\" (UID: \"339d23a2-4cea-4331-b745-44219b471d41\") " pod="openshift-must-gather-4v7jl/must-gather-8j576" Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.978972 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/339d23a2-4cea-4331-b745-44219b471d41-must-gather-output\") pod \"must-gather-8j576\" (UID: \"339d23a2-4cea-4331-b745-44219b471d41\") " pod="openshift-must-gather-4v7jl/must-gather-8j576" Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.979071 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjcph\" (UniqueName: \"kubernetes.io/projected/339d23a2-4cea-4331-b745-44219b471d41-kube-api-access-cjcph\") pod \"must-gather-8j576\" (UID: \"339d23a2-4cea-4331-b745-44219b471d41\") " pod="openshift-must-gather-4v7jl/must-gather-8j576" Mar 18 11:09:56 crc kubenswrapper[4778]: I0318 11:09:56.979496 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/339d23a2-4cea-4331-b745-44219b471d41-must-gather-output\") pod \"must-gather-8j576\" (UID: \"339d23a2-4cea-4331-b745-44219b471d41\") " pod="openshift-must-gather-4v7jl/must-gather-8j576" Mar 18 11:09:57 crc kubenswrapper[4778]: I0318 11:09:57.010132 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjcph\" (UniqueName: \"kubernetes.io/projected/339d23a2-4cea-4331-b745-44219b471d41-kube-api-access-cjcph\") pod \"must-gather-8j576\" (UID: \"339d23a2-4cea-4331-b745-44219b471d41\") " pod="openshift-must-gather-4v7jl/must-gather-8j576" Mar 18 11:09:57 crc kubenswrapper[4778]: I0318 11:09:57.019175 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4v7jl/must-gather-8j576" Mar 18 11:09:57 crc kubenswrapper[4778]: I0318 11:09:57.514711 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4v7jl/must-gather-8j576"] Mar 18 11:09:57 crc kubenswrapper[4778]: W0318 11:09:57.518463 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod339d23a2_4cea_4331_b745_44219b471d41.slice/crio-5bb4383e2e401a2463e2f6aeaf7fe15a8745ae6c18fb2b0acf1409a2e0af8ff0 WatchSource:0}: Error finding container 5bb4383e2e401a2463e2f6aeaf7fe15a8745ae6c18fb2b0acf1409a2e0af8ff0: Status 404 returned error can't find the container with id 5bb4383e2e401a2463e2f6aeaf7fe15a8745ae6c18fb2b0acf1409a2e0af8ff0 Mar 18 11:09:57 crc kubenswrapper[4778]: I0318 11:09:57.657353 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v7jl/must-gather-8j576" event={"ID":"339d23a2-4cea-4331-b745-44219b471d41","Type":"ContainerStarted","Data":"5bb4383e2e401a2463e2f6aeaf7fe15a8745ae6c18fb2b0acf1409a2e0af8ff0"} Mar 18 11:09:58 crc kubenswrapper[4778]: I0318 11:09:58.667273 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v7jl/must-gather-8j576" event={"ID":"339d23a2-4cea-4331-b745-44219b471d41","Type":"ContainerStarted","Data":"07c61ae0df4c4ec4035051a0490bff456734218dc4ec1c30fd5c71fabc9f0310"} Mar 18 11:09:58 crc kubenswrapper[4778]: I0318 11:09:58.667575 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v7jl/must-gather-8j576" event={"ID":"339d23a2-4cea-4331-b745-44219b471d41","Type":"ContainerStarted","Data":"376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0"} Mar 18 11:09:58 crc kubenswrapper[4778]: I0318 11:09:58.687888 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4v7jl/must-gather-8j576" podStartSLOduration=2.6878673060000002 
podStartE2EDuration="2.687867306s" podCreationTimestamp="2026-03-18 11:09:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:09:58.682682015 +0000 UTC m=+7665.257426875" watchObservedRunningTime="2026-03-18 11:09:58.687867306 +0000 UTC m=+7665.262612146" Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.147365 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.147942 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.153078 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563870-s7lhp"] Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.158138 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563870-s7lhp" Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.162598 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.162770 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.162933 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.163451 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563870-s7lhp"] Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.249257 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdld7\" (UniqueName: \"kubernetes.io/projected/08809b1c-c749-4734-9fc4-6a0a755aa9cd-kube-api-access-sdld7\") pod \"auto-csr-approver-29563870-s7lhp\" (UID: \"08809b1c-c749-4734-9fc4-6a0a755aa9cd\") " pod="openshift-infra/auto-csr-approver-29563870-s7lhp" Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.351658 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdld7\" (UniqueName: \"kubernetes.io/projected/08809b1c-c749-4734-9fc4-6a0a755aa9cd-kube-api-access-sdld7\") pod \"auto-csr-approver-29563870-s7lhp\" (UID: \"08809b1c-c749-4734-9fc4-6a0a755aa9cd\") " pod="openshift-infra/auto-csr-approver-29563870-s7lhp" Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.376530 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdld7\" (UniqueName: \"kubernetes.io/projected/08809b1c-c749-4734-9fc4-6a0a755aa9cd-kube-api-access-sdld7\") pod \"auto-csr-approver-29563870-s7lhp\" (UID: \"08809b1c-c749-4734-9fc4-6a0a755aa9cd\") " 
pod="openshift-infra/auto-csr-approver-29563870-s7lhp" Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.539703 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563870-s7lhp" Mar 18 11:10:00 crc kubenswrapper[4778]: I0318 11:10:00.993032 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563870-s7lhp"] Mar 18 11:10:01 crc kubenswrapper[4778]: I0318 11:10:01.696293 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563870-s7lhp" event={"ID":"08809b1c-c749-4734-9fc4-6a0a755aa9cd","Type":"ContainerStarted","Data":"e0f1b0864e9fe4a32c395a265fea54f835e4d107f28e0a2e27d19f71a2fefbd4"} Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.047824 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4v7jl/crc-debug-smwlp"] Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.048939 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4v7jl/crc-debug-smwlp" Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.184402 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m6bc\" (UniqueName: \"kubernetes.io/projected/939aac29-edd6-4d03-a4f5-59541aa99ecd-kube-api-access-5m6bc\") pod \"crc-debug-smwlp\" (UID: \"939aac29-edd6-4d03-a4f5-59541aa99ecd\") " pod="openshift-must-gather-4v7jl/crc-debug-smwlp" Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.184742 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/939aac29-edd6-4d03-a4f5-59541aa99ecd-host\") pod \"crc-debug-smwlp\" (UID: \"939aac29-edd6-4d03-a4f5-59541aa99ecd\") " pod="openshift-must-gather-4v7jl/crc-debug-smwlp" Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.287424 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m6bc\" (UniqueName: \"kubernetes.io/projected/939aac29-edd6-4d03-a4f5-59541aa99ecd-kube-api-access-5m6bc\") pod \"crc-debug-smwlp\" (UID: \"939aac29-edd6-4d03-a4f5-59541aa99ecd\") " pod="openshift-must-gather-4v7jl/crc-debug-smwlp" Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.287501 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/939aac29-edd6-4d03-a4f5-59541aa99ecd-host\") pod \"crc-debug-smwlp\" (UID: \"939aac29-edd6-4d03-a4f5-59541aa99ecd\") " pod="openshift-must-gather-4v7jl/crc-debug-smwlp" Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.288245 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/939aac29-edd6-4d03-a4f5-59541aa99ecd-host\") pod \"crc-debug-smwlp\" (UID: \"939aac29-edd6-4d03-a4f5-59541aa99ecd\") " pod="openshift-must-gather-4v7jl/crc-debug-smwlp" Mar 18 11:10:02 crc 
kubenswrapper[4778]: I0318 11:10:02.309671 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m6bc\" (UniqueName: \"kubernetes.io/projected/939aac29-edd6-4d03-a4f5-59541aa99ecd-kube-api-access-5m6bc\") pod \"crc-debug-smwlp\" (UID: \"939aac29-edd6-4d03-a4f5-59541aa99ecd\") " pod="openshift-must-gather-4v7jl/crc-debug-smwlp" Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.374576 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4v7jl/crc-debug-smwlp" Mar 18 11:10:02 crc kubenswrapper[4778]: W0318 11:10:02.421176 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod939aac29_edd6_4d03_a4f5_59541aa99ecd.slice/crio-30a87942c42ba58ae4535480008ce722d42f6808c3271d2c4c30afa2f5a8a212 WatchSource:0}: Error finding container 30a87942c42ba58ae4535480008ce722d42f6808c3271d2c4c30afa2f5a8a212: Status 404 returned error can't find the container with id 30a87942c42ba58ae4535480008ce722d42f6808c3271d2c4c30afa2f5a8a212 Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.707015 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563870-s7lhp" event={"ID":"08809b1c-c749-4734-9fc4-6a0a755aa9cd","Type":"ContainerStarted","Data":"7417bcdf486fe4210bba0dca5e997eafe86b7f08ceaf37548fb4760f00212acc"} Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.708664 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v7jl/crc-debug-smwlp" event={"ID":"939aac29-edd6-4d03-a4f5-59541aa99ecd","Type":"ContainerStarted","Data":"b0c173a65daa3d6727011d9a85b81569efd5870086c307d8ad02c5186b648e01"} Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.708696 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v7jl/crc-debug-smwlp" 
event={"ID":"939aac29-edd6-4d03-a4f5-59541aa99ecd","Type":"ContainerStarted","Data":"30a87942c42ba58ae4535480008ce722d42f6808c3271d2c4c30afa2f5a8a212"} Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.729392 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563870-s7lhp" podStartSLOduration=1.6047213980000001 podStartE2EDuration="2.729370004s" podCreationTimestamp="2026-03-18 11:10:00 +0000 UTC" firstStartedPulling="2026-03-18 11:10:00.998791648 +0000 UTC m=+7667.573536488" lastFinishedPulling="2026-03-18 11:10:02.123440234 +0000 UTC m=+7668.698185094" observedRunningTime="2026-03-18 11:10:02.723124045 +0000 UTC m=+7669.297868925" watchObservedRunningTime="2026-03-18 11:10:02.729370004 +0000 UTC m=+7669.304114844" Mar 18 11:10:02 crc kubenswrapper[4778]: I0318 11:10:02.742860 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4v7jl/crc-debug-smwlp" podStartSLOduration=0.74284247 podStartE2EDuration="742.84247ms" podCreationTimestamp="2026-03-18 11:10:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:10:02.735434448 +0000 UTC m=+7669.310179288" watchObservedRunningTime="2026-03-18 11:10:02.74284247 +0000 UTC m=+7669.317587310" Mar 18 11:10:03 crc kubenswrapper[4778]: I0318 11:10:03.718997 4778 generic.go:334] "Generic (PLEG): container finished" podID="08809b1c-c749-4734-9fc4-6a0a755aa9cd" containerID="7417bcdf486fe4210bba0dca5e997eafe86b7f08ceaf37548fb4760f00212acc" exitCode=0 Mar 18 11:10:03 crc kubenswrapper[4778]: I0318 11:10:03.719049 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563870-s7lhp" event={"ID":"08809b1c-c749-4734-9fc4-6a0a755aa9cd","Type":"ContainerDied","Data":"7417bcdf486fe4210bba0dca5e997eafe86b7f08ceaf37548fb4760f00212acc"} Mar 18 11:10:05 crc kubenswrapper[4778]: I0318 
11:10:05.251525 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563870-s7lhp" Mar 18 11:10:05 crc kubenswrapper[4778]: I0318 11:10:05.354880 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdld7\" (UniqueName: \"kubernetes.io/projected/08809b1c-c749-4734-9fc4-6a0a755aa9cd-kube-api-access-sdld7\") pod \"08809b1c-c749-4734-9fc4-6a0a755aa9cd\" (UID: \"08809b1c-c749-4734-9fc4-6a0a755aa9cd\") " Mar 18 11:10:05 crc kubenswrapper[4778]: I0318 11:10:05.375059 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08809b1c-c749-4734-9fc4-6a0a755aa9cd-kube-api-access-sdld7" (OuterVolumeSpecName: "kube-api-access-sdld7") pod "08809b1c-c749-4734-9fc4-6a0a755aa9cd" (UID: "08809b1c-c749-4734-9fc4-6a0a755aa9cd"). InnerVolumeSpecName "kube-api-access-sdld7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:10:05 crc kubenswrapper[4778]: I0318 11:10:05.459107 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdld7\" (UniqueName: \"kubernetes.io/projected/08809b1c-c749-4734-9fc4-6a0a755aa9cd-kube-api-access-sdld7\") on node \"crc\" DevicePath \"\"" Mar 18 11:10:05 crc kubenswrapper[4778]: I0318 11:10:05.746858 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563870-s7lhp" event={"ID":"08809b1c-c749-4734-9fc4-6a0a755aa9cd","Type":"ContainerDied","Data":"e0f1b0864e9fe4a32c395a265fea54f835e4d107f28e0a2e27d19f71a2fefbd4"} Mar 18 11:10:05 crc kubenswrapper[4778]: I0318 11:10:05.747195 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0f1b0864e9fe4a32c395a265fea54f835e4d107f28e0a2e27d19f71a2fefbd4" Mar 18 11:10:05 crc kubenswrapper[4778]: I0318 11:10:05.747110 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563870-s7lhp" Mar 18 11:10:05 crc kubenswrapper[4778]: I0318 11:10:05.806598 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563864-np9td"] Mar 18 11:10:05 crc kubenswrapper[4778]: I0318 11:10:05.819832 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563864-np9td"] Mar 18 11:10:06 crc kubenswrapper[4778]: I0318 11:10:06.200049 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93ef8df9-98a4-4897-918d-b573fc50f7bb" path="/var/lib/kubelet/pods/93ef8df9-98a4-4897-918d-b573fc50f7bb/volumes" Mar 18 11:10:30 crc kubenswrapper[4778]: I0318 11:10:30.147856 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 11:10:30 crc kubenswrapper[4778]: I0318 11:10:30.148504 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 11:10:30 crc kubenswrapper[4778]: I0318 11:10:30.148554 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 11:10:30 crc kubenswrapper[4778]: I0318 11:10:30.149416 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cbe46d0fa65f0ead7ad1789d13ec33e6d93d408d865f08e3bbc666ffb661db35"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 11:10:30 crc kubenswrapper[4778]: I0318 11:10:30.149476 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://cbe46d0fa65f0ead7ad1789d13ec33e6d93d408d865f08e3bbc666ffb661db35" gracePeriod=600 Mar 18 11:10:30 crc kubenswrapper[4778]: I0318 11:10:30.972983 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="cbe46d0fa65f0ead7ad1789d13ec33e6d93d408d865f08e3bbc666ffb661db35" exitCode=0 Mar 18 11:10:30 crc kubenswrapper[4778]: I0318 11:10:30.973059 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"cbe46d0fa65f0ead7ad1789d13ec33e6d93d408d865f08e3bbc666ffb661db35"} Mar 18 11:10:30 crc kubenswrapper[4778]: I0318 11:10:30.973547 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434"} Mar 18 11:10:30 crc kubenswrapper[4778]: I0318 11:10:30.973572 4778 scope.go:117] "RemoveContainer" containerID="73b7ae7fc6219e56bd654692a48a89da89a7b394e1e024c36342c8dad1d886f9" Mar 18 11:10:38 crc kubenswrapper[4778]: I0318 11:10:38.596033 4778 scope.go:117] "RemoveContainer" containerID="806be4b53bf081091f4914e3382dec40c771f43c71215c8124342f2c0296cb37" Mar 18 11:10:44 crc kubenswrapper[4778]: I0318 11:10:44.079888 4778 generic.go:334] "Generic (PLEG): container finished" podID="939aac29-edd6-4d03-a4f5-59541aa99ecd" 
containerID="b0c173a65daa3d6727011d9a85b81569efd5870086c307d8ad02c5186b648e01" exitCode=0 Mar 18 11:10:44 crc kubenswrapper[4778]: I0318 11:10:44.079972 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v7jl/crc-debug-smwlp" event={"ID":"939aac29-edd6-4d03-a4f5-59541aa99ecd","Type":"ContainerDied","Data":"b0c173a65daa3d6727011d9a85b81569efd5870086c307d8ad02c5186b648e01"} Mar 18 11:10:45 crc kubenswrapper[4778]: I0318 11:10:45.189779 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4v7jl/crc-debug-smwlp" Mar 18 11:10:45 crc kubenswrapper[4778]: I0318 11:10:45.237443 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4v7jl/crc-debug-smwlp"] Mar 18 11:10:45 crc kubenswrapper[4778]: I0318 11:10:45.245235 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4v7jl/crc-debug-smwlp"] Mar 18 11:10:45 crc kubenswrapper[4778]: I0318 11:10:45.301077 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m6bc\" (UniqueName: \"kubernetes.io/projected/939aac29-edd6-4d03-a4f5-59541aa99ecd-kube-api-access-5m6bc\") pod \"939aac29-edd6-4d03-a4f5-59541aa99ecd\" (UID: \"939aac29-edd6-4d03-a4f5-59541aa99ecd\") " Mar 18 11:10:45 crc kubenswrapper[4778]: I0318 11:10:45.301351 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/939aac29-edd6-4d03-a4f5-59541aa99ecd-host\") pod \"939aac29-edd6-4d03-a4f5-59541aa99ecd\" (UID: \"939aac29-edd6-4d03-a4f5-59541aa99ecd\") " Mar 18 11:10:45 crc kubenswrapper[4778]: I0318 11:10:45.301417 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/939aac29-edd6-4d03-a4f5-59541aa99ecd-host" (OuterVolumeSpecName: "host") pod "939aac29-edd6-4d03-a4f5-59541aa99ecd" (UID: "939aac29-edd6-4d03-a4f5-59541aa99ecd"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 11:10:45 crc kubenswrapper[4778]: I0318 11:10:45.302894 4778 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/939aac29-edd6-4d03-a4f5-59541aa99ecd-host\") on node \"crc\" DevicePath \"\"" Mar 18 11:10:45 crc kubenswrapper[4778]: I0318 11:10:45.307667 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/939aac29-edd6-4d03-a4f5-59541aa99ecd-kube-api-access-5m6bc" (OuterVolumeSpecName: "kube-api-access-5m6bc") pod "939aac29-edd6-4d03-a4f5-59541aa99ecd" (UID: "939aac29-edd6-4d03-a4f5-59541aa99ecd"). InnerVolumeSpecName "kube-api-access-5m6bc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:10:45 crc kubenswrapper[4778]: I0318 11:10:45.405453 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m6bc\" (UniqueName: \"kubernetes.io/projected/939aac29-edd6-4d03-a4f5-59541aa99ecd-kube-api-access-5m6bc\") on node \"crc\" DevicePath \"\"" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.099785 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30a87942c42ba58ae4535480008ce722d42f6808c3271d2c4c30afa2f5a8a212" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.100001 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4v7jl/crc-debug-smwlp" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.199636 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="939aac29-edd6-4d03-a4f5-59541aa99ecd" path="/var/lib/kubelet/pods/939aac29-edd6-4d03-a4f5-59541aa99ecd/volumes" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.401602 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4v7jl/crc-debug-6dhtw"] Mar 18 11:10:46 crc kubenswrapper[4778]: E0318 11:10:46.402002 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="939aac29-edd6-4d03-a4f5-59541aa99ecd" containerName="container-00" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.402019 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="939aac29-edd6-4d03-a4f5-59541aa99ecd" containerName="container-00" Mar 18 11:10:46 crc kubenswrapper[4778]: E0318 11:10:46.402052 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08809b1c-c749-4734-9fc4-6a0a755aa9cd" containerName="oc" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.402058 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="08809b1c-c749-4734-9fc4-6a0a755aa9cd" containerName="oc" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.402232 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="939aac29-edd6-4d03-a4f5-59541aa99ecd" containerName="container-00" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.402257 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="08809b1c-c749-4734-9fc4-6a0a755aa9cd" containerName="oc" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.402846 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.526460 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/481fa9ab-0ba7-4810-9a84-bb93c8762498-host\") pod \"crc-debug-6dhtw\" (UID: \"481fa9ab-0ba7-4810-9a84-bb93c8762498\") " pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.526547 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk2dh\" (UniqueName: \"kubernetes.io/projected/481fa9ab-0ba7-4810-9a84-bb93c8762498-kube-api-access-vk2dh\") pod \"crc-debug-6dhtw\" (UID: \"481fa9ab-0ba7-4810-9a84-bb93c8762498\") " pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.629334 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/481fa9ab-0ba7-4810-9a84-bb93c8762498-host\") pod \"crc-debug-6dhtw\" (UID: \"481fa9ab-0ba7-4810-9a84-bb93c8762498\") " pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.629401 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk2dh\" (UniqueName: \"kubernetes.io/projected/481fa9ab-0ba7-4810-9a84-bb93c8762498-kube-api-access-vk2dh\") pod \"crc-debug-6dhtw\" (UID: \"481fa9ab-0ba7-4810-9a84-bb93c8762498\") " pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.629504 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/481fa9ab-0ba7-4810-9a84-bb93c8762498-host\") pod \"crc-debug-6dhtw\" (UID: \"481fa9ab-0ba7-4810-9a84-bb93c8762498\") " pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" Mar 18 11:10:46 crc 
kubenswrapper[4778]: I0318 11:10:46.656117 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk2dh\" (UniqueName: \"kubernetes.io/projected/481fa9ab-0ba7-4810-9a84-bb93c8762498-kube-api-access-vk2dh\") pod \"crc-debug-6dhtw\" (UID: \"481fa9ab-0ba7-4810-9a84-bb93c8762498\") " pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" Mar 18 11:10:46 crc kubenswrapper[4778]: I0318 11:10:46.720591 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" Mar 18 11:10:46 crc kubenswrapper[4778]: W0318 11:10:46.753067 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod481fa9ab_0ba7_4810_9a84_bb93c8762498.slice/crio-6d01cfea264e6eb627817e6ae0d743c3a37e4318bd87bcd78ce8d09f81529c8a WatchSource:0}: Error finding container 6d01cfea264e6eb627817e6ae0d743c3a37e4318bd87bcd78ce8d09f81529c8a: Status 404 returned error can't find the container with id 6d01cfea264e6eb627817e6ae0d743c3a37e4318bd87bcd78ce8d09f81529c8a Mar 18 11:10:47 crc kubenswrapper[4778]: I0318 11:10:47.109275 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" event={"ID":"481fa9ab-0ba7-4810-9a84-bb93c8762498","Type":"ContainerStarted","Data":"4244424fcad241de0fa7ed0597ece19cff3267ef03d0dbab6082ae06fe156bfa"} Mar 18 11:10:47 crc kubenswrapper[4778]: I0318 11:10:47.109633 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" event={"ID":"481fa9ab-0ba7-4810-9a84-bb93c8762498","Type":"ContainerStarted","Data":"6d01cfea264e6eb627817e6ae0d743c3a37e4318bd87bcd78ce8d09f81529c8a"} Mar 18 11:10:47 crc kubenswrapper[4778]: I0318 11:10:47.121368 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" podStartSLOduration=1.121351005 podStartE2EDuration="1.121351005s" 
podCreationTimestamp="2026-03-18 11:10:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 11:10:47.120546794 +0000 UTC m=+7713.695291634" watchObservedRunningTime="2026-03-18 11:10:47.121351005 +0000 UTC m=+7713.696095845" Mar 18 11:10:48 crc kubenswrapper[4778]: I0318 11:10:48.117302 4778 generic.go:334] "Generic (PLEG): container finished" podID="481fa9ab-0ba7-4810-9a84-bb93c8762498" containerID="4244424fcad241de0fa7ed0597ece19cff3267ef03d0dbab6082ae06fe156bfa" exitCode=0 Mar 18 11:10:48 crc kubenswrapper[4778]: I0318 11:10:48.117555 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" event={"ID":"481fa9ab-0ba7-4810-9a84-bb93c8762498","Type":"ContainerDied","Data":"4244424fcad241de0fa7ed0597ece19cff3267ef03d0dbab6082ae06fe156bfa"} Mar 18 11:10:49 crc kubenswrapper[4778]: I0318 11:10:49.239761 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" Mar 18 11:10:49 crc kubenswrapper[4778]: I0318 11:10:49.380168 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk2dh\" (UniqueName: \"kubernetes.io/projected/481fa9ab-0ba7-4810-9a84-bb93c8762498-kube-api-access-vk2dh\") pod \"481fa9ab-0ba7-4810-9a84-bb93c8762498\" (UID: \"481fa9ab-0ba7-4810-9a84-bb93c8762498\") " Mar 18 11:10:49 crc kubenswrapper[4778]: I0318 11:10:49.380349 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/481fa9ab-0ba7-4810-9a84-bb93c8762498-host\") pod \"481fa9ab-0ba7-4810-9a84-bb93c8762498\" (UID: \"481fa9ab-0ba7-4810-9a84-bb93c8762498\") " Mar 18 11:10:49 crc kubenswrapper[4778]: I0318 11:10:49.380617 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/481fa9ab-0ba7-4810-9a84-bb93c8762498-host" (OuterVolumeSpecName: "host") pod "481fa9ab-0ba7-4810-9a84-bb93c8762498" (UID: "481fa9ab-0ba7-4810-9a84-bb93c8762498"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 11:10:49 crc kubenswrapper[4778]: I0318 11:10:49.381011 4778 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/481fa9ab-0ba7-4810-9a84-bb93c8762498-host\") on node \"crc\" DevicePath \"\"" Mar 18 11:10:49 crc kubenswrapper[4778]: I0318 11:10:49.388379 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/481fa9ab-0ba7-4810-9a84-bb93c8762498-kube-api-access-vk2dh" (OuterVolumeSpecName: "kube-api-access-vk2dh") pod "481fa9ab-0ba7-4810-9a84-bb93c8762498" (UID: "481fa9ab-0ba7-4810-9a84-bb93c8762498"). InnerVolumeSpecName "kube-api-access-vk2dh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:10:49 crc kubenswrapper[4778]: I0318 11:10:49.482334 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk2dh\" (UniqueName: \"kubernetes.io/projected/481fa9ab-0ba7-4810-9a84-bb93c8762498-kube-api-access-vk2dh\") on node \"crc\" DevicePath \"\"" Mar 18 11:10:49 crc kubenswrapper[4778]: I0318 11:10:49.709654 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4v7jl/crc-debug-6dhtw"] Mar 18 11:10:49 crc kubenswrapper[4778]: I0318 11:10:49.717442 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4v7jl/crc-debug-6dhtw"] Mar 18 11:10:50 crc kubenswrapper[4778]: I0318 11:10:50.147054 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d01cfea264e6eb627817e6ae0d743c3a37e4318bd87bcd78ce8d09f81529c8a" Mar 18 11:10:50 crc kubenswrapper[4778]: I0318 11:10:50.147120 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4v7jl/crc-debug-6dhtw" Mar 18 11:10:50 crc kubenswrapper[4778]: I0318 11:10:50.197030 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="481fa9ab-0ba7-4810-9a84-bb93c8762498" path="/var/lib/kubelet/pods/481fa9ab-0ba7-4810-9a84-bb93c8762498/volumes" Mar 18 11:10:50 crc kubenswrapper[4778]: I0318 11:10:50.881448 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4v7jl/crc-debug-hvrms"] Mar 18 11:10:50 crc kubenswrapper[4778]: E0318 11:10:50.882143 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481fa9ab-0ba7-4810-9a84-bb93c8762498" containerName="container-00" Mar 18 11:10:50 crc kubenswrapper[4778]: I0318 11:10:50.882158 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="481fa9ab-0ba7-4810-9a84-bb93c8762498" containerName="container-00" Mar 18 11:10:50 crc kubenswrapper[4778]: I0318 11:10:50.882454 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="481fa9ab-0ba7-4810-9a84-bb93c8762498" containerName="container-00" Mar 18 11:10:50 crc kubenswrapper[4778]: I0318 11:10:50.883130 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4v7jl/crc-debug-hvrms" Mar 18 11:10:51 crc kubenswrapper[4778]: I0318 11:10:51.012435 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07b1b987-0fc7-4eb1-b6e3-2bc047d48992-host\") pod \"crc-debug-hvrms\" (UID: \"07b1b987-0fc7-4eb1-b6e3-2bc047d48992\") " pod="openshift-must-gather-4v7jl/crc-debug-hvrms" Mar 18 11:10:51 crc kubenswrapper[4778]: I0318 11:10:51.012581 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwvzt\" (UniqueName: \"kubernetes.io/projected/07b1b987-0fc7-4eb1-b6e3-2bc047d48992-kube-api-access-pwvzt\") pod \"crc-debug-hvrms\" (UID: \"07b1b987-0fc7-4eb1-b6e3-2bc047d48992\") " pod="openshift-must-gather-4v7jl/crc-debug-hvrms" Mar 18 11:10:51 crc kubenswrapper[4778]: I0318 11:10:51.115512 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07b1b987-0fc7-4eb1-b6e3-2bc047d48992-host\") pod \"crc-debug-hvrms\" (UID: \"07b1b987-0fc7-4eb1-b6e3-2bc047d48992\") " pod="openshift-must-gather-4v7jl/crc-debug-hvrms" Mar 18 11:10:51 crc kubenswrapper[4778]: I0318 11:10:51.115619 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwvzt\" (UniqueName: \"kubernetes.io/projected/07b1b987-0fc7-4eb1-b6e3-2bc047d48992-kube-api-access-pwvzt\") pod \"crc-debug-hvrms\" (UID: \"07b1b987-0fc7-4eb1-b6e3-2bc047d48992\") " pod="openshift-must-gather-4v7jl/crc-debug-hvrms" Mar 18 11:10:51 crc kubenswrapper[4778]: I0318 11:10:51.115703 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07b1b987-0fc7-4eb1-b6e3-2bc047d48992-host\") pod \"crc-debug-hvrms\" (UID: \"07b1b987-0fc7-4eb1-b6e3-2bc047d48992\") " pod="openshift-must-gather-4v7jl/crc-debug-hvrms" Mar 18 11:10:51 crc 
kubenswrapper[4778]: I0318 11:10:51.135384 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwvzt\" (UniqueName: \"kubernetes.io/projected/07b1b987-0fc7-4eb1-b6e3-2bc047d48992-kube-api-access-pwvzt\") pod \"crc-debug-hvrms\" (UID: \"07b1b987-0fc7-4eb1-b6e3-2bc047d48992\") " pod="openshift-must-gather-4v7jl/crc-debug-hvrms"
Mar 18 11:10:51 crc kubenswrapper[4778]: I0318 11:10:51.205010 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4v7jl/crc-debug-hvrms"
Mar 18 11:10:51 crc kubenswrapper[4778]: W0318 11:10:51.235380 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07b1b987_0fc7_4eb1_b6e3_2bc047d48992.slice/crio-ff888d19c52cc029981609bfcbbe53bf53880a3c77a2333cfce2657b67ef370d WatchSource:0}: Error finding container ff888d19c52cc029981609bfcbbe53bf53880a3c77a2333cfce2657b67ef370d: Status 404 returned error can't find the container with id ff888d19c52cc029981609bfcbbe53bf53880a3c77a2333cfce2657b67ef370d
Mar 18 11:10:52 crc kubenswrapper[4778]: I0318 11:10:52.180308 4778 generic.go:334] "Generic (PLEG): container finished" podID="07b1b987-0fc7-4eb1-b6e3-2bc047d48992" containerID="22318d16c06934bfd283593ddd9c5161092c2f6f2dea9339dca1338b3f67afa1" exitCode=0
Mar 18 11:10:52 crc kubenswrapper[4778]: I0318 11:10:52.180458 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v7jl/crc-debug-hvrms" event={"ID":"07b1b987-0fc7-4eb1-b6e3-2bc047d48992","Type":"ContainerDied","Data":"22318d16c06934bfd283593ddd9c5161092c2f6f2dea9339dca1338b3f67afa1"}
Mar 18 11:10:52 crc kubenswrapper[4778]: I0318 11:10:52.180634 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v7jl/crc-debug-hvrms" event={"ID":"07b1b987-0fc7-4eb1-b6e3-2bc047d48992","Type":"ContainerStarted","Data":"ff888d19c52cc029981609bfcbbe53bf53880a3c77a2333cfce2657b67ef370d"}
Mar 18 11:10:52 crc kubenswrapper[4778]: I0318 11:10:52.229314 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4v7jl/crc-debug-hvrms"]
Mar 18 11:10:52 crc kubenswrapper[4778]: I0318 11:10:52.239458 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4v7jl/crc-debug-hvrms"]
Mar 18 11:10:53 crc kubenswrapper[4778]: I0318 11:10:53.290756 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4v7jl/crc-debug-hvrms"
Mar 18 11:10:53 crc kubenswrapper[4778]: I0318 11:10:53.461796 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07b1b987-0fc7-4eb1-b6e3-2bc047d48992-host\") pod \"07b1b987-0fc7-4eb1-b6e3-2bc047d48992\" (UID: \"07b1b987-0fc7-4eb1-b6e3-2bc047d48992\") "
Mar 18 11:10:53 crc kubenswrapper[4778]: I0318 11:10:53.462167 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwvzt\" (UniqueName: \"kubernetes.io/projected/07b1b987-0fc7-4eb1-b6e3-2bc047d48992-kube-api-access-pwvzt\") pod \"07b1b987-0fc7-4eb1-b6e3-2bc047d48992\" (UID: \"07b1b987-0fc7-4eb1-b6e3-2bc047d48992\") "
Mar 18 11:10:53 crc kubenswrapper[4778]: I0318 11:10:53.461942 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07b1b987-0fc7-4eb1-b6e3-2bc047d48992-host" (OuterVolumeSpecName: "host") pod "07b1b987-0fc7-4eb1-b6e3-2bc047d48992" (UID: "07b1b987-0fc7-4eb1-b6e3-2bc047d48992"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 11:10:53 crc kubenswrapper[4778]: I0318 11:10:53.462618 4778 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07b1b987-0fc7-4eb1-b6e3-2bc047d48992-host\") on node \"crc\" DevicePath \"\""
Mar 18 11:10:53 crc kubenswrapper[4778]: I0318 11:10:53.468014 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07b1b987-0fc7-4eb1-b6e3-2bc047d48992-kube-api-access-pwvzt" (OuterVolumeSpecName: "kube-api-access-pwvzt") pod "07b1b987-0fc7-4eb1-b6e3-2bc047d48992" (UID: "07b1b987-0fc7-4eb1-b6e3-2bc047d48992"). InnerVolumeSpecName "kube-api-access-pwvzt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 11:10:53 crc kubenswrapper[4778]: I0318 11:10:53.564911 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwvzt\" (UniqueName: \"kubernetes.io/projected/07b1b987-0fc7-4eb1-b6e3-2bc047d48992-kube-api-access-pwvzt\") on node \"crc\" DevicePath \"\""
Mar 18 11:10:54 crc kubenswrapper[4778]: I0318 11:10:54.197739 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07b1b987-0fc7-4eb1-b6e3-2bc047d48992" path="/var/lib/kubelet/pods/07b1b987-0fc7-4eb1-b6e3-2bc047d48992/volumes"
Mar 18 11:10:54 crc kubenswrapper[4778]: I0318 11:10:54.201485 4778 scope.go:117] "RemoveContainer" containerID="22318d16c06934bfd283593ddd9c5161092c2f6f2dea9339dca1338b3f67afa1"
Mar 18 11:10:54 crc kubenswrapper[4778]: I0318 11:10:54.201535 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4v7jl/crc-debug-hvrms"
Mar 18 11:10:58 crc kubenswrapper[4778]: E0318 11:10:58.187250 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes"
Mar 18 11:11:45 crc kubenswrapper[4778]: I0318 11:11:45.071864 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ansibletest-ansibletest_1fb58f5e-1c8b-45e2-bf86-b81af58b66a9/ansibletest-ansibletest/0.log"
Mar 18 11:11:45 crc kubenswrapper[4778]: I0318 11:11:45.205147 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84d7458cd-cb86l_e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55/barbican-api/0.log"
Mar 18 11:11:45 crc kubenswrapper[4778]: I0318 11:11:45.249562 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84d7458cd-cb86l_e54bf6fd-2f41-46bf-a76e-b5ef5d75ee55/barbican-api-log/0.log"
Mar 18 11:11:45 crc kubenswrapper[4778]: I0318 11:11:45.417383 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8bc77f476-tw7vd_3a006670-1a48-4421-8471-dd961c0e1d4c/barbican-keystone-listener/0.log"
Mar 18 11:11:45 crc kubenswrapper[4778]: I0318 11:11:45.621939 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-769d964c9f-nxhk2_eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b/barbican-worker/0.log"
Mar 18 11:11:45 crc kubenswrapper[4778]: I0318 11:11:45.676926 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-769d964c9f-nxhk2_eef83b3c-6c58-4a7a-ac72-bd6519c8cc2b/barbican-worker-log/0.log"
Mar 18 11:11:45 crc kubenswrapper[4778]: I0318 11:11:45.856262 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-k7hgk_f4bddd5e-314b-49c0-963c-107e6798c40e/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 18 11:11:45 crc kubenswrapper[4778]: I0318 11:11:45.950378 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8bc77f476-tw7vd_3a006670-1a48-4421-8471-dd961c0e1d4c/barbican-keystone-listener-log/0.log"
Mar 18 11:11:46 crc kubenswrapper[4778]: I0318 11:11:46.024090 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_52bc493f-72e9-4387-9b91-13343fc7d550/ceilometer-central-agent/0.log"
Mar 18 11:11:46 crc kubenswrapper[4778]: I0318 11:11:46.089276 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_52bc493f-72e9-4387-9b91-13343fc7d550/ceilometer-notification-agent/0.log"
Mar 18 11:11:46 crc kubenswrapper[4778]: I0318 11:11:46.178965 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_52bc493f-72e9-4387-9b91-13343fc7d550/proxy-httpd/0.log"
Mar 18 11:11:46 crc kubenswrapper[4778]: I0318 11:11:46.231356 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_52bc493f-72e9-4387-9b91-13343fc7d550/sg-core/0.log"
Mar 18 11:11:46 crc kubenswrapper[4778]: I0318 11:11:46.304414 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-tvjcv_fed5a515-ed14-40f1-9282-4e87fe319bf6/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 18 11:11:46 crc kubenswrapper[4778]: I0318 11:11:46.398777 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-tpdwl_34acd7f6-6263-4871-892c-02835ebbab27/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 18 11:11:46 crc kubenswrapper[4778]: I0318 11:11:46.598507 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51/cinder-api/0.log"
Mar 18 11:11:46 crc kubenswrapper[4778]: I0318 11:11:46.621582 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_924e0ec8-7cd0-4d72-a6ee-89e3e3a79c51/cinder-api-log/0.log"
Mar 18 11:11:46 crc kubenswrapper[4778]: I0318 11:11:46.877682 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_a419ad60-27c7-4a74-a7a0-f6b04b3bcb13/cinder-backup/0.log"
Mar 18 11:11:46 crc kubenswrapper[4778]: I0318 11:11:46.887872 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_a419ad60-27c7-4a74-a7a0-f6b04b3bcb13/probe/0.log"
Mar 18 11:11:46 crc kubenswrapper[4778]: I0318 11:11:46.986235 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_bbde13ad-dacc-4f17-8da3-109ede6972c0/cinder-scheduler/0.log"
Mar 18 11:11:47 crc kubenswrapper[4778]: I0318 11:11:47.145715 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_bbde13ad-dacc-4f17-8da3-109ede6972c0/probe/0.log"
Mar 18 11:11:47 crc kubenswrapper[4778]: I0318 11:11:47.276185 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_81d18509-d2fc-47e2-b814-94c4807a4dd6/probe/0.log"
Mar 18 11:11:47 crc kubenswrapper[4778]: I0318 11:11:47.304558 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_81d18509-d2fc-47e2-b814-94c4807a4dd6/cinder-volume/0.log"
Mar 18 11:11:47 crc kubenswrapper[4778]: I0318 11:11:47.385424 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-vwtd5_d44d6afe-0030-4d9d-9fa7-f75274eff578/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 18 11:11:47 crc kubenswrapper[4778]: I0318 11:11:47.511876 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-r64nk_4f5bf2d2-78b2-4358-a582-482ab3020da3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 18 11:11:47 crc kubenswrapper[4778]: I0318 11:11:47.616955 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-69655fd4bf-pr7j8_78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3/init/0.log"
Mar 18 11:11:47 crc kubenswrapper[4778]: I0318 11:11:47.835748 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-69655fd4bf-pr7j8_78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3/init/0.log"
Mar 18 11:11:47 crc kubenswrapper[4778]: I0318 11:11:47.841042 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd/glance-httpd/0.log"
Mar 18 11:11:47 crc kubenswrapper[4778]: I0318 11:11:47.994489 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-69655fd4bf-pr7j8_78ee47c6-1b69-4162-8c4c-5d8f9f24e6c3/dnsmasq-dns/0.log"
Mar 18 11:11:48 crc kubenswrapper[4778]: I0318 11:11:48.053362 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8f5b8b9c-7cb9-4057-8e3a-3ec95bfdeadd/glance-log/0.log"
Mar 18 11:11:48 crc kubenswrapper[4778]: I0318 11:11:48.101081 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a18a46b5-39a7-4da9-8994-5c4716bc0fc3/glance-httpd/0.log"
Mar 18 11:11:48 crc kubenswrapper[4778]: I0318 11:11:48.171209 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a18a46b5-39a7-4da9-8994-5c4716bc0fc3/glance-log/0.log"
Mar 18 11:11:48 crc kubenswrapper[4778]: I0318 11:11:48.322414 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-644f48df4-b7jhq_e0a0a638-c445-4931-861e-d35704487c97/horizon/0.log"
Mar 18 11:11:48 crc kubenswrapper[4778]: I0318 11:11:48.750486 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-gfphd_7c70009e-cfb3-4598-9ae4-f1d90a2a63d5/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 18 11:11:48 crc kubenswrapper[4778]: I0318 11:11:48.769837 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizontest-tests-horizontest_49ff1200-d42e-4022-990d-619169f357f4/horizontest-tests-horizontest/0.log"
Mar 18 11:11:49 crc kubenswrapper[4778]: I0318 11:11:49.031484 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-9zw82_5e5ffed6-fceb-4d38-aa29-e9836a8d9f50/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 18 11:11:49 crc kubenswrapper[4778]: I0318 11:11:49.269651 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29563801-nctwn_8ace9f11-f4d8-4801-afa2-5b723d52d41e/keystone-cron/0.log"
Mar 18 11:11:49 crc kubenswrapper[4778]: I0318 11:11:49.457683 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29563861-czdsg_d34b9add-0199-4bf1-81f8-fa4c2a9138e7/keystone-cron/0.log"
Mar 18 11:11:49 crc kubenswrapper[4778]: I0318 11:11:49.566247 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_1663e1b0-f9b0-4168-9386-abf2c1b56b43/kube-state-metrics/0.log"
Mar 18 11:11:49 crc kubenswrapper[4778]: I0318 11:11:49.763145 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-644f48df4-b7jhq_e0a0a638-c445-4931-861e-d35704487c97/horizon-log/0.log"
Mar 18 11:11:49 crc kubenswrapper[4778]: I0318 11:11:49.806317 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-vjbtr_d50b5540-c2ca-4889-bbb0-3b5d04bc602f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 18 11:11:49 crc kubenswrapper[4778]: I0318 11:11:49.973753 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_35adb68e-2fb0-437c-bea7-e46f05e4918c/manila-api-log/0.log"
Mar 18 11:11:50 crc kubenswrapper[4778]: I0318 11:11:50.226985 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_e9af702d-3a1a-490e-82f5-e99c1718ef83/probe/0.log"
Mar 18 11:11:50 crc kubenswrapper[4778]: I0318 11:11:50.243617 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_e9af702d-3a1a-490e-82f5-e99c1718ef83/manila-scheduler/0.log"
Mar 18 11:11:50 crc kubenswrapper[4778]: I0318 11:11:50.272662 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_35adb68e-2fb0-437c-bea7-e46f05e4918c/manila-api/0.log"
Mar 18 11:11:50 crc kubenswrapper[4778]: I0318 11:11:50.456714 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_821dda0e-cde2-45a4-b23a-3d13565be515/probe/0.log"
Mar 18 11:11:50 crc kubenswrapper[4778]: I0318 11:11:50.482022 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_821dda0e-cde2-45a4-b23a-3d13565be515/manila-share/0.log"
Mar 18 11:11:51 crc kubenswrapper[4778]: I0318 11:11:51.206989 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-7pj4v_52250b90-fbc6-418e-9a5f-4873d5fa5cd0/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 18 11:11:51 crc kubenswrapper[4778]: I0318 11:11:51.719163 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d979499f7-4flxt_da263057-3652-4ae8-8435-4f80e4b13804/neutron-httpd/0.log"
Mar 18 11:11:52 crc kubenswrapper[4778]: I0318 11:11:52.428479 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-75996d8fd4-jhtd2_4c045639-00d0-4ba6-9d75-c67934521e29/keystone-api/0.log"
Mar 18 11:11:52 crc kubenswrapper[4778]: I0318 11:11:52.498988 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d979499f7-4flxt_da263057-3652-4ae8-8435-4f80e4b13804/neutron-api/0.log"
Mar 18 11:11:53 crc kubenswrapper[4778]: I0318 11:11:53.418041 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_9ba2a389-4009-4dab-bc75-45a574e50bbc/nova-cell1-conductor-conductor/0.log"
Mar 18 11:11:53 crc kubenswrapper[4778]: I0318 11:11:53.465851 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_3fc908a0-dc90-4df9-869c-5c0820cac423/nova-cell0-conductor-conductor/0.log"
Mar 18 11:11:53 crc kubenswrapper[4778]: I0318 11:11:53.940572 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_9549b39b-0fc5-4e89-b64a-de83c80735ed/nova-cell1-novncproxy-novncproxy/0.log"
Mar 18 11:11:54 crc kubenswrapper[4778]: I0318 11:11:54.149927 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-52cn9_b2db5491-57b4-427a-b306-5e525a1e7c27/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 18 11:11:54 crc kubenswrapper[4778]: I0318 11:11:54.468607 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_28f01ca6-f7d2-4de3-9aa9-256803533b80/nova-metadata-log/0.log"
Mar 18 11:11:55 crc kubenswrapper[4778]: I0318 11:11:55.441731 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8a702c51-b7a6-4094-9d34-519102e1cf91/nova-api-log/0.log"
Mar 18 11:11:55 crc kubenswrapper[4778]: I0318 11:11:55.578832 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_9b1623d1-2084-419e-b36a-80930113a280/nova-scheduler-scheduler/0.log"
Mar 18 11:11:55 crc kubenswrapper[4778]: I0318 11:11:55.731760 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_28f01ca6-f7d2-4de3-9aa9-256803533b80/nova-metadata-metadata/0.log"
Mar 18 11:11:55 crc kubenswrapper[4778]: I0318 11:11:55.821605 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_49ce9560-3ee2-48d2-b016-a9feefb3a798/mysql-bootstrap/0.log"
Mar 18 11:11:55 crc kubenswrapper[4778]: I0318 11:11:55.986658 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_49ce9560-3ee2-48d2-b016-a9feefb3a798/mysql-bootstrap/0.log"
Mar 18 11:11:55 crc kubenswrapper[4778]: I0318 11:11:55.993848 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_49ce9560-3ee2-48d2-b016-a9feefb3a798/galera/0.log"
Mar 18 11:11:56 crc kubenswrapper[4778]: I0318 11:11:56.167572 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_cfadc08e-9e77-4b6f-be89-fc7c726e85b7/mysql-bootstrap/0.log"
Mar 18 11:11:56 crc kubenswrapper[4778]: I0318 11:11:56.323032 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_cfadc08e-9e77-4b6f-be89-fc7c726e85b7/mysql-bootstrap/0.log"
Mar 18 11:11:56 crc kubenswrapper[4778]: I0318 11:11:56.400347 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_cfadc08e-9e77-4b6f-be89-fc7c726e85b7/galera/0.log"
Mar 18 11:11:56 crc kubenswrapper[4778]: I0318 11:11:56.499325 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_fec302c3-e5fc-4019-b4f5-50de6bdde59f/openstackclient/0.log"
Mar 18 11:11:56 crc kubenswrapper[4778]: I0318 11:11:56.616052 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-djmq6_f58533cf-4c57-4c3a-b772-e2a488298d7e/ovn-controller/0.log"
Mar 18 11:11:56 crc kubenswrapper[4778]: I0318 11:11:56.674576 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8a702c51-b7a6-4094-9d34-519102e1cf91/nova-api-api/0.log"
Mar 18 11:11:56 crc kubenswrapper[4778]: I0318 11:11:56.834908 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-2ldk7_2c6e8f7b-9b48-4814-9e73-fc9833c26cc9/openstack-network-exporter/0.log"
Mar 18 11:11:56 crc kubenswrapper[4778]: I0318 11:11:56.854333 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zrlnv_89cb0c73-439f-4178-bd96-f50b123bcd8a/ovsdb-server-init/0.log"
Mar 18 11:11:57 crc kubenswrapper[4778]: I0318 11:11:57.091085 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zrlnv_89cb0c73-439f-4178-bd96-f50b123bcd8a/ovsdb-server/0.log"
Mar 18 11:11:57 crc kubenswrapper[4778]: I0318 11:11:57.094454 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zrlnv_89cb0c73-439f-4178-bd96-f50b123bcd8a/ovsdb-server-init/0.log"
Mar 18 11:11:57 crc kubenswrapper[4778]: I0318 11:11:57.154518 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zrlnv_89cb0c73-439f-4178-bd96-f50b123bcd8a/ovs-vswitchd/0.log"
Mar 18 11:11:57 crc kubenswrapper[4778]: I0318 11:11:57.313299 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-7jqhd_1f0f4177-ad12-4848-bbd7-39b004344cb3/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 18 11:11:57 crc kubenswrapper[4778]: I0318 11:11:57.415489 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ac3419bd-88ba-4b83-bd93-ad5638bc7fd0/openstack-network-exporter/0.log"
Mar 18 11:11:57 crc kubenswrapper[4778]: I0318 11:11:57.431694 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ac3419bd-88ba-4b83-bd93-ad5638bc7fd0/ovn-northd/0.log"
Mar 18 11:11:57 crc kubenswrapper[4778]: I0318 11:11:57.528277 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_113a3fc7-40a1-46f9-b93f-01a34fcaf4aa/openstack-network-exporter/0.log"
Mar 18 11:11:57 crc kubenswrapper[4778]: I0318 11:11:57.617939 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_113a3fc7-40a1-46f9-b93f-01a34fcaf4aa/ovsdbserver-nb/0.log"
Mar 18 11:11:57 crc kubenswrapper[4778]: I0318 11:11:57.731924 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_495e34ad-2f4d-46de-95e9-37b34a35f2d2/ovsdbserver-sb/0.log"
Mar 18 11:11:57 crc kubenswrapper[4778]: I0318 11:11:57.741757 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_495e34ad-2f4d-46de-95e9-37b34a35f2d2/openstack-network-exporter/0.log"
Mar 18 11:11:58 crc kubenswrapper[4778]: I0318 11:11:58.115583 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f9428cb3-4fdf-4b01-9368-28b413ecf82f/setup-container/0.log"
Mar 18 11:11:58 crc kubenswrapper[4778]: I0318 11:11:58.270501 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f9428cb3-4fdf-4b01-9368-28b413ecf82f/setup-container/0.log"
Mar 18 11:11:58 crc kubenswrapper[4778]: I0318 11:11:58.361668 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f9428cb3-4fdf-4b01-9368-28b413ecf82f/rabbitmq/0.log"
Mar 18 11:11:58 crc kubenswrapper[4778]: I0318 11:11:58.496282 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7588d8786-t6x7l_fe0de426-6927-42ea-8b29-8bc01c27fe69/placement-api/0.log"
Mar 18 11:11:58 crc kubenswrapper[4778]: I0318 11:11:58.577468 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0191a745-1fe2-4a1c-b007-96525ad39787/setup-container/0.log"
Mar 18 11:11:58 crc kubenswrapper[4778]: I0318 11:11:58.618534 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7588d8786-t6x7l_fe0de426-6927-42ea-8b29-8bc01c27fe69/placement-log/0.log"
Mar 18 11:11:58 crc kubenswrapper[4778]: I0318 11:11:58.740389 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0191a745-1fe2-4a1c-b007-96525ad39787/rabbitmq/0.log"
Mar 18 11:11:58 crc kubenswrapper[4778]: I0318 11:11:58.758530 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0191a745-1fe2-4a1c-b007-96525ad39787/setup-container/0.log"
Mar 18 11:11:58 crc kubenswrapper[4778]: I0318 11:11:58.877604 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-lpzv7_613d0a31-a371-4c66-8254-85a7cc864fd0/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 18 11:11:58 crc kubenswrapper[4778]: I0318 11:11:58.972912 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-cdfnx_136dbfab-32f1-40ee-b685-74411fbc06ba/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 18 11:11:59 crc kubenswrapper[4778]: I0318 11:11:59.066407 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-9bss8_80a8d263-9bba-4db0-928e-f633b4ad5314/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 18 11:11:59 crc kubenswrapper[4778]: I0318 11:11:59.245815 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-j74ts_53b18647-af19-457c-9543-2156c1ace738/ssh-known-hosts-edpm-deployment/0.log"
Mar 18 11:11:59 crc kubenswrapper[4778]: I0318 11:11:59.467706 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest-s00-full_757e3758-d646-4267-8c4c-b5efb0dcf709/tempest-tests-tempest-tests-runner/0.log"
Mar 18 11:11:59 crc kubenswrapper[4778]: I0318 11:11:59.512921 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest-s01-single-test_c5a7a532-f8c2-4741-9892-65047a4cb225/tempest-tests-tempest-tests-runner/0.log"
Mar 18 11:11:59 crc kubenswrapper[4778]: I0318 11:11:59.690807 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-horizontest-horizontest-tests-horizontest_3db5e33d-384f-4df3-bfb8-ba279b83f7e4/test-operator-logs-container/0.log"
Mar 18 11:11:59 crc kubenswrapper[4778]: I0318 11:11:59.692326 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-ansibletest-ansibletest-ansibletest_1f57757d-6483-4e1a-9a09-e63026f73e70/test-operator-logs-container/0.log"
Mar 18 11:11:59 crc kubenswrapper[4778]: I0318 11:11:59.938132 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_fb176b71-d782-4b0d-963f-94acef50cf11/test-operator-logs-container/0.log"
Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.022132 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tobiko-tobiko-tests-tobiko_4e028d5e-666c-497c-949e-97860410ad74/test-operator-logs-container/0.log"
Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.159369 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tobiko-tests-tobiko-s00-podified-functional_5c0d8cb1-d7bc-4694-ac54-e0a9f8312557/tobiko-tests-tobiko/0.log"
Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.177652 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563872-w9xc8"]
Mar 18 11:12:00 crc kubenswrapper[4778]: E0318 11:12:00.178054 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07b1b987-0fc7-4eb1-b6e3-2bc047d48992" containerName="container-00"
Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.178071 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="07b1b987-0fc7-4eb1-b6e3-2bc047d48992" containerName="container-00"
Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.178630 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="07b1b987-0fc7-4eb1-b6e3-2bc047d48992" containerName="container-00"
Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.179328 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563872-w9xc8"
Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.181730 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.182004 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.182333 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6"
Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.203062 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563872-w9xc8"]
Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.247047 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tobiko-tests-tobiko-s01-sanity_bd565818-8912-47ba-881f-f88011fa9b46/tobiko-tests-tobiko/0.log"
Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.281027 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvtdz\" (UniqueName: \"kubernetes.io/projected/c2575542-201b-40c8-baec-f64e53f357a6-kube-api-access-pvtdz\") pod \"auto-csr-approver-29563872-w9xc8\" (UID: \"c2575542-201b-40c8-baec-f64e53f357a6\") " pod="openshift-infra/auto-csr-approver-29563872-w9xc8"
Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.383093 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvtdz\" (UniqueName: \"kubernetes.io/projected/c2575542-201b-40c8-baec-f64e53f357a6-kube-api-access-pvtdz\") pod \"auto-csr-approver-29563872-w9xc8\" (UID: \"c2575542-201b-40c8-baec-f64e53f357a6\") " pod="openshift-infra/auto-csr-approver-29563872-w9xc8"
Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.404730 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvtdz\" (UniqueName: \"kubernetes.io/projected/c2575542-201b-40c8-baec-f64e53f357a6-kube-api-access-pvtdz\") pod \"auto-csr-approver-29563872-w9xc8\" (UID: \"c2575542-201b-40c8-baec-f64e53f357a6\") " pod="openshift-infra/auto-csr-approver-29563872-w9xc8"
Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.409617 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-44vc9_5e5ecb95-ba90-4f70-ae42-63e71026ffef/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 18 11:12:00 crc kubenswrapper[4778]: I0318 11:12:00.518265 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563872-w9xc8"
Mar 18 11:12:01 crc kubenswrapper[4778]: I0318 11:12:01.045339 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 11:12:01 crc kubenswrapper[4778]: I0318 11:12:01.045720 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563872-w9xc8"]
Mar 18 11:12:01 crc kubenswrapper[4778]: I0318 11:12:01.864601 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563872-w9xc8" event={"ID":"c2575542-201b-40c8-baec-f64e53f357a6","Type":"ContainerStarted","Data":"19da28c3a338cd150a9db77cc1bbccafffaa4df006145a305a3db3de00e58567"}
Mar 18 11:12:02 crc kubenswrapper[4778]: I0318 11:12:02.877051 4778 generic.go:334] "Generic (PLEG): container finished" podID="c2575542-201b-40c8-baec-f64e53f357a6" containerID="9b9c4586ce364f21cf8a583a2a00575bf65854ca35b9f67e292350a899db8fd9" exitCode=0
Mar 18 11:12:02 crc kubenswrapper[4778]: I0318 11:12:02.877287 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563872-w9xc8" event={"ID":"c2575542-201b-40c8-baec-f64e53f357a6","Type":"ContainerDied","Data":"9b9c4586ce364f21cf8a583a2a00575bf65854ca35b9f67e292350a899db8fd9"}
Mar 18 11:12:03 crc kubenswrapper[4778]: I0318 11:12:03.030031 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_fc50d224-cd65-4a46-b3d0-b40acdbda53d/memcached/0.log"
Mar 18 11:12:03 crc kubenswrapper[4778]: E0318 11:12:03.188271 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes"
Mar 18 11:12:04 crc kubenswrapper[4778]: I0318 11:12:04.264958 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563872-w9xc8"
Mar 18 11:12:04 crc kubenswrapper[4778]: I0318 11:12:04.366624 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvtdz\" (UniqueName: \"kubernetes.io/projected/c2575542-201b-40c8-baec-f64e53f357a6-kube-api-access-pvtdz\") pod \"c2575542-201b-40c8-baec-f64e53f357a6\" (UID: \"c2575542-201b-40c8-baec-f64e53f357a6\") "
Mar 18 11:12:04 crc kubenswrapper[4778]: I0318 11:12:04.373430 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2575542-201b-40c8-baec-f64e53f357a6-kube-api-access-pvtdz" (OuterVolumeSpecName: "kube-api-access-pvtdz") pod "c2575542-201b-40c8-baec-f64e53f357a6" (UID: "c2575542-201b-40c8-baec-f64e53f357a6"). InnerVolumeSpecName "kube-api-access-pvtdz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 11:12:04 crc kubenswrapper[4778]: I0318 11:12:04.469832 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvtdz\" (UniqueName: \"kubernetes.io/projected/c2575542-201b-40c8-baec-f64e53f357a6-kube-api-access-pvtdz\") on node \"crc\" DevicePath \"\""
Mar 18 11:12:04 crc kubenswrapper[4778]: I0318 11:12:04.895057 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563872-w9xc8" event={"ID":"c2575542-201b-40c8-baec-f64e53f357a6","Type":"ContainerDied","Data":"19da28c3a338cd150a9db77cc1bbccafffaa4df006145a305a3db3de00e58567"}
Mar 18 11:12:04 crc kubenswrapper[4778]: I0318 11:12:04.895087 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563872-w9xc8"
Mar 18 11:12:04 crc kubenswrapper[4778]: I0318 11:12:04.895101 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19da28c3a338cd150a9db77cc1bbccafffaa4df006145a305a3db3de00e58567"
Mar 18 11:12:05 crc kubenswrapper[4778]: I0318 11:12:05.331737 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563866-8zx72"]
Mar 18 11:12:05 crc kubenswrapper[4778]: I0318 11:12:05.341367 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563866-8zx72"]
Mar 18 11:12:06 crc kubenswrapper[4778]: I0318 11:12:06.198918 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b734773-4a1f-4acd-80e9-e3cd0cf14c2c" path="/var/lib/kubelet/pods/6b734773-4a1f-4acd-80e9-e3cd0cf14c2c/volumes"
Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.512363 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cp6wg"]
Mar 18 11:12:08 crc kubenswrapper[4778]: E0318 11:12:08.513118 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2575542-201b-40c8-baec-f64e53f357a6" containerName="oc"
Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.513131 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2575542-201b-40c8-baec-f64e53f357a6" containerName="oc"
Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.513355 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2575542-201b-40c8-baec-f64e53f357a6" containerName="oc"
Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.514712 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cp6wg"
Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.526912 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cp6wg"]
Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.553785 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32705561-f15c-4a55-b595-5154e5c0f483-catalog-content\") pod \"certified-operators-cp6wg\" (UID: \"32705561-f15c-4a55-b595-5154e5c0f483\") " pod="openshift-marketplace/certified-operators-cp6wg"
Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.553924 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ntf8\" (UniqueName: \"kubernetes.io/projected/32705561-f15c-4a55-b595-5154e5c0f483-kube-api-access-8ntf8\") pod \"certified-operators-cp6wg\" (UID: \"32705561-f15c-4a55-b595-5154e5c0f483\") " pod="openshift-marketplace/certified-operators-cp6wg"
Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.553971 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32705561-f15c-4a55-b595-5154e5c0f483-utilities\") pod \"certified-operators-cp6wg\" (UID: \"32705561-f15c-4a55-b595-5154e5c0f483\") " pod="openshift-marketplace/certified-operators-cp6wg"
Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.656328 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32705561-f15c-4a55-b595-5154e5c0f483-catalog-content\") pod \"certified-operators-cp6wg\" (UID: \"32705561-f15c-4a55-b595-5154e5c0f483\") " pod="openshift-marketplace/certified-operators-cp6wg"
Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.656705 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ntf8\" (UniqueName: \"kubernetes.io/projected/32705561-f15c-4a55-b595-5154e5c0f483-kube-api-access-8ntf8\") pod \"certified-operators-cp6wg\" (UID: \"32705561-f15c-4a55-b595-5154e5c0f483\") " pod="openshift-marketplace/certified-operators-cp6wg"
Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.656760 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32705561-f15c-4a55-b595-5154e5c0f483-utilities\") pod \"certified-operators-cp6wg\" (UID: \"32705561-f15c-4a55-b595-5154e5c0f483\") " pod="openshift-marketplace/certified-operators-cp6wg"
Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.657114 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32705561-f15c-4a55-b595-5154e5c0f483-catalog-content\") pod \"certified-operators-cp6wg\" (UID: \"32705561-f15c-4a55-b595-5154e5c0f483\") " pod="openshift-marketplace/certified-operators-cp6wg"
Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.657431 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32705561-f15c-4a55-b595-5154e5c0f483-utilities\") pod \"certified-operators-cp6wg\" (UID: \"32705561-f15c-4a55-b595-5154e5c0f483\") " 
pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.687970 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ntf8\" (UniqueName: \"kubernetes.io/projected/32705561-f15c-4a55-b595-5154e5c0f483-kube-api-access-8ntf8\") pod \"certified-operators-cp6wg\" (UID: \"32705561-f15c-4a55-b595-5154e5c0f483\") " pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:08 crc kubenswrapper[4778]: I0318 11:12:08.832688 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:09 crc kubenswrapper[4778]: I0318 11:12:09.218738 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cp6wg"] Mar 18 11:12:09 crc kubenswrapper[4778]: I0318 11:12:09.937290 4778 generic.go:334] "Generic (PLEG): container finished" podID="32705561-f15c-4a55-b595-5154e5c0f483" containerID="10209b15c989a89684050c875b90fd219031baf835fa69accb27f48dda488578" exitCode=0 Mar 18 11:12:09 crc kubenswrapper[4778]: I0318 11:12:09.937394 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cp6wg" event={"ID":"32705561-f15c-4a55-b595-5154e5c0f483","Type":"ContainerDied","Data":"10209b15c989a89684050c875b90fd219031baf835fa69accb27f48dda488578"} Mar 18 11:12:09 crc kubenswrapper[4778]: I0318 11:12:09.938740 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cp6wg" event={"ID":"32705561-f15c-4a55-b595-5154e5c0f483","Type":"ContainerStarted","Data":"ec0a112d6285ed2e19ae645c8a2740504d15449ace3b08259fecdc0e181b8428"} Mar 18 11:12:10 crc kubenswrapper[4778]: I0318 11:12:10.947866 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cp6wg" 
event={"ID":"32705561-f15c-4a55-b595-5154e5c0f483","Type":"ContainerStarted","Data":"0eb60c297bcec9b72dfdb9a15a0653e4795e8301db7fa693a486e0ac6076574f"} Mar 18 11:12:12 crc kubenswrapper[4778]: I0318 11:12:12.965437 4778 generic.go:334] "Generic (PLEG): container finished" podID="32705561-f15c-4a55-b595-5154e5c0f483" containerID="0eb60c297bcec9b72dfdb9a15a0653e4795e8301db7fa693a486e0ac6076574f" exitCode=0 Mar 18 11:12:12 crc kubenswrapper[4778]: I0318 11:12:12.965497 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cp6wg" event={"ID":"32705561-f15c-4a55-b595-5154e5c0f483","Type":"ContainerDied","Data":"0eb60c297bcec9b72dfdb9a15a0653e4795e8301db7fa693a486e0ac6076574f"} Mar 18 11:12:13 crc kubenswrapper[4778]: I0318 11:12:13.977115 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cp6wg" event={"ID":"32705561-f15c-4a55-b595-5154e5c0f483","Type":"ContainerStarted","Data":"0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed"} Mar 18 11:12:14 crc kubenswrapper[4778]: I0318 11:12:14.001378 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cp6wg" podStartSLOduration=2.332215041 podStartE2EDuration="6.001360325s" podCreationTimestamp="2026-03-18 11:12:08 +0000 UTC" firstStartedPulling="2026-03-18 11:12:09.939990808 +0000 UTC m=+7796.514735648" lastFinishedPulling="2026-03-18 11:12:13.609136092 +0000 UTC m=+7800.183880932" observedRunningTime="2026-03-18 11:12:13.993765688 +0000 UTC m=+7800.568510558" watchObservedRunningTime="2026-03-18 11:12:14.001360325 +0000 UTC m=+7800.576105165" Mar 18 11:12:18 crc kubenswrapper[4778]: I0318 11:12:18.833413 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:18 crc kubenswrapper[4778]: I0318 11:12:18.834085 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:18 crc kubenswrapper[4778]: I0318 11:12:18.912327 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:19 crc kubenswrapper[4778]: I0318 11:12:19.068491 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:19 crc kubenswrapper[4778]: I0318 11:12:19.147071 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cp6wg"] Mar 18 11:12:21 crc kubenswrapper[4778]: I0318 11:12:21.037688 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cp6wg" podUID="32705561-f15c-4a55-b595-5154e5c0f483" containerName="registry-server" containerID="cri-o://0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed" gracePeriod=2 Mar 18 11:12:21 crc kubenswrapper[4778]: I0318 11:12:21.570051 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:21 crc kubenswrapper[4778]: I0318 11:12:21.710447 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ntf8\" (UniqueName: \"kubernetes.io/projected/32705561-f15c-4a55-b595-5154e5c0f483-kube-api-access-8ntf8\") pod \"32705561-f15c-4a55-b595-5154e5c0f483\" (UID: \"32705561-f15c-4a55-b595-5154e5c0f483\") " Mar 18 11:12:21 crc kubenswrapper[4778]: I0318 11:12:21.710548 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32705561-f15c-4a55-b595-5154e5c0f483-utilities\") pod \"32705561-f15c-4a55-b595-5154e5c0f483\" (UID: \"32705561-f15c-4a55-b595-5154e5c0f483\") " Mar 18 11:12:21 crc kubenswrapper[4778]: I0318 11:12:21.710624 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32705561-f15c-4a55-b595-5154e5c0f483-catalog-content\") pod \"32705561-f15c-4a55-b595-5154e5c0f483\" (UID: \"32705561-f15c-4a55-b595-5154e5c0f483\") " Mar 18 11:12:21 crc kubenswrapper[4778]: I0318 11:12:21.711587 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32705561-f15c-4a55-b595-5154e5c0f483-utilities" (OuterVolumeSpecName: "utilities") pod "32705561-f15c-4a55-b595-5154e5c0f483" (UID: "32705561-f15c-4a55-b595-5154e5c0f483"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:12:21 crc kubenswrapper[4778]: I0318 11:12:21.717223 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32705561-f15c-4a55-b595-5154e5c0f483-kube-api-access-8ntf8" (OuterVolumeSpecName: "kube-api-access-8ntf8") pod "32705561-f15c-4a55-b595-5154e5c0f483" (UID: "32705561-f15c-4a55-b595-5154e5c0f483"). InnerVolumeSpecName "kube-api-access-8ntf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:12:21 crc kubenswrapper[4778]: I0318 11:12:21.812573 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ntf8\" (UniqueName: \"kubernetes.io/projected/32705561-f15c-4a55-b595-5154e5c0f483-kube-api-access-8ntf8\") on node \"crc\" DevicePath \"\"" Mar 18 11:12:21 crc kubenswrapper[4778]: I0318 11:12:21.812612 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32705561-f15c-4a55-b595-5154e5c0f483-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 11:12:21 crc kubenswrapper[4778]: I0318 11:12:21.902135 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32705561-f15c-4a55-b595-5154e5c0f483-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "32705561-f15c-4a55-b595-5154e5c0f483" (UID: "32705561-f15c-4a55-b595-5154e5c0f483"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:12:21 crc kubenswrapper[4778]: I0318 11:12:21.913895 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32705561-f15c-4a55-b595-5154e5c0f483-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.049026 4778 generic.go:334] "Generic (PLEG): container finished" podID="32705561-f15c-4a55-b595-5154e5c0f483" containerID="0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed" exitCode=0 Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.049071 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cp6wg" event={"ID":"32705561-f15c-4a55-b595-5154e5c0f483","Type":"ContainerDied","Data":"0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed"} Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.049102 4778 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-cp6wg" event={"ID":"32705561-f15c-4a55-b595-5154e5c0f483","Type":"ContainerDied","Data":"ec0a112d6285ed2e19ae645c8a2740504d15449ace3b08259fecdc0e181b8428"} Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.049103 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cp6wg" Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.049124 4778 scope.go:117] "RemoveContainer" containerID="0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed" Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.068715 4778 scope.go:117] "RemoveContainer" containerID="0eb60c297bcec9b72dfdb9a15a0653e4795e8301db7fa693a486e0ac6076574f" Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.098091 4778 scope.go:117] "RemoveContainer" containerID="10209b15c989a89684050c875b90fd219031baf835fa69accb27f48dda488578" Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.101509 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cp6wg"] Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.110523 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cp6wg"] Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.147539 4778 scope.go:117] "RemoveContainer" containerID="0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed" Mar 18 11:12:22 crc kubenswrapper[4778]: E0318 11:12:22.148095 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed\": container with ID starting with 0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed not found: ID does not exist" containerID="0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed" Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 
11:12:22.148134 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed"} err="failed to get container status \"0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed\": rpc error: code = NotFound desc = could not find container \"0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed\": container with ID starting with 0abfa9903a8a80380527207124f642c4461202df4ad6f843d54dbc10d76befed not found: ID does not exist" Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.148159 4778 scope.go:117] "RemoveContainer" containerID="0eb60c297bcec9b72dfdb9a15a0653e4795e8301db7fa693a486e0ac6076574f" Mar 18 11:12:22 crc kubenswrapper[4778]: E0318 11:12:22.148578 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eb60c297bcec9b72dfdb9a15a0653e4795e8301db7fa693a486e0ac6076574f\": container with ID starting with 0eb60c297bcec9b72dfdb9a15a0653e4795e8301db7fa693a486e0ac6076574f not found: ID does not exist" containerID="0eb60c297bcec9b72dfdb9a15a0653e4795e8301db7fa693a486e0ac6076574f" Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.148619 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eb60c297bcec9b72dfdb9a15a0653e4795e8301db7fa693a486e0ac6076574f"} err="failed to get container status \"0eb60c297bcec9b72dfdb9a15a0653e4795e8301db7fa693a486e0ac6076574f\": rpc error: code = NotFound desc = could not find container \"0eb60c297bcec9b72dfdb9a15a0653e4795e8301db7fa693a486e0ac6076574f\": container with ID starting with 0eb60c297bcec9b72dfdb9a15a0653e4795e8301db7fa693a486e0ac6076574f not found: ID does not exist" Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.148645 4778 scope.go:117] "RemoveContainer" containerID="10209b15c989a89684050c875b90fd219031baf835fa69accb27f48dda488578" Mar 18 11:12:22 crc 
kubenswrapper[4778]: E0318 11:12:22.155374 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10209b15c989a89684050c875b90fd219031baf835fa69accb27f48dda488578\": container with ID starting with 10209b15c989a89684050c875b90fd219031baf835fa69accb27f48dda488578 not found: ID does not exist" containerID="10209b15c989a89684050c875b90fd219031baf835fa69accb27f48dda488578" Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.155441 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10209b15c989a89684050c875b90fd219031baf835fa69accb27f48dda488578"} err="failed to get container status \"10209b15c989a89684050c875b90fd219031baf835fa69accb27f48dda488578\": rpc error: code = NotFound desc = could not find container \"10209b15c989a89684050c875b90fd219031baf835fa69accb27f48dda488578\": container with ID starting with 10209b15c989a89684050c875b90fd219031baf835fa69accb27f48dda488578 not found: ID does not exist" Mar 18 11:12:22 crc kubenswrapper[4778]: I0318 11:12:22.202126 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32705561-f15c-4a55-b595-5154e5c0f483" path="/var/lib/kubelet/pods/32705561-f15c-4a55-b595-5154e5c0f483/volumes" Mar 18 11:12:23 crc kubenswrapper[4778]: I0318 11:12:23.896913 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-fsxlt_3390909b-6271-40dd-9662-0710f6866143/manager/0.log" Mar 18 11:12:24 crc kubenswrapper[4778]: I0318 11:12:24.174824 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-7mbx2_710ababb-0bee-441d-8dd0-e6a72ea2b2e3/manager/0.log" Mar 18 11:12:24 crc kubenswrapper[4778]: I0318 11:12:24.335161 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/util/0.log" Mar 18 11:12:24 crc kubenswrapper[4778]: I0318 11:12:24.581849 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/pull/0.log" Mar 18 11:12:24 crc kubenswrapper[4778]: I0318 11:12:24.625981 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/pull/0.log" Mar 18 11:12:24 crc kubenswrapper[4778]: I0318 11:12:24.694381 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/util/0.log" Mar 18 11:12:24 crc kubenswrapper[4778]: I0318 11:12:24.846544 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/util/0.log" Mar 18 11:12:24 crc kubenswrapper[4778]: I0318 11:12:24.889110 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/pull/0.log" Mar 18 11:12:24 crc kubenswrapper[4778]: I0318 11:12:24.961943 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f7515a08ca357f4fd4a585f40a6d95495c80b8d04af53a0267e8c07255jpxzb_bf055ff8-8bbd-4628-a5ad-c765775e8f16/extract/0.log" Mar 18 11:12:25 crc kubenswrapper[4778]: I0318 11:12:25.150838 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-wb4pc_b41dbd4a-33dd-4dca-9356-34c740e8063f/manager/0.log" Mar 18 11:12:25 crc 
kubenswrapper[4778]: I0318 11:12:25.282126 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-t5c4w_aceb2f7b-585f-451a-83b8-e673965ada87/manager/0.log" Mar 18 11:12:25 crc kubenswrapper[4778]: I0318 11:12:25.392369 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-x7rnp_124dc549-cb2a-4b1c-a610-093cf9b8c05d/manager/0.log" Mar 18 11:12:25 crc kubenswrapper[4778]: I0318 11:12:25.771669 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-fjjvl_3c86f76c-1617-45e9-9573-f6fd51803b45/manager/0.log" Mar 18 11:12:26 crc kubenswrapper[4778]: I0318 11:12:26.051876 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-5xvtc_e1ec7bae-8e15-4844-84d2-ff5951d0be31/manager/0.log" Mar 18 11:12:26 crc kubenswrapper[4778]: I0318 11:12:26.062235 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-64c4x_66d3bf3a-086c-4340-ba73-209f526fc33c/manager/0.log" Mar 18 11:12:26 crc kubenswrapper[4778]: I0318 11:12:26.291248 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-zpc92_211c991a-9406-4360-aa7f-830be3aa55db/manager/0.log" Mar 18 11:12:26 crc kubenswrapper[4778]: I0318 11:12:26.390691 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-wxftc_0526f654-9ddc-4495-bb04-be13e53b6a1b/manager/0.log" Mar 18 11:12:26 crc kubenswrapper[4778]: I0318 11:12:26.430878 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-47sbc_37675366-70a8-4e0b-b92b-f7055547d918/manager/0.log" Mar 18 
11:12:26 crc kubenswrapper[4778]: I0318 11:12:26.541225 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-k4r2p_ae690990-eeb1-4871-8c51-dd3b547e1193/manager/0.log" Mar 18 11:12:26 crc kubenswrapper[4778]: I0318 11:12:26.712300 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-h6whs_e245908e-e35e-403c-93f6-48371904ae42/manager/0.log" Mar 18 11:12:26 crc kubenswrapper[4778]: I0318 11:12:26.758104 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-pzjdt_c776af1e-ad54-40fe-9bed-a0a09ce0eea7/manager/0.log" Mar 18 11:12:26 crc kubenswrapper[4778]: I0318 11:12:26.883725 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-xdgmv_80822932-2943-4f81-9436-1553ed031359/manager/0.log" Mar 18 11:12:27 crc kubenswrapper[4778]: I0318 11:12:27.064411 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-654f4fc7f7-9d4pb_b8267dff-2541-481e-bc64-13eb8d19300b/operator/0.log" Mar 18 11:12:27 crc kubenswrapper[4778]: I0318 11:12:27.227303 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-v7qxm_c508c810-232f-48c1-8d15-bbbb118d2948/registry-server/0.log" Mar 18 11:12:27 crc kubenswrapper[4778]: I0318 11:12:27.343658 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-fgfk9_208b26f2-3c91-4966-9d01-8fe73e4a7d87/manager/0.log" Mar 18 11:12:27 crc kubenswrapper[4778]: I0318 11:12:27.516106 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-d5w9q_2f8e8860-00a1-43fc-9776-c617f270cc50/manager/0.log" 
Mar 18 11:12:27 crc kubenswrapper[4778]: I0318 11:12:27.675311 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-5jrv8_b837636e-8c09-42b7-9a81-e7875df68344/operator/0.log" Mar 18 11:12:27 crc kubenswrapper[4778]: I0318 11:12:27.797242 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-c6l5k_8ccabb3b-da59-4ab0-89c8-99094a939f0d/manager/0.log" Mar 18 11:12:28 crc kubenswrapper[4778]: I0318 11:12:28.142944 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-tx9zq_9d3ce48f-cf4b-4e0f-908d-afacfc6c3b77/manager/0.log" Mar 18 11:12:28 crc kubenswrapper[4778]: I0318 11:12:28.213367 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-54c5f5bc8-jsm76_99adb6be-2a3e-4148-8074-9258222ebd60/manager/0.log" Mar 18 11:12:28 crc kubenswrapper[4778]: I0318 11:12:28.345109 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-sgs49_57277339-c9be-4de1-8e35-72ae98d33905/manager/0.log" Mar 18 11:12:28 crc kubenswrapper[4778]: I0318 11:12:28.347037 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-f5c7df4d7-m4kvr_3c7e3158-5139-467d-b33c-808747f0d9be/manager/0.log" Mar 18 11:12:30 crc kubenswrapper[4778]: I0318 11:12:30.148081 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 11:12:30 crc kubenswrapper[4778]: I0318 11:12:30.148582 4778 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 11:12:38 crc kubenswrapper[4778]: I0318 11:12:38.714103 4778 scope.go:117] "RemoveContainer" containerID="9ce4ad858c60f25a18c86f0360777510f04c706cb5eafb4da8787fc9df1829e5" Mar 18 11:12:47 crc kubenswrapper[4778]: I0318 11:12:47.076685 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-qtggn_ba84f396-0169-4d5e-a126-60ac9d6d49f8/control-plane-machine-set-operator/0.log" Mar 18 11:12:47 crc kubenswrapper[4778]: I0318 11:12:47.336924 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-smtz9_f06790e0-cf8c-48f0-8d48-893663fdbd1c/kube-rbac-proxy/0.log" Mar 18 11:12:47 crc kubenswrapper[4778]: I0318 11:12:47.377069 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-smtz9_f06790e0-cf8c-48f0-8d48-893663fdbd1c/machine-api-operator/0.log" Mar 18 11:13:00 crc kubenswrapper[4778]: I0318 11:13:00.147630 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 11:13:00 crc kubenswrapper[4778]: I0318 11:13:00.148280 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 11:13:00 crc 
kubenswrapper[4778]: I0318 11:13:00.321491 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-qrqw4_e39be52c-c244-44cc-a707-0ec9994991fa/cert-manager-controller/0.log" Mar 18 11:13:00 crc kubenswrapper[4778]: I0318 11:13:00.515121 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-khqrg_24a88e8d-e986-4b3d-a77e-1a3e5162ac9c/cert-manager-cainjector/0.log" Mar 18 11:13:00 crc kubenswrapper[4778]: I0318 11:13:00.541193 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-hjskg_f09bc4b7-d305-4674-8540-283bd0b4901c/cert-manager-webhook/0.log" Mar 18 11:13:13 crc kubenswrapper[4778]: I0318 11:13:13.540061 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-22c9p_8b636ef7-4b85-4506-bb2a-f89bee9b028d/nmstate-console-plugin/0.log" Mar 18 11:13:13 crc kubenswrapper[4778]: I0318 11:13:13.688399 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-5thsf_5b97fa25-4d3d-4664-a5fc-41c98bbd272f/nmstate-handler/0.log" Mar 18 11:13:13 crc kubenswrapper[4778]: I0318 11:13:13.729304 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-wq8gr_71b50b27-6084-4693-acbc-d14f36759618/kube-rbac-proxy/0.log" Mar 18 11:13:13 crc kubenswrapper[4778]: I0318 11:13:13.780742 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-wq8gr_71b50b27-6084-4693-acbc-d14f36759618/nmstate-metrics/0.log" Mar 18 11:13:13 crc kubenswrapper[4778]: I0318 11:13:13.959410 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-sr9ls_1dd7ccb2-0dca-4a6d-87f7-195b0ae0f9fe/nmstate-operator/0.log" Mar 18 11:13:14 crc kubenswrapper[4778]: I0318 11:13:14.010366 4778 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-thw7f_5961b98d-a41a-4ceb-bb71-4bf3a0fc854d/nmstate-webhook/0.log" Mar 18 11:13:23 crc kubenswrapper[4778]: I0318 11:13:23.749309 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k6xz8"] Mar 18 11:13:23 crc kubenswrapper[4778]: E0318 11:13:23.750052 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32705561-f15c-4a55-b595-5154e5c0f483" containerName="extract-utilities" Mar 18 11:13:23 crc kubenswrapper[4778]: I0318 11:13:23.750064 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="32705561-f15c-4a55-b595-5154e5c0f483" containerName="extract-utilities" Mar 18 11:13:23 crc kubenswrapper[4778]: E0318 11:13:23.750082 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32705561-f15c-4a55-b595-5154e5c0f483" containerName="registry-server" Mar 18 11:13:23 crc kubenswrapper[4778]: I0318 11:13:23.750088 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="32705561-f15c-4a55-b595-5154e5c0f483" containerName="registry-server" Mar 18 11:13:23 crc kubenswrapper[4778]: E0318 11:13:23.750115 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32705561-f15c-4a55-b595-5154e5c0f483" containerName="extract-content" Mar 18 11:13:23 crc kubenswrapper[4778]: I0318 11:13:23.750121 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="32705561-f15c-4a55-b595-5154e5c0f483" containerName="extract-content" Mar 18 11:13:23 crc kubenswrapper[4778]: I0318 11:13:23.750310 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="32705561-f15c-4a55-b595-5154e5c0f483" containerName="registry-server" Mar 18 11:13:23 crc kubenswrapper[4778]: I0318 11:13:23.751499 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:23 crc kubenswrapper[4778]: I0318 11:13:23.765695 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k6xz8"] Mar 18 11:13:23 crc kubenswrapper[4778]: I0318 11:13:23.937979 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwhs5\" (UniqueName: \"kubernetes.io/projected/7b138f36-1b83-46c2-bcff-84a0f03d3921-kube-api-access-zwhs5\") pod \"community-operators-k6xz8\" (UID: \"7b138f36-1b83-46c2-bcff-84a0f03d3921\") " pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:23 crc kubenswrapper[4778]: I0318 11:13:23.938959 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b138f36-1b83-46c2-bcff-84a0f03d3921-catalog-content\") pod \"community-operators-k6xz8\" (UID: \"7b138f36-1b83-46c2-bcff-84a0f03d3921\") " pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:23 crc kubenswrapper[4778]: I0318 11:13:23.939097 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b138f36-1b83-46c2-bcff-84a0f03d3921-utilities\") pod \"community-operators-k6xz8\" (UID: \"7b138f36-1b83-46c2-bcff-84a0f03d3921\") " pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:24 crc kubenswrapper[4778]: I0318 11:13:24.040971 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b138f36-1b83-46c2-bcff-84a0f03d3921-utilities\") pod \"community-operators-k6xz8\" (UID: \"7b138f36-1b83-46c2-bcff-84a0f03d3921\") " pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:24 crc kubenswrapper[4778]: I0318 11:13:24.041151 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zwhs5\" (UniqueName: \"kubernetes.io/projected/7b138f36-1b83-46c2-bcff-84a0f03d3921-kube-api-access-zwhs5\") pod \"community-operators-k6xz8\" (UID: \"7b138f36-1b83-46c2-bcff-84a0f03d3921\") " pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:24 crc kubenswrapper[4778]: I0318 11:13:24.041203 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b138f36-1b83-46c2-bcff-84a0f03d3921-catalog-content\") pod \"community-operators-k6xz8\" (UID: \"7b138f36-1b83-46c2-bcff-84a0f03d3921\") " pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:24 crc kubenswrapper[4778]: I0318 11:13:24.041473 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b138f36-1b83-46c2-bcff-84a0f03d3921-utilities\") pod \"community-operators-k6xz8\" (UID: \"7b138f36-1b83-46c2-bcff-84a0f03d3921\") " pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:24 crc kubenswrapper[4778]: I0318 11:13:24.041538 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b138f36-1b83-46c2-bcff-84a0f03d3921-catalog-content\") pod \"community-operators-k6xz8\" (UID: \"7b138f36-1b83-46c2-bcff-84a0f03d3921\") " pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:24 crc kubenswrapper[4778]: I0318 11:13:24.062786 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwhs5\" (UniqueName: \"kubernetes.io/projected/7b138f36-1b83-46c2-bcff-84a0f03d3921-kube-api-access-zwhs5\") pod \"community-operators-k6xz8\" (UID: \"7b138f36-1b83-46c2-bcff-84a0f03d3921\") " pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:24 crc kubenswrapper[4778]: I0318 11:13:24.072729 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:24 crc kubenswrapper[4778]: I0318 11:13:24.611465 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k6xz8"] Mar 18 11:13:24 crc kubenswrapper[4778]: W0318 11:13:24.612514 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b138f36_1b83_46c2_bcff_84a0f03d3921.slice/crio-780daef24cf10bd44e52b7e1feeb9ea337d24c2ed2241a25071fdf982ddb4787 WatchSource:0}: Error finding container 780daef24cf10bd44e52b7e1feeb9ea337d24c2ed2241a25071fdf982ddb4787: Status 404 returned error can't find the container with id 780daef24cf10bd44e52b7e1feeb9ea337d24c2ed2241a25071fdf982ddb4787 Mar 18 11:13:24 crc kubenswrapper[4778]: I0318 11:13:24.676069 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6xz8" event={"ID":"7b138f36-1b83-46c2-bcff-84a0f03d3921","Type":"ContainerStarted","Data":"780daef24cf10bd44e52b7e1feeb9ea337d24c2ed2241a25071fdf982ddb4787"} Mar 18 11:13:25 crc kubenswrapper[4778]: I0318 11:13:25.684932 4778 generic.go:334] "Generic (PLEG): container finished" podID="7b138f36-1b83-46c2-bcff-84a0f03d3921" containerID="91103dae6e6b95109fd2afea13b2d9b6974d429d0fc9f74cf2bd3fe62794dd76" exitCode=0 Mar 18 11:13:25 crc kubenswrapper[4778]: I0318 11:13:25.684984 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6xz8" event={"ID":"7b138f36-1b83-46c2-bcff-84a0f03d3921","Type":"ContainerDied","Data":"91103dae6e6b95109fd2afea13b2d9b6974d429d0fc9f74cf2bd3fe62794dd76"} Mar 18 11:13:27 crc kubenswrapper[4778]: I0318 11:13:27.702488 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6xz8" 
event={"ID":"7b138f36-1b83-46c2-bcff-84a0f03d3921","Type":"ContainerStarted","Data":"270e327af43bf2fb7926b6c6f6a0c75950656e810dfbc34a3b94a138c0b77d33"} Mar 18 11:13:28 crc kubenswrapper[4778]: E0318 11:13:28.187122 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 11:13:28 crc kubenswrapper[4778]: I0318 11:13:28.713592 4778 generic.go:334] "Generic (PLEG): container finished" podID="7b138f36-1b83-46c2-bcff-84a0f03d3921" containerID="270e327af43bf2fb7926b6c6f6a0c75950656e810dfbc34a3b94a138c0b77d33" exitCode=0 Mar 18 11:13:28 crc kubenswrapper[4778]: I0318 11:13:28.713660 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6xz8" event={"ID":"7b138f36-1b83-46c2-bcff-84a0f03d3921","Type":"ContainerDied","Data":"270e327af43bf2fb7926b6c6f6a0c75950656e810dfbc34a3b94a138c0b77d33"} Mar 18 11:13:29 crc kubenswrapper[4778]: I0318 11:13:29.725955 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6xz8" event={"ID":"7b138f36-1b83-46c2-bcff-84a0f03d3921","Type":"ContainerStarted","Data":"6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390"} Mar 18 11:13:29 crc kubenswrapper[4778]: I0318 11:13:29.754113 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k6xz8" podStartSLOduration=3.277500829 podStartE2EDuration="6.754094914s" podCreationTimestamp="2026-03-18 11:13:23 +0000 UTC" firstStartedPulling="2026-03-18 11:13:25.68789368 +0000 UTC m=+7872.262638520" lastFinishedPulling="2026-03-18 11:13:29.164487765 +0000 UTC m=+7875.739232605" observedRunningTime="2026-03-18 11:13:29.74659213 +0000 UTC m=+7876.321336970" watchObservedRunningTime="2026-03-18 11:13:29.754094914 +0000 UTC m=+7876.328839744" 
Mar 18 11:13:30 crc kubenswrapper[4778]: I0318 11:13:30.147263 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 11:13:30 crc kubenswrapper[4778]: I0318 11:13:30.147540 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 11:13:30 crc kubenswrapper[4778]: I0318 11:13:30.147584 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" Mar 18 11:13:30 crc kubenswrapper[4778]: I0318 11:13:30.148382 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 11:13:30 crc kubenswrapper[4778]: I0318 11:13:30.148438 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" gracePeriod=600 Mar 18 11:13:30 crc kubenswrapper[4778]: E0318 11:13:30.303937 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:13:30 crc kubenswrapper[4778]: I0318 11:13:30.736408 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" exitCode=0 Mar 18 11:13:30 crc kubenswrapper[4778]: I0318 11:13:30.736505 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434"} Mar 18 11:13:30 crc kubenswrapper[4778]: I0318 11:13:30.736588 4778 scope.go:117] "RemoveContainer" containerID="cbe46d0fa65f0ead7ad1789d13ec33e6d93d408d865f08e3bbc666ffb661db35" Mar 18 11:13:30 crc kubenswrapper[4778]: I0318 11:13:30.737680 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:13:30 crc kubenswrapper[4778]: E0318 11:13:30.738053 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:13:34 crc kubenswrapper[4778]: I0318 11:13:34.073076 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:34 crc kubenswrapper[4778]: I0318 11:13:34.073534 4778 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:34 crc kubenswrapper[4778]: I0318 11:13:34.160394 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:34 crc kubenswrapper[4778]: I0318 11:13:34.867404 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:34 crc kubenswrapper[4778]: I0318 11:13:34.933718 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k6xz8"] Mar 18 11:13:36 crc kubenswrapper[4778]: I0318 11:13:36.838070 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k6xz8" podUID="7b138f36-1b83-46c2-bcff-84a0f03d3921" containerName="registry-server" containerID="cri-o://6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390" gracePeriod=2 Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.278942 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.409398 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b138f36-1b83-46c2-bcff-84a0f03d3921-utilities\") pod \"7b138f36-1b83-46c2-bcff-84a0f03d3921\" (UID: \"7b138f36-1b83-46c2-bcff-84a0f03d3921\") " Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.409514 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b138f36-1b83-46c2-bcff-84a0f03d3921-catalog-content\") pod \"7b138f36-1b83-46c2-bcff-84a0f03d3921\" (UID: \"7b138f36-1b83-46c2-bcff-84a0f03d3921\") " Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.409708 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwhs5\" (UniqueName: \"kubernetes.io/projected/7b138f36-1b83-46c2-bcff-84a0f03d3921-kube-api-access-zwhs5\") pod \"7b138f36-1b83-46c2-bcff-84a0f03d3921\" (UID: \"7b138f36-1b83-46c2-bcff-84a0f03d3921\") " Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.422147 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b138f36-1b83-46c2-bcff-84a0f03d3921-utilities" (OuterVolumeSpecName: "utilities") pod "7b138f36-1b83-46c2-bcff-84a0f03d3921" (UID: "7b138f36-1b83-46c2-bcff-84a0f03d3921"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.422260 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b138f36-1b83-46c2-bcff-84a0f03d3921-kube-api-access-zwhs5" (OuterVolumeSpecName: "kube-api-access-zwhs5") pod "7b138f36-1b83-46c2-bcff-84a0f03d3921" (UID: "7b138f36-1b83-46c2-bcff-84a0f03d3921"). InnerVolumeSpecName "kube-api-access-zwhs5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.465914 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b138f36-1b83-46c2-bcff-84a0f03d3921-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b138f36-1b83-46c2-bcff-84a0f03d3921" (UID: "7b138f36-1b83-46c2-bcff-84a0f03d3921"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.512407 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwhs5\" (UniqueName: \"kubernetes.io/projected/7b138f36-1b83-46c2-bcff-84a0f03d3921-kube-api-access-zwhs5\") on node \"crc\" DevicePath \"\"" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.512444 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b138f36-1b83-46c2-bcff-84a0f03d3921-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.512457 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b138f36-1b83-46c2-bcff-84a0f03d3921-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.853003 4778 generic.go:334] "Generic (PLEG): container finished" podID="7b138f36-1b83-46c2-bcff-84a0f03d3921" containerID="6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390" exitCode=0 Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.853050 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6xz8" event={"ID":"7b138f36-1b83-46c2-bcff-84a0f03d3921","Type":"ContainerDied","Data":"6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390"} Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.853083 4778 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-k6xz8" event={"ID":"7b138f36-1b83-46c2-bcff-84a0f03d3921","Type":"ContainerDied","Data":"780daef24cf10bd44e52b7e1feeb9ea337d24c2ed2241a25071fdf982ddb4787"} Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.853102 4778 scope.go:117] "RemoveContainer" containerID="6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.853287 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k6xz8" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.879518 4778 scope.go:117] "RemoveContainer" containerID="270e327af43bf2fb7926b6c6f6a0c75950656e810dfbc34a3b94a138c0b77d33" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.890326 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k6xz8"] Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.901510 4778 scope.go:117] "RemoveContainer" containerID="91103dae6e6b95109fd2afea13b2d9b6974d429d0fc9f74cf2bd3fe62794dd76" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.902295 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k6xz8"] Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.942745 4778 scope.go:117] "RemoveContainer" containerID="6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390" Mar 18 11:13:37 crc kubenswrapper[4778]: E0318 11:13:37.943124 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390\": container with ID starting with 6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390 not found: ID does not exist" containerID="6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 
11:13:37.943164 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390"} err="failed to get container status \"6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390\": rpc error: code = NotFound desc = could not find container \"6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390\": container with ID starting with 6f01e966dad6d1895af428dac5a64fceb9abcda38162511ff673951f58cb1390 not found: ID does not exist" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.943190 4778 scope.go:117] "RemoveContainer" containerID="270e327af43bf2fb7926b6c6f6a0c75950656e810dfbc34a3b94a138c0b77d33" Mar 18 11:13:37 crc kubenswrapper[4778]: E0318 11:13:37.943554 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"270e327af43bf2fb7926b6c6f6a0c75950656e810dfbc34a3b94a138c0b77d33\": container with ID starting with 270e327af43bf2fb7926b6c6f6a0c75950656e810dfbc34a3b94a138c0b77d33 not found: ID does not exist" containerID="270e327af43bf2fb7926b6c6f6a0c75950656e810dfbc34a3b94a138c0b77d33" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.943583 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"270e327af43bf2fb7926b6c6f6a0c75950656e810dfbc34a3b94a138c0b77d33"} err="failed to get container status \"270e327af43bf2fb7926b6c6f6a0c75950656e810dfbc34a3b94a138c0b77d33\": rpc error: code = NotFound desc = could not find container \"270e327af43bf2fb7926b6c6f6a0c75950656e810dfbc34a3b94a138c0b77d33\": container with ID starting with 270e327af43bf2fb7926b6c6f6a0c75950656e810dfbc34a3b94a138c0b77d33 not found: ID does not exist" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.943605 4778 scope.go:117] "RemoveContainer" containerID="91103dae6e6b95109fd2afea13b2d9b6974d429d0fc9f74cf2bd3fe62794dd76" Mar 18 11:13:37 crc 
kubenswrapper[4778]: E0318 11:13:37.943896 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91103dae6e6b95109fd2afea13b2d9b6974d429d0fc9f74cf2bd3fe62794dd76\": container with ID starting with 91103dae6e6b95109fd2afea13b2d9b6974d429d0fc9f74cf2bd3fe62794dd76 not found: ID does not exist" containerID="91103dae6e6b95109fd2afea13b2d9b6974d429d0fc9f74cf2bd3fe62794dd76" Mar 18 11:13:37 crc kubenswrapper[4778]: I0318 11:13:37.943936 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91103dae6e6b95109fd2afea13b2d9b6974d429d0fc9f74cf2bd3fe62794dd76"} err="failed to get container status \"91103dae6e6b95109fd2afea13b2d9b6974d429d0fc9f74cf2bd3fe62794dd76\": rpc error: code = NotFound desc = could not find container \"91103dae6e6b95109fd2afea13b2d9b6974d429d0fc9f74cf2bd3fe62794dd76\": container with ID starting with 91103dae6e6b95109fd2afea13b2d9b6974d429d0fc9f74cf2bd3fe62794dd76 not found: ID does not exist" Mar 18 11:13:38 crc kubenswrapper[4778]: I0318 11:13:38.196920 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b138f36-1b83-46c2-bcff-84a0f03d3921" path="/var/lib/kubelet/pods/7b138f36-1b83-46c2-bcff-84a0f03d3921/volumes" Mar 18 11:13:42 crc kubenswrapper[4778]: I0318 11:13:42.642414 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-sv9kd_1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c/kube-rbac-proxy/0.log" Mar 18 11:13:42 crc kubenswrapper[4778]: I0318 11:13:42.730635 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-sv9kd_1ddcd9d2-a5d0-4773-93f5-8eb9c0fff72c/controller/0.log" Mar 18 11:13:42 crc kubenswrapper[4778]: I0318 11:13:42.869287 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-frr-files/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: 
I0318 11:13:43.025767 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-frr-files/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.083345 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-metrics/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.084954 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-reloader/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.113919 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-reloader/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.295758 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-frr-files/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.297994 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-metrics/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.331357 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-reloader/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.362347 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-metrics/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.508118 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-frr-files/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.555142 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-metrics/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.620469 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/cp-reloader/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.637029 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/controller/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.742069 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/frr-metrics/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.842709 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/kube-rbac-proxy/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.858459 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/kube-rbac-proxy-frr/0.log" Mar 18 11:13:43 crc kubenswrapper[4778]: I0318 11:13:43.948281 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/reloader/0.log" Mar 18 11:13:44 crc kubenswrapper[4778]: I0318 11:13:44.116712 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-jrtjv_0f18e9f0-b3eb-440a-b035-ed8256df5ed9/frr-k8s-webhook-server/0.log" Mar 18 11:13:44 crc kubenswrapper[4778]: I0318 11:13:44.272005 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-78856dcdc4-9cltx_721ee07f-fded-43ab-9bb7-2e4e56c98515/manager/0.log" Mar 18 11:13:44 crc kubenswrapper[4778]: I0318 11:13:44.361287 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5b499db45c-c5tcr_75885bb8-adce-4801-8941-75042ab330ea/webhook-server/0.log" Mar 18 11:13:44 crc kubenswrapper[4778]: I0318 11:13:44.780344 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wd69x_1c97662e-d673-42c1-a6ad-75865ba2b8b6/kube-rbac-proxy/0.log" Mar 18 11:13:45 crc kubenswrapper[4778]: I0318 11:13:45.187755 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:13:45 crc kubenswrapper[4778]: E0318 11:13:45.188075 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:13:45 crc kubenswrapper[4778]: I0318 11:13:45.360944 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wd69x_1c97662e-d673-42c1-a6ad-75865ba2b8b6/speaker/0.log" Mar 18 11:13:46 crc kubenswrapper[4778]: I0318 11:13:46.025744 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g2q8m_5efed87b-ad9c-4703-b3c4-2d6ab8d0883b/frr/0.log" Mar 18 11:13:58 crc kubenswrapper[4778]: I0318 11:13:58.188446 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:13:58 crc kubenswrapper[4778]: E0318 11:13:58.189167 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:13:59 crc kubenswrapper[4778]: I0318 11:13:59.110019 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/util/0.log" Mar 18 11:13:59 crc kubenswrapper[4778]: I0318 11:13:59.339957 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/util/0.log" Mar 18 11:13:59 crc kubenswrapper[4778]: I0318 11:13:59.341638 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/pull/0.log" Mar 18 11:13:59 crc kubenswrapper[4778]: I0318 11:13:59.444920 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/pull/0.log" Mar 18 11:13:59 crc kubenswrapper[4778]: I0318 11:13:59.592783 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/util/0.log" Mar 18 11:13:59 crc kubenswrapper[4778]: I0318 11:13:59.604054 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/extract/0.log" Mar 18 11:13:59 crc kubenswrapper[4778]: I0318 11:13:59.616538 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874t9x49_85a942ea-cebf-408c-95b8-f435630b20ad/pull/0.log" Mar 18 11:13:59 crc kubenswrapper[4778]: I0318 11:13:59.776087 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/util/0.log" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.009785 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/util/0.log" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.047098 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/pull/0.log" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.072137 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/pull/0.log" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.156871 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563874-hj4l4"] Mar 18 11:14:00 crc kubenswrapper[4778]: E0318 11:14:00.157332 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b138f36-1b83-46c2-bcff-84a0f03d3921" containerName="registry-server" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.157348 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b138f36-1b83-46c2-bcff-84a0f03d3921" containerName="registry-server" Mar 18 11:14:00 crc kubenswrapper[4778]: E0318 11:14:00.157374 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b138f36-1b83-46c2-bcff-84a0f03d3921" containerName="extract-content" Mar 18 11:14:00 crc 
kubenswrapper[4778]: I0318 11:14:00.157381 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b138f36-1b83-46c2-bcff-84a0f03d3921" containerName="extract-content" Mar 18 11:14:00 crc kubenswrapper[4778]: E0318 11:14:00.157402 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b138f36-1b83-46c2-bcff-84a0f03d3921" containerName="extract-utilities" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.157411 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b138f36-1b83-46c2-bcff-84a0f03d3921" containerName="extract-utilities" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.157685 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b138f36-1b83-46c2-bcff-84a0f03d3921" containerName="registry-server" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.158519 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563874-hj4l4" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.160213 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.161083 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.161888 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.173893 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563874-hj4l4"] Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.250405 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/extract/0.log" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 
11:14:00.273854 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95jdw\" (UniqueName: \"kubernetes.io/projected/894be30f-4dc9-4a4c-b443-2393b89df180-kube-api-access-95jdw\") pod \"auto-csr-approver-29563874-hj4l4\" (UID: \"894be30f-4dc9-4a4c-b443-2393b89df180\") " pod="openshift-infra/auto-csr-approver-29563874-hj4l4" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.295666 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/util/0.log" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.317125 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bljjl_2416fdd2-138d-4320-8ff6-47f621e093a9/pull/0.log" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.376131 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95jdw\" (UniqueName: \"kubernetes.io/projected/894be30f-4dc9-4a4c-b443-2393b89df180-kube-api-access-95jdw\") pod \"auto-csr-approver-29563874-hj4l4\" (UID: \"894be30f-4dc9-4a4c-b443-2393b89df180\") " pod="openshift-infra/auto-csr-approver-29563874-hj4l4" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.399016 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95jdw\" (UniqueName: \"kubernetes.io/projected/894be30f-4dc9-4a4c-b443-2393b89df180-kube-api-access-95jdw\") pod \"auto-csr-approver-29563874-hj4l4\" (UID: \"894be30f-4dc9-4a4c-b443-2393b89df180\") " pod="openshift-infra/auto-csr-approver-29563874-hj4l4" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.437047 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/extract-utilities/0.log" Mar 18 11:14:00 crc 
kubenswrapper[4778]: I0318 11:14:00.477085 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563874-hj4l4" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.682371 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/extract-utilities/0.log" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.724395 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/extract-content/0.log" Mar 18 11:14:00 crc kubenswrapper[4778]: I0318 11:14:00.736272 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/extract-content/0.log" Mar 18 11:14:01 crc kubenswrapper[4778]: I0318 11:14:01.011819 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563874-hj4l4"] Mar 18 11:14:01 crc kubenswrapper[4778]: I0318 11:14:01.044445 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563874-hj4l4" event={"ID":"894be30f-4dc9-4a4c-b443-2393b89df180","Type":"ContainerStarted","Data":"552bca3d7cacd76ec3cc9a9d5dae7c12a8765fdd6cc731d4701158a4a76c124e"} Mar 18 11:14:01 crc kubenswrapper[4778]: I0318 11:14:01.150800 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/extract-content/0.log" Mar 18 11:14:01 crc kubenswrapper[4778]: I0318 11:14:01.197544 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/extract-utilities/0.log" Mar 18 11:14:01 crc kubenswrapper[4778]: I0318 11:14:01.384860 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/extract-utilities/0.log" Mar 18 11:14:01 crc kubenswrapper[4778]: I0318 11:14:01.569872 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/extract-utilities/0.log" Mar 18 11:14:01 crc kubenswrapper[4778]: I0318 11:14:01.584409 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/extract-content/0.log" Mar 18 11:14:01 crc kubenswrapper[4778]: I0318 11:14:01.682332 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/extract-content/0.log" Mar 18 11:14:01 crc kubenswrapper[4778]: I0318 11:14:01.874879 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/extract-utilities/0.log" Mar 18 11:14:01 crc kubenswrapper[4778]: I0318 11:14:01.933958 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/extract-content/0.log" Mar 18 11:14:02 crc kubenswrapper[4778]: I0318 11:14:02.081759 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csm2z_83efc97a-1a91-4bc8-90bf-a78bc8ee90e3/registry-server/0.log" Mar 18 11:14:02 crc kubenswrapper[4778]: I0318 11:14:02.130415 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-jj774_e037e8cd-1543-49a8-9389-4cc6f440c4b3/marketplace-operator/0.log" Mar 18 11:14:02 crc kubenswrapper[4778]: I0318 11:14:02.337937 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/extract-utilities/0.log" Mar 18 11:14:02 crc kubenswrapper[4778]: I0318 11:14:02.527514 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/extract-utilities/0.log" Mar 18 11:14:02 crc kubenswrapper[4778]: I0318 11:14:02.641071 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/extract-content/0.log" Mar 18 11:14:02 crc kubenswrapper[4778]: I0318 11:14:02.719423 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/extract-content/0.log" Mar 18 11:14:02 crc kubenswrapper[4778]: I0318 11:14:02.754850 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jztl6_0c0dfa2e-b334-4eed-9e2f-3097f2b5102a/registry-server/0.log" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.081016 4778 generic.go:334] "Generic (PLEG): container finished" podID="894be30f-4dc9-4a4c-b443-2393b89df180" containerID="8d49e3ed1fe8885369ea3edca8ac073df8e6a13bd130a54cf891e878c3833fb8" exitCode=0 Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.081057 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563874-hj4l4" event={"ID":"894be30f-4dc9-4a4c-b443-2393b89df180","Type":"ContainerDied","Data":"8d49e3ed1fe8885369ea3edca8ac073df8e6a13bd130a54cf891e878c3833fb8"} Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.179495 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5fvxm"] Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.181469 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.190671 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5fvxm"] Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.209385 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/extract-utilities/0.log" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.209719 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/extract-content/0.log" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.234830 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-utilities\") pod \"redhat-operators-5fvxm\" (UID: \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\") " pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.234912 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-catalog-content\") pod \"redhat-operators-5fvxm\" (UID: \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\") " pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.234967 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f8tm\" (UniqueName: \"kubernetes.io/projected/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-kube-api-access-9f8tm\") pod \"redhat-operators-5fvxm\" (UID: \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\") " pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 
11:14:03.336623 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-utilities\") pod \"redhat-operators-5fvxm\" (UID: \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\") " pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.336692 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-catalog-content\") pod \"redhat-operators-5fvxm\" (UID: \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\") " pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.336780 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f8tm\" (UniqueName: \"kubernetes.io/projected/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-kube-api-access-9f8tm\") pod \"redhat-operators-5fvxm\" (UID: \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\") " pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.338708 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-utilities\") pod \"redhat-operators-5fvxm\" (UID: \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\") " pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.341307 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-catalog-content\") pod \"redhat-operators-5fvxm\" (UID: \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\") " pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.355421 4778 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-9f8tm\" (UniqueName: \"kubernetes.io/projected/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-kube-api-access-9f8tm\") pod \"redhat-operators-5fvxm\" (UID: \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\") " pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.470282 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/extract-utilities/0.log" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.518008 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.542298 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xs85d_0eaac9b5-67d6-4187-b118-0add20190689/registry-server/0.log" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.831125 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5fvxm"] Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.944609 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/extract-content/0.log" Mar 18 11:14:03 crc kubenswrapper[4778]: I0318 11:14:03.975390 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/extract-utilities/0.log" Mar 18 11:14:04 crc kubenswrapper[4778]: I0318 11:14:04.078656 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/extract-content/0.log" Mar 18 11:14:04 crc kubenswrapper[4778]: I0318 11:14:04.138866 4778 generic.go:334] "Generic (PLEG): container finished" podID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" 
containerID="18c4e0864d1c3a856c065701ac864fb3ba1866d21823cfb3a7d7f25483c6d84e" exitCode=0 Mar 18 11:14:04 crc kubenswrapper[4778]: I0318 11:14:04.139277 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fvxm" event={"ID":"3c19e2f3-650b-4de7-8f71-f4ea6631c79c","Type":"ContainerDied","Data":"18c4e0864d1c3a856c065701ac864fb3ba1866d21823cfb3a7d7f25483c6d84e"} Mar 18 11:14:04 crc kubenswrapper[4778]: I0318 11:14:04.139332 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fvxm" event={"ID":"3c19e2f3-650b-4de7-8f71-f4ea6631c79c","Type":"ContainerStarted","Data":"eceaba1cadc94da3645fb8bab4d0ecd5290d1d81ac72c2392e76e15fcc8275b3"} Mar 18 11:14:04 crc kubenswrapper[4778]: I0318 11:14:04.169934 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/extract-utilities/0.log" Mar 18 11:14:04 crc kubenswrapper[4778]: I0318 11:14:04.230867 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/extract-content/0.log" Mar 18 11:14:04 crc kubenswrapper[4778]: I0318 11:14:04.636863 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563874-hj4l4" Mar 18 11:14:04 crc kubenswrapper[4778]: I0318 11:14:04.692827 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95jdw\" (UniqueName: \"kubernetes.io/projected/894be30f-4dc9-4a4c-b443-2393b89df180-kube-api-access-95jdw\") pod \"894be30f-4dc9-4a4c-b443-2393b89df180\" (UID: \"894be30f-4dc9-4a4c-b443-2393b89df180\") " Mar 18 11:14:04 crc kubenswrapper[4778]: I0318 11:14:04.698966 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/894be30f-4dc9-4a4c-b443-2393b89df180-kube-api-access-95jdw" (OuterVolumeSpecName: "kube-api-access-95jdw") pod "894be30f-4dc9-4a4c-b443-2393b89df180" (UID: "894be30f-4dc9-4a4c-b443-2393b89df180"). InnerVolumeSpecName "kube-api-access-95jdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:14:04 crc kubenswrapper[4778]: I0318 11:14:04.799124 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95jdw\" (UniqueName: \"kubernetes.io/projected/894be30f-4dc9-4a4c-b443-2393b89df180-kube-api-access-95jdw\") on node \"crc\" DevicePath \"\"" Mar 18 11:14:05 crc kubenswrapper[4778]: I0318 11:14:05.152934 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563874-hj4l4" event={"ID":"894be30f-4dc9-4a4c-b443-2393b89df180","Type":"ContainerDied","Data":"552bca3d7cacd76ec3cc9a9d5dae7c12a8765fdd6cc731d4701158a4a76c124e"} Mar 18 11:14:05 crc kubenswrapper[4778]: I0318 11:14:05.153258 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="552bca3d7cacd76ec3cc9a9d5dae7c12a8765fdd6cc731d4701158a4a76c124e" Mar 18 11:14:05 crc kubenswrapper[4778]: I0318 11:14:05.153308 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563874-hj4l4" Mar 18 11:14:05 crc kubenswrapper[4778]: I0318 11:14:05.176837 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9b8p9_f9a557a7-2d98-4e56-8119-acfd64357871/registry-server/0.log" Mar 18 11:14:05 crc kubenswrapper[4778]: I0318 11:14:05.703909 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563868-txd2q"] Mar 18 11:14:05 crc kubenswrapper[4778]: I0318 11:14:05.718042 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563868-txd2q"] Mar 18 11:14:06 crc kubenswrapper[4778]: I0318 11:14:06.163907 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fvxm" event={"ID":"3c19e2f3-650b-4de7-8f71-f4ea6631c79c","Type":"ContainerStarted","Data":"804f591b3fe9dccc2cb37c45fd4b65887e293fc6a8c481be389bb6d97d31b182"} Mar 18 11:14:06 crc kubenswrapper[4778]: I0318 11:14:06.197308 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48c91868-ef15-4d6d-8547-1b2849d7aa95" path="/var/lib/kubelet/pods/48c91868-ef15-4d6d-8547-1b2849d7aa95/volumes" Mar 18 11:14:09 crc kubenswrapper[4778]: I0318 11:14:09.187454 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:14:09 crc kubenswrapper[4778]: E0318 11:14:09.188325 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:14:11 crc kubenswrapper[4778]: I0318 11:14:11.212334 4778 generic.go:334] "Generic 
(PLEG): container finished" podID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" containerID="804f591b3fe9dccc2cb37c45fd4b65887e293fc6a8c481be389bb6d97d31b182" exitCode=0 Mar 18 11:14:11 crc kubenswrapper[4778]: I0318 11:14:11.212425 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fvxm" event={"ID":"3c19e2f3-650b-4de7-8f71-f4ea6631c79c","Type":"ContainerDied","Data":"804f591b3fe9dccc2cb37c45fd4b65887e293fc6a8c481be389bb6d97d31b182"} Mar 18 11:14:12 crc kubenswrapper[4778]: I0318 11:14:12.230181 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fvxm" event={"ID":"3c19e2f3-650b-4de7-8f71-f4ea6631c79c","Type":"ContainerStarted","Data":"2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84"} Mar 18 11:14:12 crc kubenswrapper[4778]: I0318 11:14:12.288536 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5fvxm" podStartSLOduration=1.753011698 podStartE2EDuration="9.288512747s" podCreationTimestamp="2026-03-18 11:14:03 +0000 UTC" firstStartedPulling="2026-03-18 11:14:04.142568931 +0000 UTC m=+7910.717313771" lastFinishedPulling="2026-03-18 11:14:11.67806998 +0000 UTC m=+7918.252814820" observedRunningTime="2026-03-18 11:14:12.27575107 +0000 UTC m=+7918.850495930" watchObservedRunningTime="2026-03-18 11:14:12.288512747 +0000 UTC m=+7918.863257597" Mar 18 11:14:13 crc kubenswrapper[4778]: I0318 11:14:13.518581 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:13 crc kubenswrapper[4778]: I0318 11:14:13.518906 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:14 crc kubenswrapper[4778]: I0318 11:14:14.567898 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5fvxm" 
podUID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" containerName="registry-server" probeResult="failure" output=< Mar 18 11:14:14 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Mar 18 11:14:14 crc kubenswrapper[4778]: > Mar 18 11:14:22 crc kubenswrapper[4778]: E0318 11:14:22.667654 4778 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.70:42540->38.102.83.70:35463: write tcp 38.102.83.70:42540->38.102.83.70:35463: write: broken pipe Mar 18 11:14:23 crc kubenswrapper[4778]: I0318 11:14:23.187901 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:14:23 crc kubenswrapper[4778]: E0318 11:14:23.188533 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:14:23 crc kubenswrapper[4778]: I0318 11:14:23.568528 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:23 crc kubenswrapper[4778]: I0318 11:14:23.626502 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:23 crc kubenswrapper[4778]: I0318 11:14:23.812893 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5fvxm"] Mar 18 11:14:25 crc kubenswrapper[4778]: I0318 11:14:25.333778 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5fvxm" podUID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" containerName="registry-server" 
containerID="cri-o://2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84" gracePeriod=2 Mar 18 11:14:25 crc kubenswrapper[4778]: I0318 11:14:25.885830 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5fvxm" Mar 18 11:14:25 crc kubenswrapper[4778]: I0318 11:14:25.965781 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-catalog-content\") pod \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\" (UID: \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\") " Mar 18 11:14:25 crc kubenswrapper[4778]: I0318 11:14:25.965905 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f8tm\" (UniqueName: \"kubernetes.io/projected/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-kube-api-access-9f8tm\") pod \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\" (UID: \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\") " Mar 18 11:14:25 crc kubenswrapper[4778]: I0318 11:14:25.965978 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-utilities\") pod \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\" (UID: \"3c19e2f3-650b-4de7-8f71-f4ea6631c79c\") " Mar 18 11:14:25 crc kubenswrapper[4778]: I0318 11:14:25.967456 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-utilities" (OuterVolumeSpecName: "utilities") pod "3c19e2f3-650b-4de7-8f71-f4ea6631c79c" (UID: "3c19e2f3-650b-4de7-8f71-f4ea6631c79c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:14:25 crc kubenswrapper[4778]: I0318 11:14:25.972501 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-kube-api-access-9f8tm" (OuterVolumeSpecName: "kube-api-access-9f8tm") pod "3c19e2f3-650b-4de7-8f71-f4ea6631c79c" (UID: "3c19e2f3-650b-4de7-8f71-f4ea6631c79c"). InnerVolumeSpecName "kube-api-access-9f8tm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.072826 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f8tm\" (UniqueName: \"kubernetes.io/projected/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-kube-api-access-9f8tm\") on node \"crc\" DevicePath \"\"" Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.072859 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.115286 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c19e2f3-650b-4de7-8f71-f4ea6631c79c" (UID: "3c19e2f3-650b-4de7-8f71-f4ea6631c79c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.174690 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c19e2f3-650b-4de7-8f71-f4ea6631c79c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.345932 4778 generic.go:334] "Generic (PLEG): container finished" podID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" containerID="2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84" exitCode=0 Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.345976 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fvxm" event={"ID":"3c19e2f3-650b-4de7-8f71-f4ea6631c79c","Type":"ContainerDied","Data":"2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84"} Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.346001 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fvxm" event={"ID":"3c19e2f3-650b-4de7-8f71-f4ea6631c79c","Type":"ContainerDied","Data":"eceaba1cadc94da3645fb8bab4d0ecd5290d1d81ac72c2392e76e15fcc8275b3"} Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.346020 4778 scope.go:117] "RemoveContainer" containerID="2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84" Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.346147 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5fvxm"
Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.374560 4778 scope.go:117] "RemoveContainer" containerID="804f591b3fe9dccc2cb37c45fd4b65887e293fc6a8c481be389bb6d97d31b182"
Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.378489 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5fvxm"]
Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.390006 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5fvxm"]
Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.434921 4778 scope.go:117] "RemoveContainer" containerID="18c4e0864d1c3a856c065701ac864fb3ba1866d21823cfb3a7d7f25483c6d84e"
Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.493676 4778 scope.go:117] "RemoveContainer" containerID="2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84"
Mar 18 11:14:26 crc kubenswrapper[4778]: E0318 11:14:26.494676 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84\": container with ID starting with 2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84 not found: ID does not exist" containerID="2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84"
Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.494711 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84"} err="failed to get container status \"2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84\": rpc error: code = NotFound desc = could not find container \"2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84\": container with ID starting with 2ac07ab62ac0a9adfd2344b3bfbbb81fbac963ae11bf58569ebe607d540aae84 not found: ID does not exist"
Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.494727 4778 scope.go:117] "RemoveContainer" containerID="804f591b3fe9dccc2cb37c45fd4b65887e293fc6a8c481be389bb6d97d31b182"
Mar 18 11:14:26 crc kubenswrapper[4778]: E0318 11:14:26.496255 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"804f591b3fe9dccc2cb37c45fd4b65887e293fc6a8c481be389bb6d97d31b182\": container with ID starting with 804f591b3fe9dccc2cb37c45fd4b65887e293fc6a8c481be389bb6d97d31b182 not found: ID does not exist" containerID="804f591b3fe9dccc2cb37c45fd4b65887e293fc6a8c481be389bb6d97d31b182"
Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.496282 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"804f591b3fe9dccc2cb37c45fd4b65887e293fc6a8c481be389bb6d97d31b182"} err="failed to get container status \"804f591b3fe9dccc2cb37c45fd4b65887e293fc6a8c481be389bb6d97d31b182\": rpc error: code = NotFound desc = could not find container \"804f591b3fe9dccc2cb37c45fd4b65887e293fc6a8c481be389bb6d97d31b182\": container with ID starting with 804f591b3fe9dccc2cb37c45fd4b65887e293fc6a8c481be389bb6d97d31b182 not found: ID does not exist"
Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.496296 4778 scope.go:117] "RemoveContainer" containerID="18c4e0864d1c3a856c065701ac864fb3ba1866d21823cfb3a7d7f25483c6d84e"
Mar 18 11:14:26 crc kubenswrapper[4778]: E0318 11:14:26.500321 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18c4e0864d1c3a856c065701ac864fb3ba1866d21823cfb3a7d7f25483c6d84e\": container with ID starting with 18c4e0864d1c3a856c065701ac864fb3ba1866d21823cfb3a7d7f25483c6d84e not found: ID does not exist" containerID="18c4e0864d1c3a856c065701ac864fb3ba1866d21823cfb3a7d7f25483c6d84e"
Mar 18 11:14:26 crc kubenswrapper[4778]: I0318 11:14:26.500366 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18c4e0864d1c3a856c065701ac864fb3ba1866d21823cfb3a7d7f25483c6d84e"} err="failed to get container status \"18c4e0864d1c3a856c065701ac864fb3ba1866d21823cfb3a7d7f25483c6d84e\": rpc error: code = NotFound desc = could not find container \"18c4e0864d1c3a856c065701ac864fb3ba1866d21823cfb3a7d7f25483c6d84e\": container with ID starting with 18c4e0864d1c3a856c065701ac864fb3ba1866d21823cfb3a7d7f25483c6d84e not found: ID does not exist"
Mar 18 11:14:28 crc kubenswrapper[4778]: I0318 11:14:28.200489 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" path="/var/lib/kubelet/pods/3c19e2f3-650b-4de7-8f71-f4ea6631c79c/volumes"
Mar 18 11:14:37 crc kubenswrapper[4778]: I0318 11:14:37.187240 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434"
Mar 18 11:14:37 crc kubenswrapper[4778]: E0318 11:14:37.188415 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:14:38 crc kubenswrapper[4778]: I0318 11:14:38.837985 4778 scope.go:117] "RemoveContainer" containerID="ecc2f8a6686d5391d07b662f53f7a3bdd9927adf67509a229601871555c0b456"
Mar 18 11:14:52 crc kubenswrapper[4778]: I0318 11:14:52.189115 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434"
Mar 18 11:14:52 crc kubenswrapper[4778]: E0318 11:14:52.189975 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:14:53 crc kubenswrapper[4778]: E0318 11:14:53.187432 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes"
Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.171765 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg"]
Mar 18 11:15:00 crc kubenswrapper[4778]: E0318 11:15:00.173216 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" containerName="registry-server"
Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.173242 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" containerName="registry-server"
Mar 18 11:15:00 crc kubenswrapper[4778]: E0318 11:15:00.173300 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" containerName="extract-utilities"
Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.173313 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" containerName="extract-utilities"
Mar 18 11:15:00 crc kubenswrapper[4778]: E0318 11:15:00.173384 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="894be30f-4dc9-4a4c-b443-2393b89df180" containerName="oc"
Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.173397 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="894be30f-4dc9-4a4c-b443-2393b89df180" containerName="oc"
Mar 18 11:15:00 crc kubenswrapper[4778]: E0318 11:15:00.173441 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" containerName="extract-content"
Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.173452 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" containerName="extract-content"
Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.173992 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c19e2f3-650b-4de7-8f71-f4ea6631c79c" containerName="registry-server"
Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.174054 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="894be30f-4dc9-4a4c-b443-2393b89df180" containerName="oc"
Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.175187 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg"
Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.178167 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.179022 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.202693 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg"]
Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.276279 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-config-volume\") pod \"collect-profiles-29563875-4mxkg\" (UID: \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg"
Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.276336 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5prz\" (UniqueName: \"kubernetes.io/projected/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-kube-api-access-l5prz\") pod \"collect-profiles-29563875-4mxkg\" (UID: \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg"
Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.276930 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-secret-volume\") pod \"collect-profiles-29563875-4mxkg\" (UID: \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg"
Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.379805 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-secret-volume\") pod \"collect-profiles-29563875-4mxkg\" (UID: \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg"
Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.379970 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-config-volume\") pod \"collect-profiles-29563875-4mxkg\" (UID: \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg"
Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.380007 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5prz\" (UniqueName: \"kubernetes.io/projected/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-kube-api-access-l5prz\") pod \"collect-profiles-29563875-4mxkg\" (UID: \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg"
Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.383170 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-config-volume\") pod \"collect-profiles-29563875-4mxkg\" (UID: \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg"
Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.404065 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-secret-volume\") pod \"collect-profiles-29563875-4mxkg\" (UID: \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg"
Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.409678 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5prz\" (UniqueName: \"kubernetes.io/projected/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-kube-api-access-l5prz\") pod \"collect-profiles-29563875-4mxkg\" (UID: \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg"
Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.500372 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg"
Mar 18 11:15:00 crc kubenswrapper[4778]: I0318 11:15:00.974377 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg"]
Mar 18 11:15:00 crc kubenswrapper[4778]: W0318 11:15:00.980812 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ef320b3_caf2_4ff6_aa7e_1a5e059effff.slice/crio-01f33ed17e5733897772695cc06cbddfe938fe839de0ab8a747f8a2f760c3c80 WatchSource:0}: Error finding container 01f33ed17e5733897772695cc06cbddfe938fe839de0ab8a747f8a2f760c3c80: Status 404 returned error can't find the container with id 01f33ed17e5733897772695cc06cbddfe938fe839de0ab8a747f8a2f760c3c80
Mar 18 11:15:01 crc kubenswrapper[4778]: I0318 11:15:01.719657 4778 generic.go:334] "Generic (PLEG): container finished" podID="5ef320b3-caf2-4ff6-aa7e-1a5e059effff" containerID="83ccfb7a3c0a73cfc5ac20a3b0e1058355f9ff7ce52c96b3182b319f43df6e02" exitCode=0
Mar 18 11:15:01 crc kubenswrapper[4778]: I0318 11:15:01.719917 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg" event={"ID":"5ef320b3-caf2-4ff6-aa7e-1a5e059effff","Type":"ContainerDied","Data":"83ccfb7a3c0a73cfc5ac20a3b0e1058355f9ff7ce52c96b3182b319f43df6e02"}
Mar 18 11:15:01 crc kubenswrapper[4778]: I0318 11:15:01.720110 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg" event={"ID":"5ef320b3-caf2-4ff6-aa7e-1a5e059effff","Type":"ContainerStarted","Data":"01f33ed17e5733897772695cc06cbddfe938fe839de0ab8a747f8a2f760c3c80"}
Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.145998 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg"
Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.162949 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-config-volume\") pod \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\" (UID: \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\") "
Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.163052 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5prz\" (UniqueName: \"kubernetes.io/projected/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-kube-api-access-l5prz\") pod \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\" (UID: \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\") "
Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.163408 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-secret-volume\") pod \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\" (UID: \"5ef320b3-caf2-4ff6-aa7e-1a5e059effff\") "
Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.167327 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-config-volume" (OuterVolumeSpecName: "config-volume") pod "5ef320b3-caf2-4ff6-aa7e-1a5e059effff" (UID: "5ef320b3-caf2-4ff6-aa7e-1a5e059effff"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.171802 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-kube-api-access-l5prz" (OuterVolumeSpecName: "kube-api-access-l5prz") pod "5ef320b3-caf2-4ff6-aa7e-1a5e059effff" (UID: "5ef320b3-caf2-4ff6-aa7e-1a5e059effff"). InnerVolumeSpecName "kube-api-access-l5prz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.182524 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5ef320b3-caf2-4ff6-aa7e-1a5e059effff" (UID: "5ef320b3-caf2-4ff6-aa7e-1a5e059effff"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.266834 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-config-volume\") on node \"crc\" DevicePath \"\""
Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.266882 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5prz\" (UniqueName: \"kubernetes.io/projected/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-kube-api-access-l5prz\") on node \"crc\" DevicePath \"\""
Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.266897 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5ef320b3-caf2-4ff6-aa7e-1a5e059effff-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.739212 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg" event={"ID":"5ef320b3-caf2-4ff6-aa7e-1a5e059effff","Type":"ContainerDied","Data":"01f33ed17e5733897772695cc06cbddfe938fe839de0ab8a747f8a2f760c3c80"}
Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.739252 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01f33ed17e5733897772695cc06cbddfe938fe839de0ab8a747f8a2f760c3c80"
Mar 18 11:15:03 crc kubenswrapper[4778]: I0318 11:15:03.739311 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563875-4mxkg"
Mar 18 11:15:04 crc kubenswrapper[4778]: I0318 11:15:04.232280 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m"]
Mar 18 11:15:04 crc kubenswrapper[4778]: I0318 11:15:04.243166 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563830-9w24m"]
Mar 18 11:15:06 crc kubenswrapper[4778]: I0318 11:15:06.201980 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="688101ed-133b-42c6-87f0-fb2ce2afa33f" path="/var/lib/kubelet/pods/688101ed-133b-42c6-87f0-fb2ce2afa33f/volumes"
Mar 18 11:15:07 crc kubenswrapper[4778]: I0318 11:15:07.187064 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434"
Mar 18 11:15:07 crc kubenswrapper[4778]: E0318 11:15:07.187584 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:15:22 crc kubenswrapper[4778]: I0318 11:15:22.187534 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434"
Mar 18 11:15:22 crc kubenswrapper[4778]: E0318 11:15:22.188436 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:15:37 crc kubenswrapper[4778]: I0318 11:15:37.188784 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434"
Mar 18 11:15:37 crc kubenswrapper[4778]: E0318 11:15:37.189714 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:15:38 crc kubenswrapper[4778]: I0318 11:15:38.923320 4778 scope.go:117] "RemoveContainer" containerID="91439ddaf1c7b64a7912887de697803bd3f4ff4a97a1ee187c7b7ad2914b7556"
Mar 18 11:15:49 crc kubenswrapper[4778]: I0318 11:15:49.188293 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434"
Mar 18 11:15:49 crc kubenswrapper[4778]: E0318 11:15:49.189258 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.181010 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563876-sl728"]
Mar 18 11:16:00 crc kubenswrapper[4778]: E0318 11:16:00.182057 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ef320b3-caf2-4ff6-aa7e-1a5e059effff" containerName="collect-profiles"
Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.182074 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ef320b3-caf2-4ff6-aa7e-1a5e059effff" containerName="collect-profiles"
Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.182338 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ef320b3-caf2-4ff6-aa7e-1a5e059effff" containerName="collect-profiles"
Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.183167 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563876-sl728"
Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.189093 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.189322 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.189551 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6"
Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.208055 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563876-sl728"]
Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.332687 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2h2d\" (UniqueName: \"kubernetes.io/projected/f1897e5a-c532-4379-9a46-ad5355a45122-kube-api-access-z2h2d\") pod \"auto-csr-approver-29563876-sl728\" (UID: \"f1897e5a-c532-4379-9a46-ad5355a45122\") " pod="openshift-infra/auto-csr-approver-29563876-sl728"
Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.434471 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2h2d\" (UniqueName: \"kubernetes.io/projected/f1897e5a-c532-4379-9a46-ad5355a45122-kube-api-access-z2h2d\") pod \"auto-csr-approver-29563876-sl728\" (UID: \"f1897e5a-c532-4379-9a46-ad5355a45122\") " pod="openshift-infra/auto-csr-approver-29563876-sl728"
Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.458639 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2h2d\" (UniqueName: \"kubernetes.io/projected/f1897e5a-c532-4379-9a46-ad5355a45122-kube-api-access-z2h2d\") pod \"auto-csr-approver-29563876-sl728\" (UID: \"f1897e5a-c532-4379-9a46-ad5355a45122\") " pod="openshift-infra/auto-csr-approver-29563876-sl728"
Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.508142 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563876-sl728"
Mar 18 11:16:00 crc kubenswrapper[4778]: I0318 11:16:00.980882 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563876-sl728"]
Mar 18 11:16:01 crc kubenswrapper[4778]: I0318 11:16:01.358289 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563876-sl728" event={"ID":"f1897e5a-c532-4379-9a46-ad5355a45122","Type":"ContainerStarted","Data":"9d38eeb743b07219d50246a19bcd998b3678d8cc4be7099190b24f6d4a03bec3"}
Mar 18 11:16:02 crc kubenswrapper[4778]: E0318 11:16:02.189117 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes"
Mar 18 11:16:03 crc kubenswrapper[4778]: I0318 11:16:03.381349 4778 generic.go:334] "Generic (PLEG): container finished" podID="f1897e5a-c532-4379-9a46-ad5355a45122" containerID="149d8c7a5b7b3e6ba4cd600a7a38eec0eda785a24c394741cb2942187806b242" exitCode=0
Mar 18 11:16:03 crc kubenswrapper[4778]: I0318 11:16:03.381674 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563876-sl728" event={"ID":"f1897e5a-c532-4379-9a46-ad5355a45122","Type":"ContainerDied","Data":"149d8c7a5b7b3e6ba4cd600a7a38eec0eda785a24c394741cb2942187806b242"}
Mar 18 11:16:04 crc kubenswrapper[4778]: I0318 11:16:04.198080 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434"
Mar 18 11:16:04 crc kubenswrapper[4778]: E0318 11:16:04.198647 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:16:04 crc kubenswrapper[4778]: I0318 11:16:04.739472 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563876-sl728"
Mar 18 11:16:04 crc kubenswrapper[4778]: I0318 11:16:04.830280 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2h2d\" (UniqueName: \"kubernetes.io/projected/f1897e5a-c532-4379-9a46-ad5355a45122-kube-api-access-z2h2d\") pod \"f1897e5a-c532-4379-9a46-ad5355a45122\" (UID: \"f1897e5a-c532-4379-9a46-ad5355a45122\") "
Mar 18 11:16:04 crc kubenswrapper[4778]: I0318 11:16:04.835155 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1897e5a-c532-4379-9a46-ad5355a45122-kube-api-access-z2h2d" (OuterVolumeSpecName: "kube-api-access-z2h2d") pod "f1897e5a-c532-4379-9a46-ad5355a45122" (UID: "f1897e5a-c532-4379-9a46-ad5355a45122"). InnerVolumeSpecName "kube-api-access-z2h2d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 11:16:04 crc kubenswrapper[4778]: I0318 11:16:04.935696 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2h2d\" (UniqueName: \"kubernetes.io/projected/f1897e5a-c532-4379-9a46-ad5355a45122-kube-api-access-z2h2d\") on node \"crc\" DevicePath \"\""
Mar 18 11:16:05 crc kubenswrapper[4778]: I0318 11:16:05.411351 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563876-sl728" event={"ID":"f1897e5a-c532-4379-9a46-ad5355a45122","Type":"ContainerDied","Data":"9d38eeb743b07219d50246a19bcd998b3678d8cc4be7099190b24f6d4a03bec3"}
Mar 18 11:16:05 crc kubenswrapper[4778]: I0318 11:16:05.411688 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d38eeb743b07219d50246a19bcd998b3678d8cc4be7099190b24f6d4a03bec3"
Mar 18 11:16:05 crc kubenswrapper[4778]: I0318 11:16:05.411529 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563876-sl728"
Mar 18 11:16:05 crc kubenswrapper[4778]: I0318 11:16:05.842513 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563870-s7lhp"]
Mar 18 11:16:05 crc kubenswrapper[4778]: I0318 11:16:05.855633 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563870-s7lhp"]
Mar 18 11:16:06 crc kubenswrapper[4778]: I0318 11:16:06.199321 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08809b1c-c749-4734-9fc4-6a0a755aa9cd" path="/var/lib/kubelet/pods/08809b1c-c749-4734-9fc4-6a0a755aa9cd/volumes"
Mar 18 11:16:19 crc kubenswrapper[4778]: I0318 11:16:19.188106 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434"
Mar 18 11:16:19 crc kubenswrapper[4778]: E0318 11:16:19.189058 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:16:23 crc kubenswrapper[4778]: I0318 11:16:23.596659 4778 generic.go:334] "Generic (PLEG): container finished" podID="339d23a2-4cea-4331-b745-44219b471d41" containerID="376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0" exitCode=0
Mar 18 11:16:23 crc kubenswrapper[4778]: I0318 11:16:23.596796 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v7jl/must-gather-8j576" event={"ID":"339d23a2-4cea-4331-b745-44219b471d41","Type":"ContainerDied","Data":"376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0"}
Mar 18 11:16:23 crc kubenswrapper[4778]: I0318 11:16:23.598420 4778 scope.go:117] "RemoveContainer" containerID="376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0"
Mar 18 11:16:24 crc kubenswrapper[4778]: I0318 11:16:24.187283 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4v7jl_must-gather-8j576_339d23a2-4cea-4331-b745-44219b471d41/gather/0.log"
Mar 18 11:16:32 crc kubenswrapper[4778]: I0318 11:16:32.188229 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434"
Mar 18 11:16:32 crc kubenswrapper[4778]: E0318 11:16:32.189607 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:16:35 crc kubenswrapper[4778]: I0318 11:16:35.601047 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4v7jl/must-gather-8j576"]
Mar 18 11:16:35 crc kubenswrapper[4778]: I0318 11:16:35.601879 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-4v7jl/must-gather-8j576" podUID="339d23a2-4cea-4331-b745-44219b471d41" containerName="copy" containerID="cri-o://07c61ae0df4c4ec4035051a0490bff456734218dc4ec1c30fd5c71fabc9f0310" gracePeriod=2
Mar 18 11:16:35 crc kubenswrapper[4778]: I0318 11:16:35.611919 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4v7jl/must-gather-8j576"]
Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.028763 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4v7jl_must-gather-8j576_339d23a2-4cea-4331-b745-44219b471d41/copy/0.log"
Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.029519 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4v7jl/must-gather-8j576"
Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.148764 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjcph\" (UniqueName: \"kubernetes.io/projected/339d23a2-4cea-4331-b745-44219b471d41-kube-api-access-cjcph\") pod \"339d23a2-4cea-4331-b745-44219b471d41\" (UID: \"339d23a2-4cea-4331-b745-44219b471d41\") "
Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.149277 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/339d23a2-4cea-4331-b745-44219b471d41-must-gather-output\") pod \"339d23a2-4cea-4331-b745-44219b471d41\" (UID: \"339d23a2-4cea-4331-b745-44219b471d41\") "
Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.154712 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/339d23a2-4cea-4331-b745-44219b471d41-kube-api-access-cjcph" (OuterVolumeSpecName: "kube-api-access-cjcph") pod "339d23a2-4cea-4331-b745-44219b471d41" (UID: "339d23a2-4cea-4331-b745-44219b471d41"). InnerVolumeSpecName "kube-api-access-cjcph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.251775 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjcph\" (UniqueName: \"kubernetes.io/projected/339d23a2-4cea-4331-b745-44219b471d41-kube-api-access-cjcph\") on node \"crc\" DevicePath \"\""
Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.337110 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/339d23a2-4cea-4331-b745-44219b471d41-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "339d23a2-4cea-4331-b745-44219b471d41" (UID: "339d23a2-4cea-4331-b745-44219b471d41"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.353563 4778 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/339d23a2-4cea-4331-b745-44219b471d41-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.784377 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4v7jl_must-gather-8j576_339d23a2-4cea-4331-b745-44219b471d41/copy/0.log"
Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.784724 4778 generic.go:334] "Generic (PLEG): container finished" podID="339d23a2-4cea-4331-b745-44219b471d41" containerID="07c61ae0df4c4ec4035051a0490bff456734218dc4ec1c30fd5c71fabc9f0310" exitCode=143
Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.784765 4778 scope.go:117] "RemoveContainer" containerID="07c61ae0df4c4ec4035051a0490bff456734218dc4ec1c30fd5c71fabc9f0310"
Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.784878 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4v7jl/must-gather-8j576"
Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.817547 4778 scope.go:117] "RemoveContainer" containerID="376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0"
Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.885694 4778 scope.go:117] "RemoveContainer" containerID="07c61ae0df4c4ec4035051a0490bff456734218dc4ec1c30fd5c71fabc9f0310"
Mar 18 11:16:36 crc kubenswrapper[4778]: E0318 11:16:36.886111 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07c61ae0df4c4ec4035051a0490bff456734218dc4ec1c30fd5c71fabc9f0310\": container with ID starting with 07c61ae0df4c4ec4035051a0490bff456734218dc4ec1c30fd5c71fabc9f0310 not found: ID does not exist" containerID="07c61ae0df4c4ec4035051a0490bff456734218dc4ec1c30fd5c71fabc9f0310"
Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.886140 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07c61ae0df4c4ec4035051a0490bff456734218dc4ec1c30fd5c71fabc9f0310"} err="failed to get container status \"07c61ae0df4c4ec4035051a0490bff456734218dc4ec1c30fd5c71fabc9f0310\": rpc error: code = NotFound desc = could not find container \"07c61ae0df4c4ec4035051a0490bff456734218dc4ec1c30fd5c71fabc9f0310\": container with ID starting with 07c61ae0df4c4ec4035051a0490bff456734218dc4ec1c30fd5c71fabc9f0310 not found: ID does not exist"
Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.886160 4778 scope.go:117] "RemoveContainer" containerID="376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0"
Mar 18 11:16:36 crc kubenswrapper[4778]: E0318 11:16:36.886695 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0\": container with ID starting with 
376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0 not found: ID does not exist" containerID="376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0" Mar 18 11:16:36 crc kubenswrapper[4778]: I0318 11:16:36.886722 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0"} err="failed to get container status \"376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0\": rpc error: code = NotFound desc = could not find container \"376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0\": container with ID starting with 376b8f7ae28b77095cb76bc27b6cb631ab1a9dc132fe02cc9663aced7d9932e0 not found: ID does not exist" Mar 18 11:16:38 crc kubenswrapper[4778]: I0318 11:16:38.198858 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="339d23a2-4cea-4331-b745-44219b471d41" path="/var/lib/kubelet/pods/339d23a2-4cea-4331-b745-44219b471d41/volumes" Mar 18 11:16:39 crc kubenswrapper[4778]: I0318 11:16:39.002937 4778 scope.go:117] "RemoveContainer" containerID="b0c173a65daa3d6727011d9a85b81569efd5870086c307d8ad02c5186b648e01" Mar 18 11:16:39 crc kubenswrapper[4778]: I0318 11:16:39.053005 4778 scope.go:117] "RemoveContainer" containerID="7417bcdf486fe4210bba0dca5e997eafe86b7f08ceaf37548fb4760f00212acc" Mar 18 11:16:47 crc kubenswrapper[4778]: I0318 11:16:47.187774 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:16:47 crc kubenswrapper[4778]: E0318 11:16:47.188531 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:17:01 crc kubenswrapper[4778]: I0318 11:17:01.187228 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:17:01 crc kubenswrapper[4778]: E0318 11:17:01.188006 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:17:08 crc kubenswrapper[4778]: E0318 11:17:08.189387 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Mar 18 11:17:12 crc kubenswrapper[4778]: I0318 11:17:12.187426 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:17:12 crc kubenswrapper[4778]: E0318 11:17:12.188482 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:17:19 crc kubenswrapper[4778]: I0318 11:17:19.868751 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qlv6x"] Mar 18 11:17:19 crc kubenswrapper[4778]: E0318 
11:17:19.869730 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="339d23a2-4cea-4331-b745-44219b471d41" containerName="copy" Mar 18 11:17:19 crc kubenswrapper[4778]: I0318 11:17:19.869744 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="339d23a2-4cea-4331-b745-44219b471d41" containerName="copy" Mar 18 11:17:19 crc kubenswrapper[4778]: E0318 11:17:19.869771 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="339d23a2-4cea-4331-b745-44219b471d41" containerName="gather" Mar 18 11:17:19 crc kubenswrapper[4778]: I0318 11:17:19.869777 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="339d23a2-4cea-4331-b745-44219b471d41" containerName="gather" Mar 18 11:17:19 crc kubenswrapper[4778]: E0318 11:17:19.869793 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1897e5a-c532-4379-9a46-ad5355a45122" containerName="oc" Mar 18 11:17:19 crc kubenswrapper[4778]: I0318 11:17:19.869799 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1897e5a-c532-4379-9a46-ad5355a45122" containerName="oc" Mar 18 11:17:19 crc kubenswrapper[4778]: I0318 11:17:19.869970 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="339d23a2-4cea-4331-b745-44219b471d41" containerName="gather" Mar 18 11:17:19 crc kubenswrapper[4778]: I0318 11:17:19.869983 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="339d23a2-4cea-4331-b745-44219b471d41" containerName="copy" Mar 18 11:17:19 crc kubenswrapper[4778]: I0318 11:17:19.869997 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1897e5a-c532-4379-9a46-ad5355a45122" containerName="oc" Mar 18 11:17:19 crc kubenswrapper[4778]: I0318 11:17:19.871312 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:19 crc kubenswrapper[4778]: I0318 11:17:19.882406 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qlv6x"] Mar 18 11:17:20 crc kubenswrapper[4778]: I0318 11:17:20.058079 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05b76f7a-111e-4d55-bf5d-300863cd06d7-catalog-content\") pod \"redhat-marketplace-qlv6x\" (UID: \"05b76f7a-111e-4d55-bf5d-300863cd06d7\") " pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:20 crc kubenswrapper[4778]: I0318 11:17:20.058308 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05b76f7a-111e-4d55-bf5d-300863cd06d7-utilities\") pod \"redhat-marketplace-qlv6x\" (UID: \"05b76f7a-111e-4d55-bf5d-300863cd06d7\") " pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:20 crc kubenswrapper[4778]: I0318 11:17:20.058386 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrwdk\" (UniqueName: \"kubernetes.io/projected/05b76f7a-111e-4d55-bf5d-300863cd06d7-kube-api-access-nrwdk\") pod \"redhat-marketplace-qlv6x\" (UID: \"05b76f7a-111e-4d55-bf5d-300863cd06d7\") " pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:20 crc kubenswrapper[4778]: I0318 11:17:20.160649 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrwdk\" (UniqueName: \"kubernetes.io/projected/05b76f7a-111e-4d55-bf5d-300863cd06d7-kube-api-access-nrwdk\") pod \"redhat-marketplace-qlv6x\" (UID: \"05b76f7a-111e-4d55-bf5d-300863cd06d7\") " pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:20 crc kubenswrapper[4778]: I0318 11:17:20.160833 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05b76f7a-111e-4d55-bf5d-300863cd06d7-catalog-content\") pod \"redhat-marketplace-qlv6x\" (UID: \"05b76f7a-111e-4d55-bf5d-300863cd06d7\") " pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:20 crc kubenswrapper[4778]: I0318 11:17:20.160871 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05b76f7a-111e-4d55-bf5d-300863cd06d7-utilities\") pod \"redhat-marketplace-qlv6x\" (UID: \"05b76f7a-111e-4d55-bf5d-300863cd06d7\") " pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:20 crc kubenswrapper[4778]: I0318 11:17:20.161341 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05b76f7a-111e-4d55-bf5d-300863cd06d7-utilities\") pod \"redhat-marketplace-qlv6x\" (UID: \"05b76f7a-111e-4d55-bf5d-300863cd06d7\") " pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:20 crc kubenswrapper[4778]: I0318 11:17:20.161910 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05b76f7a-111e-4d55-bf5d-300863cd06d7-catalog-content\") pod \"redhat-marketplace-qlv6x\" (UID: \"05b76f7a-111e-4d55-bf5d-300863cd06d7\") " pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:20 crc kubenswrapper[4778]: I0318 11:17:20.196519 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrwdk\" (UniqueName: \"kubernetes.io/projected/05b76f7a-111e-4d55-bf5d-300863cd06d7-kube-api-access-nrwdk\") pod \"redhat-marketplace-qlv6x\" (UID: \"05b76f7a-111e-4d55-bf5d-300863cd06d7\") " pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:20 crc kubenswrapper[4778]: I0318 11:17:20.200811 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:20 crc kubenswrapper[4778]: I0318 11:17:20.735995 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qlv6x"] Mar 18 11:17:21 crc kubenswrapper[4778]: I0318 11:17:21.253428 4778 generic.go:334] "Generic (PLEG): container finished" podID="05b76f7a-111e-4d55-bf5d-300863cd06d7" containerID="fc9973dca299ce93a87154db7789ccb11bf55e03ccbed629f3600c4d1cacd465" exitCode=0 Mar 18 11:17:21 crc kubenswrapper[4778]: I0318 11:17:21.253535 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qlv6x" event={"ID":"05b76f7a-111e-4d55-bf5d-300863cd06d7","Type":"ContainerDied","Data":"fc9973dca299ce93a87154db7789ccb11bf55e03ccbed629f3600c4d1cacd465"} Mar 18 11:17:21 crc kubenswrapper[4778]: I0318 11:17:21.253739 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qlv6x" event={"ID":"05b76f7a-111e-4d55-bf5d-300863cd06d7","Type":"ContainerStarted","Data":"c977c1845e4effe2186b040420fcb1d302a3c18a5cc4ab73442c03de746df12c"} Mar 18 11:17:21 crc kubenswrapper[4778]: I0318 11:17:21.256437 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 11:17:23 crc kubenswrapper[4778]: I0318 11:17:23.281125 4778 generic.go:334] "Generic (PLEG): container finished" podID="05b76f7a-111e-4d55-bf5d-300863cd06d7" containerID="85c491d94ffc01ca2be295ed67a72e2f2ea97d24536a1cdc5ca795da7e653202" exitCode=0 Mar 18 11:17:23 crc kubenswrapper[4778]: I0318 11:17:23.281183 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qlv6x" event={"ID":"05b76f7a-111e-4d55-bf5d-300863cd06d7","Type":"ContainerDied","Data":"85c491d94ffc01ca2be295ed67a72e2f2ea97d24536a1cdc5ca795da7e653202"} Mar 18 11:17:24 crc kubenswrapper[4778]: I0318 11:17:24.294031 4778 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-qlv6x" event={"ID":"05b76f7a-111e-4d55-bf5d-300863cd06d7","Type":"ContainerStarted","Data":"fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e"} Mar 18 11:17:24 crc kubenswrapper[4778]: I0318 11:17:24.314215 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qlv6x" podStartSLOduration=2.626412597 podStartE2EDuration="5.314176213s" podCreationTimestamp="2026-03-18 11:17:19 +0000 UTC" firstStartedPulling="2026-03-18 11:17:21.256153745 +0000 UTC m=+8107.830898595" lastFinishedPulling="2026-03-18 11:17:23.943917331 +0000 UTC m=+8110.518662211" observedRunningTime="2026-03-18 11:17:24.313543076 +0000 UTC m=+8110.888287926" watchObservedRunningTime="2026-03-18 11:17:24.314176213 +0000 UTC m=+8110.888921053" Mar 18 11:17:25 crc kubenswrapper[4778]: I0318 11:17:25.187546 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:17:25 crc kubenswrapper[4778]: E0318 11:17:25.187806 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:17:30 crc kubenswrapper[4778]: I0318 11:17:30.206373 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:30 crc kubenswrapper[4778]: I0318 11:17:30.207024 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:30 crc kubenswrapper[4778]: I0318 11:17:30.287324 4778 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:30 crc kubenswrapper[4778]: I0318 11:17:30.443524 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:30 crc kubenswrapper[4778]: I0318 11:17:30.537006 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qlv6x"] Mar 18 11:17:32 crc kubenswrapper[4778]: I0318 11:17:32.380343 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qlv6x" podUID="05b76f7a-111e-4d55-bf5d-300863cd06d7" containerName="registry-server" containerID="cri-o://fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e" gracePeriod=2 Mar 18 11:17:32 crc kubenswrapper[4778]: I0318 11:17:32.883695 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.070689 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05b76f7a-111e-4d55-bf5d-300863cd06d7-catalog-content\") pod \"05b76f7a-111e-4d55-bf5d-300863cd06d7\" (UID: \"05b76f7a-111e-4d55-bf5d-300863cd06d7\") " Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.071311 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05b76f7a-111e-4d55-bf5d-300863cd06d7-utilities\") pod \"05b76f7a-111e-4d55-bf5d-300863cd06d7\" (UID: \"05b76f7a-111e-4d55-bf5d-300863cd06d7\") " Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.071362 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrwdk\" (UniqueName: \"kubernetes.io/projected/05b76f7a-111e-4d55-bf5d-300863cd06d7-kube-api-access-nrwdk\") 
pod \"05b76f7a-111e-4d55-bf5d-300863cd06d7\" (UID: \"05b76f7a-111e-4d55-bf5d-300863cd06d7\") " Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.073004 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05b76f7a-111e-4d55-bf5d-300863cd06d7-utilities" (OuterVolumeSpecName: "utilities") pod "05b76f7a-111e-4d55-bf5d-300863cd06d7" (UID: "05b76f7a-111e-4d55-bf5d-300863cd06d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.080611 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05b76f7a-111e-4d55-bf5d-300863cd06d7-kube-api-access-nrwdk" (OuterVolumeSpecName: "kube-api-access-nrwdk") pod "05b76f7a-111e-4d55-bf5d-300863cd06d7" (UID: "05b76f7a-111e-4d55-bf5d-300863cd06d7"). InnerVolumeSpecName "kube-api-access-nrwdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.107707 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05b76f7a-111e-4d55-bf5d-300863cd06d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05b76f7a-111e-4d55-bf5d-300863cd06d7" (UID: "05b76f7a-111e-4d55-bf5d-300863cd06d7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.174012 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05b76f7a-111e-4d55-bf5d-300863cd06d7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.174055 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05b76f7a-111e-4d55-bf5d-300863cd06d7-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.174069 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrwdk\" (UniqueName: \"kubernetes.io/projected/05b76f7a-111e-4d55-bf5d-300863cd06d7-kube-api-access-nrwdk\") on node \"crc\" DevicePath \"\"" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.401127 4778 generic.go:334] "Generic (PLEG): container finished" podID="05b76f7a-111e-4d55-bf5d-300863cd06d7" containerID="fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e" exitCode=0 Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.401212 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qlv6x" event={"ID":"05b76f7a-111e-4d55-bf5d-300863cd06d7","Type":"ContainerDied","Data":"fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e"} Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.401250 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qlv6x" event={"ID":"05b76f7a-111e-4d55-bf5d-300863cd06d7","Type":"ContainerDied","Data":"c977c1845e4effe2186b040420fcb1d302a3c18a5cc4ab73442c03de746df12c"} Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.401263 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qlv6x" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.401274 4778 scope.go:117] "RemoveContainer" containerID="fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.437949 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qlv6x"] Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.438587 4778 scope.go:117] "RemoveContainer" containerID="85c491d94ffc01ca2be295ed67a72e2f2ea97d24536a1cdc5ca795da7e653202" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.446221 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qlv6x"] Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.457129 4778 scope.go:117] "RemoveContainer" containerID="fc9973dca299ce93a87154db7789ccb11bf55e03ccbed629f3600c4d1cacd465" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.494062 4778 scope.go:117] "RemoveContainer" containerID="fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e" Mar 18 11:17:33 crc kubenswrapper[4778]: E0318 11:17:33.494683 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e\": container with ID starting with fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e not found: ID does not exist" containerID="fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.494721 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e"} err="failed to get container status \"fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e\": rpc error: code = NotFound desc = could not find container 
\"fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e\": container with ID starting with fc3b1a8f1f5b49e5d018a1dcc54f7d769b58f1ec210edc89d82415e749eb647e not found: ID does not exist" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.494745 4778 scope.go:117] "RemoveContainer" containerID="85c491d94ffc01ca2be295ed67a72e2f2ea97d24536a1cdc5ca795da7e653202" Mar 18 11:17:33 crc kubenswrapper[4778]: E0318 11:17:33.494985 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85c491d94ffc01ca2be295ed67a72e2f2ea97d24536a1cdc5ca795da7e653202\": container with ID starting with 85c491d94ffc01ca2be295ed67a72e2f2ea97d24536a1cdc5ca795da7e653202 not found: ID does not exist" containerID="85c491d94ffc01ca2be295ed67a72e2f2ea97d24536a1cdc5ca795da7e653202" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.495012 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85c491d94ffc01ca2be295ed67a72e2f2ea97d24536a1cdc5ca795da7e653202"} err="failed to get container status \"85c491d94ffc01ca2be295ed67a72e2f2ea97d24536a1cdc5ca795da7e653202\": rpc error: code = NotFound desc = could not find container \"85c491d94ffc01ca2be295ed67a72e2f2ea97d24536a1cdc5ca795da7e653202\": container with ID starting with 85c491d94ffc01ca2be295ed67a72e2f2ea97d24536a1cdc5ca795da7e653202 not found: ID does not exist" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.495029 4778 scope.go:117] "RemoveContainer" containerID="fc9973dca299ce93a87154db7789ccb11bf55e03ccbed629f3600c4d1cacd465" Mar 18 11:17:33 crc kubenswrapper[4778]: E0318 11:17:33.495328 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc9973dca299ce93a87154db7789ccb11bf55e03ccbed629f3600c4d1cacd465\": container with ID starting with fc9973dca299ce93a87154db7789ccb11bf55e03ccbed629f3600c4d1cacd465 not found: ID does not exist" 
containerID="fc9973dca299ce93a87154db7789ccb11bf55e03ccbed629f3600c4d1cacd465" Mar 18 11:17:33 crc kubenswrapper[4778]: I0318 11:17:33.495352 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc9973dca299ce93a87154db7789ccb11bf55e03ccbed629f3600c4d1cacd465"} err="failed to get container status \"fc9973dca299ce93a87154db7789ccb11bf55e03ccbed629f3600c4d1cacd465\": rpc error: code = NotFound desc = could not find container \"fc9973dca299ce93a87154db7789ccb11bf55e03ccbed629f3600c4d1cacd465\": container with ID starting with fc9973dca299ce93a87154db7789ccb11bf55e03ccbed629f3600c4d1cacd465 not found: ID does not exist" Mar 18 11:17:34 crc kubenswrapper[4778]: I0318 11:17:34.202710 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05b76f7a-111e-4d55-bf5d-300863cd06d7" path="/var/lib/kubelet/pods/05b76f7a-111e-4d55-bf5d-300863cd06d7/volumes" Mar 18 11:17:39 crc kubenswrapper[4778]: I0318 11:17:39.192400 4778 scope.go:117] "RemoveContainer" containerID="4244424fcad241de0fa7ed0597ece19cff3267ef03d0dbab6082ae06fe156bfa" Mar 18 11:17:40 crc kubenswrapper[4778]: I0318 11:17:40.188081 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:17:40 crc kubenswrapper[4778]: E0318 11:17:40.188856 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:17:52 crc kubenswrapper[4778]: I0318 11:17:52.187449 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434" Mar 18 11:17:52 crc kubenswrapper[4778]: E0318 
11:17:52.188244 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.159087 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563878-2kwvn"] Mar 18 11:18:00 crc kubenswrapper[4778]: E0318 11:18:00.159976 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05b76f7a-111e-4d55-bf5d-300863cd06d7" containerName="registry-server" Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.159989 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b76f7a-111e-4d55-bf5d-300863cd06d7" containerName="registry-server" Mar 18 11:18:00 crc kubenswrapper[4778]: E0318 11:18:00.160029 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05b76f7a-111e-4d55-bf5d-300863cd06d7" containerName="extract-utilities" Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.160034 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b76f7a-111e-4d55-bf5d-300863cd06d7" containerName="extract-utilities" Mar 18 11:18:00 crc kubenswrapper[4778]: E0318 11:18:00.160047 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05b76f7a-111e-4d55-bf5d-300863cd06d7" containerName="extract-content" Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.160052 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b76f7a-111e-4d55-bf5d-300863cd06d7" containerName="extract-content" Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.160233 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="05b76f7a-111e-4d55-bf5d-300863cd06d7" 
containerName="registry-server"
Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.160872 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563878-2kwvn"
Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.163718 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.163903 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.164284 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6"
Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.170236 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563878-2kwvn"]
Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.276404 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2668\" (UniqueName: \"kubernetes.io/projected/a6b4eda3-c3f7-40e9-8f26-d82054654c49-kube-api-access-q2668\") pod \"auto-csr-approver-29563878-2kwvn\" (UID: \"a6b4eda3-c3f7-40e9-8f26-d82054654c49\") " pod="openshift-infra/auto-csr-approver-29563878-2kwvn"
Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.379002 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2668\" (UniqueName: \"kubernetes.io/projected/a6b4eda3-c3f7-40e9-8f26-d82054654c49-kube-api-access-q2668\") pod \"auto-csr-approver-29563878-2kwvn\" (UID: \"a6b4eda3-c3f7-40e9-8f26-d82054654c49\") " pod="openshift-infra/auto-csr-approver-29563878-2kwvn"
Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.402540 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2668\" (UniqueName: \"kubernetes.io/projected/a6b4eda3-c3f7-40e9-8f26-d82054654c49-kube-api-access-q2668\") pod \"auto-csr-approver-29563878-2kwvn\" (UID: \"a6b4eda3-c3f7-40e9-8f26-d82054654c49\") " pod="openshift-infra/auto-csr-approver-29563878-2kwvn"
Mar 18 11:18:00 crc kubenswrapper[4778]: I0318 11:18:00.502472 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563878-2kwvn"
Mar 18 11:18:01 crc kubenswrapper[4778]: I0318 11:18:01.008971 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563878-2kwvn"]
Mar 18 11:18:01 crc kubenswrapper[4778]: I0318 11:18:01.692325 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563878-2kwvn" event={"ID":"a6b4eda3-c3f7-40e9-8f26-d82054654c49","Type":"ContainerStarted","Data":"4cf717326dcf434c1ecca449dd2dce3b09a4c455d622981e4995c6c45b4ee5e4"}
Mar 18 11:18:02 crc kubenswrapper[4778]: I0318 11:18:02.704295 4778 generic.go:334] "Generic (PLEG): container finished" podID="a6b4eda3-c3f7-40e9-8f26-d82054654c49" containerID="69d660b1da69b29e026550fe6c9425fe0d4cebfcef6d2a0c10358283a2466818" exitCode=0
Mar 18 11:18:02 crc kubenswrapper[4778]: I0318 11:18:02.704373 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563878-2kwvn" event={"ID":"a6b4eda3-c3f7-40e9-8f26-d82054654c49","Type":"ContainerDied","Data":"69d660b1da69b29e026550fe6c9425fe0d4cebfcef6d2a0c10358283a2466818"}
Mar 18 11:18:04 crc kubenswrapper[4778]: I0318 11:18:04.062093 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563878-2kwvn"
Mar 18 11:18:04 crc kubenswrapper[4778]: I0318 11:18:04.154418 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2668\" (UniqueName: \"kubernetes.io/projected/a6b4eda3-c3f7-40e9-8f26-d82054654c49-kube-api-access-q2668\") pod \"a6b4eda3-c3f7-40e9-8f26-d82054654c49\" (UID: \"a6b4eda3-c3f7-40e9-8f26-d82054654c49\") "
Mar 18 11:18:04 crc kubenswrapper[4778]: I0318 11:18:04.175102 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6b4eda3-c3f7-40e9-8f26-d82054654c49-kube-api-access-q2668" (OuterVolumeSpecName: "kube-api-access-q2668") pod "a6b4eda3-c3f7-40e9-8f26-d82054654c49" (UID: "a6b4eda3-c3f7-40e9-8f26-d82054654c49"). InnerVolumeSpecName "kube-api-access-q2668". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 11:18:04 crc kubenswrapper[4778]: I0318 11:18:04.193278 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434"
Mar 18 11:18:04 crc kubenswrapper[4778]: E0318 11:18:04.193749 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:18:04 crc kubenswrapper[4778]: I0318 11:18:04.256911 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2668\" (UniqueName: \"kubernetes.io/projected/a6b4eda3-c3f7-40e9-8f26-d82054654c49-kube-api-access-q2668\") on node \"crc\" DevicePath \"\""
Mar 18 11:18:04 crc kubenswrapper[4778]: I0318 11:18:04.730043 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563878-2kwvn" event={"ID":"a6b4eda3-c3f7-40e9-8f26-d82054654c49","Type":"ContainerDied","Data":"4cf717326dcf434c1ecca449dd2dce3b09a4c455d622981e4995c6c45b4ee5e4"}
Mar 18 11:18:04 crc kubenswrapper[4778]: I0318 11:18:04.730096 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cf717326dcf434c1ecca449dd2dce3b09a4c455d622981e4995c6c45b4ee5e4"
Mar 18 11:18:04 crc kubenswrapper[4778]: I0318 11:18:04.730626 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563878-2kwvn"
Mar 18 11:18:05 crc kubenswrapper[4778]: I0318 11:18:05.155095 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563872-w9xc8"]
Mar 18 11:18:05 crc kubenswrapper[4778]: I0318 11:18:05.167372 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563872-w9xc8"]
Mar 18 11:18:06 crc kubenswrapper[4778]: I0318 11:18:06.200631 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2575542-201b-40c8-baec-f64e53f357a6" path="/var/lib/kubelet/pods/c2575542-201b-40c8-baec-f64e53f357a6/volumes"
Mar 18 11:18:16 crc kubenswrapper[4778]: I0318 11:18:16.186866 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434"
Mar 18 11:18:16 crc kubenswrapper[4778]: E0318 11:18:16.188493 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56rc7_openshift-machine-config-operator(7243f983-24d5-48ef-858b-5f4049a82acc)\"" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc"
Mar 18 11:18:30 crc kubenswrapper[4778]: I0318 11:18:30.188706 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434"
Mar 18 11:18:30 crc kubenswrapper[4778]: I0318 11:18:30.973843 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"6b0ac993b4f24855e183f727279ee7234cc746a4cd9d755c93e21afd7bbad379"}
Mar 18 11:18:37 crc kubenswrapper[4778]: E0318 11:18:37.187842 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes"
Mar 18 11:18:39 crc kubenswrapper[4778]: I0318 11:18:39.323714 4778 scope.go:117] "RemoveContainer" containerID="9b9c4586ce364f21cf8a583a2a00575bf65854ca35b9f67e292350a899db8fd9"
Mar 18 11:19:42 crc kubenswrapper[4778]: E0318 11:19:42.187855 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes"
Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 11:20:00.149133 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563880-7p9cl"]
Mar 18 11:20:00 crc kubenswrapper[4778]: E0318 11:20:00.150723 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6b4eda3-c3f7-40e9-8f26-d82054654c49" containerName="oc"
Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 11:20:00.150750 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6b4eda3-c3f7-40e9-8f26-d82054654c49" containerName="oc"
Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 11:20:00.151136 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6b4eda3-c3f7-40e9-8f26-d82054654c49" containerName="oc"
Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 11:20:00.152344 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563880-7p9cl"
Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 11:20:00.154736 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g2kb6"
Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 11:20:00.155325 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 11:20:00.155433 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 11:20:00.160787 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563880-7p9cl"]
Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 11:20:00.348338 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq6dx\" (UniqueName: \"kubernetes.io/projected/d5eeffef-3ac0-4175-a48f-988f221fdb87-kube-api-access-nq6dx\") pod \"auto-csr-approver-29563880-7p9cl\" (UID: \"d5eeffef-3ac0-4175-a48f-988f221fdb87\") " pod="openshift-infra/auto-csr-approver-29563880-7p9cl"
Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 11:20:00.450298 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq6dx\" (UniqueName: \"kubernetes.io/projected/d5eeffef-3ac0-4175-a48f-988f221fdb87-kube-api-access-nq6dx\") pod \"auto-csr-approver-29563880-7p9cl\" (UID: \"d5eeffef-3ac0-4175-a48f-988f221fdb87\") " pod="openshift-infra/auto-csr-approver-29563880-7p9cl"
Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 11:20:00.469286 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq6dx\" (UniqueName: \"kubernetes.io/projected/d5eeffef-3ac0-4175-a48f-988f221fdb87-kube-api-access-nq6dx\") pod \"auto-csr-approver-29563880-7p9cl\" (UID: \"d5eeffef-3ac0-4175-a48f-988f221fdb87\") " pod="openshift-infra/auto-csr-approver-29563880-7p9cl"
Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 11:20:00.491978 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563880-7p9cl"
Mar 18 11:20:00 crc kubenswrapper[4778]: I0318 11:20:00.964091 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563880-7p9cl"]
Mar 18 11:20:01 crc kubenswrapper[4778]: I0318 11:20:01.891772 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563880-7p9cl" event={"ID":"d5eeffef-3ac0-4175-a48f-988f221fdb87","Type":"ContainerStarted","Data":"dcc9fc95bcb5cd51a57987a594274b357968befc77a0e082a721ce5016fb5b87"}
Mar 18 11:20:02 crc kubenswrapper[4778]: I0318 11:20:02.903498 4778 generic.go:334] "Generic (PLEG): container finished" podID="d5eeffef-3ac0-4175-a48f-988f221fdb87" containerID="ecc7767e917cec854c2c16c99b0004e4c693491ad73089f93315d55e813c4597" exitCode=0
Mar 18 11:20:02 crc kubenswrapper[4778]: I0318 11:20:02.903584 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563880-7p9cl" event={"ID":"d5eeffef-3ac0-4175-a48f-988f221fdb87","Type":"ContainerDied","Data":"ecc7767e917cec854c2c16c99b0004e4c693491ad73089f93315d55e813c4597"}
Mar 18 11:20:03 crc kubenswrapper[4778]: E0318 11:20:02.999970 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5eeffef_3ac0_4175_a48f_988f221fdb87.slice/crio-ecc7767e917cec854c2c16c99b0004e4c693491ad73089f93315d55e813c4597.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5eeffef_3ac0_4175_a48f_988f221fdb87.slice/crio-conmon-ecc7767e917cec854c2c16c99b0004e4c693491ad73089f93315d55e813c4597.scope\": RecentStats: unable to find data in memory cache]"
Mar 18 11:20:04 crc kubenswrapper[4778]: I0318 11:20:04.345214 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563880-7p9cl"
Mar 18 11:20:04 crc kubenswrapper[4778]: I0318 11:20:04.431091 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq6dx\" (UniqueName: \"kubernetes.io/projected/d5eeffef-3ac0-4175-a48f-988f221fdb87-kube-api-access-nq6dx\") pod \"d5eeffef-3ac0-4175-a48f-988f221fdb87\" (UID: \"d5eeffef-3ac0-4175-a48f-988f221fdb87\") "
Mar 18 11:20:04 crc kubenswrapper[4778]: I0318 11:20:04.438537 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5eeffef-3ac0-4175-a48f-988f221fdb87-kube-api-access-nq6dx" (OuterVolumeSpecName: "kube-api-access-nq6dx") pod "d5eeffef-3ac0-4175-a48f-988f221fdb87" (UID: "d5eeffef-3ac0-4175-a48f-988f221fdb87"). InnerVolumeSpecName "kube-api-access-nq6dx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 11:20:04 crc kubenswrapper[4778]: I0318 11:20:04.533672 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq6dx\" (UniqueName: \"kubernetes.io/projected/d5eeffef-3ac0-4175-a48f-988f221fdb87-kube-api-access-nq6dx\") on node \"crc\" DevicePath \"\""
Mar 18 11:20:04 crc kubenswrapper[4778]: I0318 11:20:04.928018 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563880-7p9cl" event={"ID":"d5eeffef-3ac0-4175-a48f-988f221fdb87","Type":"ContainerDied","Data":"dcc9fc95bcb5cd51a57987a594274b357968befc77a0e082a721ce5016fb5b87"}
Mar 18 11:20:04 crc kubenswrapper[4778]: I0318 11:20:04.928060 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcc9fc95bcb5cd51a57987a594274b357968befc77a0e082a721ce5016fb5b87"
Mar 18 11:20:04 crc kubenswrapper[4778]: I0318 11:20:04.928064 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563880-7p9cl"
Mar 18 11:20:05 crc kubenswrapper[4778]: I0318 11:20:05.470019 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563874-hj4l4"]
Mar 18 11:20:05 crc kubenswrapper[4778]: I0318 11:20:05.486645 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563874-hj4l4"]
Mar 18 11:20:06 crc kubenswrapper[4778]: I0318 11:20:06.212866 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="894be30f-4dc9-4a4c-b443-2393b89df180" path="/var/lib/kubelet/pods/894be30f-4dc9-4a4c-b443-2393b89df180/volumes"
Mar 18 11:20:30 crc kubenswrapper[4778]: I0318 11:20:30.148021 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 11:20:30 crc kubenswrapper[4778]: I0318 11:20:30.149105 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 11:20:39 crc kubenswrapper[4778]: I0318 11:20:39.414039 4778 scope.go:117] "RemoveContainer" containerID="8d49e3ed1fe8885369ea3edca8ac073df8e6a13bd130a54cf891e878c3833fb8"
Mar 18 11:21:00 crc kubenswrapper[4778]: I0318 11:21:00.147817 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 11:21:00 crc kubenswrapper[4778]: I0318 11:21:00.148572 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 11:21:07 crc kubenswrapper[4778]: E0318 11:21:07.189388 4778 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes"
Mar 18 11:21:30 crc kubenswrapper[4778]: I0318 11:21:30.147522 4778 patch_prober.go:28] interesting pod/machine-config-daemon-56rc7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 11:21:30 crc kubenswrapper[4778]: I0318 11:21:30.148581 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 11:21:30 crc kubenswrapper[4778]: I0318 11:21:30.148657 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56rc7"
Mar 18 11:21:30 crc kubenswrapper[4778]: I0318 11:21:30.149897 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6b0ac993b4f24855e183f727279ee7234cc746a4cd9d755c93e21afd7bbad379"} pod="openshift-machine-config-operator/machine-config-daemon-56rc7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 11:21:30 crc kubenswrapper[4778]: I0318 11:21:30.149961 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" podUID="7243f983-24d5-48ef-858b-5f4049a82acc" containerName="machine-config-daemon" containerID="cri-o://6b0ac993b4f24855e183f727279ee7234cc746a4cd9d755c93e21afd7bbad379" gracePeriod=600
Mar 18 11:21:30 crc kubenswrapper[4778]: I0318 11:21:30.549938 4778 generic.go:334] "Generic (PLEG): container finished" podID="7243f983-24d5-48ef-858b-5f4049a82acc" containerID="6b0ac993b4f24855e183f727279ee7234cc746a4cd9d755c93e21afd7bbad379" exitCode=0
Mar 18 11:21:30 crc kubenswrapper[4778]: I0318 11:21:30.550031 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerDied","Data":"6b0ac993b4f24855e183f727279ee7234cc746a4cd9d755c93e21afd7bbad379"}
Mar 18 11:21:30 crc kubenswrapper[4778]: I0318 11:21:30.550412 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56rc7" event={"ID":"7243f983-24d5-48ef-858b-5f4049a82acc","Type":"ContainerStarted","Data":"ec3dcc697dc177d9f083b60be35d5ffbf66ca818a495e0794a59896ab1952779"}
Mar 18 11:21:30 crc kubenswrapper[4778]: I0318 11:21:30.550445 4778 scope.go:117] "RemoveContainer" containerID="1567fdd1bbcf109548ba8c5d5dd5314acaa1b9f819fd4648c2af2a0ad2fef434"